African Credit Scoring Challenge
¶
Contributors: Amadou BAH, Frederic AKADJE
Introduction¶
The African Credit Scoring Challenge is a Kaggle-style competition organized by Zindi. Visit Zindi Africa here.
Objective: build a generalizable machine learning model to predict whether or not a bank loan will be repaid. Competition link
Context:
A dataset is provided; it gathers loan applications from various economic actors in Kenya together with their outcome (repaid or not).
Evaluation is carried out on loan applications from both Kenya and Ghana.
The approach presented here draws heavily on that of Oluwatobi Afolabi, PhD, who finished 3rd in the challenge.
His notebook is available here: https://www.kaggle.com/code/tobby18/african-credit-scoring-challenge-zindi-final#Model-Development
Key parts of the notebook:
Exploratory data analysis
Data processing
Feature engineering
Model building
Credit score construction
Recommendations
# === Libraries ===
import warnings
warnings.filterwarnings('ignore')
import pandas as pd
import numpy as np
# Visualisation
import matplotlib.pyplot as plt
import seaborn as sns
# Machine Learning
from sklearn.preprocessing import LabelEncoder
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import classification_report
from sklearn.metrics import f1_score
from scipy.stats import mode
# Advanced ML Models
from imblearn.over_sampling import BorderlineSMOTE
import xgboost as xgb
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier
from sklearn.tree import DecisionTreeClassifier
Dataset overview¶
Dataset variables¶
| Variable name | Description |
|---|---|
| ID | Unique identifier for each row of the dataset. |
| customer_id | Unique identifier for each customer in the dataset. |
| country_id | Identifier or code representing the customer's (or loan's) country. |
| tbl_loan_id | Unique identifier for each loan associated with the customer. |
| Total_Amount | Total loan amount initially disbursed to the customer. |
| Total_Amount_to_Repay | Total amount to be repaid by the customer (principal + interest + fees). |
| loan_type | Category or type of loan. |
| disbursement_date | Date on which the loan amount was disbursed to the customer. |
| duration | Loan duration, typically expressed in days. |
| lender_id | Unique identifier for the lender or institution that granted the loan. |
| New_versus_Repeat | Indicates whether the loan is the customer's first ("New") or not ("Repeat"). |
| Amount_Funded_By_Lender | Portion of the loan funded directly by the lender. |
| Lender_portion_Funded | Percentage of the total loan amount funded by the lender. |
| due_date | Deadline by which the loan repayment is due. |
| Lender_portion_to_be_repaid | Outstanding portion of the loan that must be repaid to the lender. |
| target | Target variable: 0 = loan repaid, 1 = loan not repaid (default). |
# Load the train and test datasets
train = pd.read_csv('data/Train.csv')
test = pd.read_csv('data/Test.csv')
# Display both datasets
display("Train", train.head(3), train.shape, "Test", test.head(5), test.shape)
'Train'
| | ID | customer_id | country_id | tbl_loan_id | lender_id | loan_type | Total_Amount | Total_Amount_to_Repay | disbursement_date | due_date | duration | New_versus_Repeat | Amount_Funded_By_Lender | Lender_portion_Funded | Lender_portion_to_be_repaid | target |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | ID_266671248032267278 | 266671 | Kenya | 248032 | 267278 | Type_1 | 8448.0 | 8448.0 | 2022-08-30 | 2022-09-06 | 7 | Repeat Loan | 120.85 | 0.014305 | 121.0 | 0 |
| 1 | ID_248919228515267278 | 248919 | Kenya | 228515 | 267278 | Type_1 | 25895.0 | 25979.0 | 2022-07-30 | 2022-08-06 | 7 | Repeat Loan | 7768.50 | 0.300000 | 7794.0 | 0 |
| 2 | ID_308486370501251804 | 308486 | Kenya | 370501 | 251804 | Type_7 | 6900.0 | 7142.0 | 2024-09-06 | 2024-09-13 | 7 | Repeat Loan | 1380.00 | 0.200000 | 1428.0 | 0 |
(68654, 16)
'Test'
| | ID | customer_id | country_id | tbl_loan_id | lender_id | loan_type | Total_Amount | Total_Amount_to_Repay | disbursement_date | due_date | duration | New_versus_Repeat | Amount_Funded_By_Lender | Lender_portion_Funded | Lender_portion_to_be_repaid |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | ID_269404226088267278 | 269404 | Kenya | 226088 | 267278 | Type_1 | 1919.0 | 1989.0 | 2022-07-27 | 2022-08-03 | 7 | Repeat Loan | 575.7 | 0.300000 | 597.0 |
| 1 | ID_255356300042267278 | 255356 | Kenya | 300042 | 267278 | Type_1 | 2138.0 | 2153.0 | 2022-11-16 | 2022-11-23 | 7 | Repeat Loan | 0.0 | 0.000000 | 0.0 |
| 2 | ID_257026243764267278 | 257026 | Kenya | 243764 | 267278 | Type_1 | 8254.0 | 8304.0 | 2022-08-24 | 2022-08-31 | 7 | Repeat Loan | 207.0 | 0.025079 | 208.0 |
| 3 | ID_264617299409267278 | 264617 | Kenya | 299409 | 267278 | Type_1 | 3379.0 | 3379.0 | 2022-11-15 | 2022-11-22 | 7 | Repeat Loan | 1013.7 | 0.300000 | 1014.0 |
| 4 | ID_247613296713267278 | 247613 | Kenya | 296713 | 267278 | Type_1 | 120.0 | 120.0 | 2022-11-10 | 2022-11-17 | 7 | Repeat Loan | 36.0 | 0.300000 | 36.0 |
(18594, 15)
# Data types and potential missing values
train.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 68654 entries, 0 to 68653
Data columns (total 16 columns):
 #   Column                       Non-Null Count  Dtype
---  ------                       --------------  -----
 0   ID                           68654 non-null  object
 1   customer_id                  68654 non-null  int64
 2   country_id                   68654 non-null  object
 3   tbl_loan_id                  68654 non-null  int64
 4   lender_id                    68654 non-null  int64
 5   loan_type                    68654 non-null  object
 6   Total_Amount                 68654 non-null  float64
 7   Total_Amount_to_Repay        68654 non-null  float64
 8   disbursement_date            68654 non-null  object
 9   due_date                     68654 non-null  object
 10  duration                     68654 non-null  int64
 11  New_versus_Repeat            68654 non-null  object
 12  Amount_Funded_By_Lender      68654 non-null  float64
 13  Lender_portion_Funded        68654 non-null  float64
 14  Lender_portion_to_be_repaid  68654 non-null  float64
 15  target                       68654 non-null  int64
dtypes: float64(5), int64(5), object(6)
memory usage: 8.4+ MB
# Missing values
print(f"There are {train.isna().sum().sum()} missing values.")
There are 0 missing values.
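Beyond missing values, a quick duplicate check is often worthwhile at this stage. The snippet below is a small illustrative sketch (not part of the original notebook) that only assumes the `train` and `test` frames loaded above.
# Illustrative data-quality check: duplicated rows and duplicated loan identifiers
print(f"Duplicated rows in train: {train.duplicated().sum()}")
print(f"Duplicated rows in test:  {test.duplicated().sum()}")
print(f"Duplicated tbl_loan_id values in train: {train['tbl_loan_id'].duplicated().sum()}")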
Exploratory data analysis¶
# Copy the datasets to avoid in-place modifications
train_copy = train.copy()
test_copy = test.copy()
# Add a 'dataset' column to distinguish train from test
train_copy['dataset'] = 'Train'
test_copy['dataset'] = 'Test'
Distribution of loans by country in the train and test samples
# Concatenate both datasets on the variable of interest
combined = pd.concat([train_copy[['country_id', 'dataset']], test_copy[['country_id', 'dataset']]])
# Compute proportions (%)
counts = combined.groupby(['country_id', 'dataset']).size().reset_index(name='count')
total_per_dataset = counts.groupby('dataset')['count'].transform('sum')
counts['percentage'] = (counts['count'] / total_per_dataset * 100).round(1)
# Grouped bar plot
plt.figure(figsize=(12, 6))
bar = sns.barplot(
data=counts,
x='country_id',
y='percentage',
hue='dataset',
palette='Set2'
)
# Annotate each bar with its percentage
for p in bar.patches:
height = p.get_height()
if height > 0:
bar.annotate(f"{height:.1f}%",
(p.get_x() + p.get_width() / 2., height),
ha='center', va='bottom', fontsize=9, color='black',
xytext=(0, 4), textcoords='offset points')
# Titles and formatting
plt.title("Distribution of loans by country - Train vs Test (%)")
plt.xlabel("Country (country_id)")
plt.ylabel("Proportion (%)")
plt.xticks(rotation=45)
plt.legend(title="Dataset")
plt.tight_layout()
plt.show()
A strong geographic imbalance appears between the training and test sets. Kenya largely dominates the training set, while Ghana does not appear in it at all. This geographic bias must be taken into account to avoid disappointing model performance on data from other countries.
This mismatch can cause generalization problems: a model trained only on Kenyan data may not perform well on data from other countries such as Ghana.
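To put a number on this shift rather than reading it off the bar chart, a simple cross-tabulation can be used. This is an illustrative sketch that assumes the `combined` frame built above.
# Illustrative: share of each country within train and test (column-normalised, in %)
country_shift = pd.crosstab(combined['country_id'], combined['dataset'], normalize='columns') * 100
print(country_shift.round(1))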
Label (target) counts in the training set
plt.figure(figsize=(6, 4))
sns.countplot(data=train, x='target', palette='Set2')
plt.xticks([0, 1], ['Repaid', 'Not repaid'])
plt.title("Repayment distribution (train)")
plt.xlabel("Repayment status")
plt.ylabel("Number of loans")
plt.tight_layout()
plt.show()
The label distribution in the training set is heavily imbalanced, with about 98% repaid loans (class 0) and only about 2% defaults (class 1). Defaults, the minority class, are therefore strongly under-represented.
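The imbalance can be expressed as a simple ratio, which is also the usual starting point for the scale_pos_weight parameter used further down. The snippet below is a small illustrative sketch, not the author's code.
# Illustrative: class frequencies and the naive scale_pos_weight = n_negative / n_positive
class_counts = train['target'].value_counts()
print(class_counts, (class_counts / len(train) * 100).round(2), sep='\n')
print(f"Naive scale_pos_weight: {class_counts[0] / class_counts[1]:.1f}")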
Distribution of loan types in train vs test
plt.figure(figsize=(10, 6))
# Note: the column created earlier is named 'dataset', not 'set'
loan_type_counts = pd.concat([train_copy[['loan_type', 'dataset']], test_copy[['loan_type', 'dataset']]])
sns.countplot(data=loan_type_counts, y='loan_type', hue='dataset', palette='pastel')
plt.title("Distribution of loan types (train vs test)")
plt.xlabel("Number of occurrences (log scale)")
plt.ylabel("Loan type")
plt.legend(title="Dataset")
plt.xscale('log')
plt.tight_layout()
plt.show()
Comparing the loan type (loan_type) counts between the training and test sets reveals that:
- Type_1 is by far the most frequent loan type in the training set.
- The test set also shows a strong predominance of Type_1, although at a somewhat lower level than in the training set.
- The other loan types (Type_2, Type_3, ...) are far less represented, with very low or near-zero counts.
Moreover, the overall distribution is not fully consistent between train and test: some lenders appear in the test data despite being absent from the training data (the quick check below makes this concrete).
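As a quick verification of that claim, the following illustrative snippet (not part of the original notebook) lists the loan_type values and counts the lender_id values that appear in the test set but never in the training set.
# Illustrative: categories present in test but never seen in train
unseen_loan_types = set(test['loan_type']) - set(train['loan_type'])
unseen_lenders = set(test['lender_id']) - set(train['lender_id'])
print(f"loan_type values only in test: {sorted(unseen_loan_types)}")
print(f"Number of lender_id values only in test: {len(unseen_lenders)}")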
Lender (lender_id) counts in train vs test
top_lenders = train_copy['lender_id'].value_counts().nlargest(10).index
filtered_train = train_copy[train_copy['lender_id'].isin(top_lenders)]
filtered_test = test_copy[test_copy['lender_id'].isin(top_lenders)]
lender_counts = pd.concat([filtered_train[['lender_id', 'dataset']], filtered_test[['lender_id', 'dataset']]])
plt.figure(figsize=(10, 6))
sns.countplot(data=lender_counts, y='lender_id', hue='dataset', palette='coolwarm')
plt.title("Top lenders in train vs test")
plt.xlabel("Number of occurrences (log scale)")
plt.ylabel("Lender identifier")
plt.xscale('log')
plt.tight_layout()
plt.show()
Comparing lender identifier (lender_id) counts between the training and test sets shows that:
- Lender_267278 is by far the most frequent lender in the training set. It remains the most solicited lender in the test set as well, but at a much lower level than in train.
- The other lenders (Lender_251804, Lender_245684, ...) have very low or near-zero counts, suggesting these actors play a minor role.
- The overall lender distribution appears consistent between train and test, although the gap is pronounced for the main lender (Lender_267278).
Distribution of the total loan amount (Total_Amount)
# Keep target on the train data so it can be used for colouring
train_with_label = train.copy()
train_with_label['dataset'] = 'Train'
test_with_label = test.copy()
test_with_label['dataset'] = 'Test'
combined_amount = pd.concat([
train_with_label[['Total_Amount', 'dataset', 'target']],
test_with_label[['Total_Amount', 'dataset']]
])
# Convert target into a readable label
combined_amount['target_label'] = combined_amount['target'].map({0: 'Repaid', 1: 'Not repaid'})
plt.figure(figsize=(8,6))
sns.boxplot(data=combined_amount, x='dataset', y='Total_Amount', hue='target_label', palette='Set2')
sns.stripplot(data=combined_amount, x='dataset', y='Total_Amount', jitter=True, alpha=0.6, color='gray', size=3)
plt.title("Distribution du montant total du pret (Total_Amount) - Train vs Test")
plt.xlabel("Jeu de donnees")
plt.ylabel("Montant total du pret")
plt.legend(title="Statut du pret (Train uniquement)")
plt.yscale('log') # Pour mieux visualiser les ecarts
plt.tight_layout()
plt.show()
The distribution of total amounts is fairly similar across the train and test samples. Values are concentrated within a few orders of magnitude, with most loans being relatively small; a few very large outliers exist, but they are rare.
However, a notable difference emerges when looking at the split between 'repaid' and 'not repaid': defaults become markedly more prevalent for larger amounts.
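That visual impression can be checked with a simple aggregation. The sketch below is illustrative (not from the original notebook) and compares amount quantiles by repayment status on the training set.
# Illustrative: Total_Amount quantiles by repayment status
print(train.groupby('target')['Total_Amount'].describe(percentiles=[0.5, 0.9])[['count', '50%', '90%', 'max']])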
Distribution of the loan duration (duration)
combined_duration = pd.concat([
train_with_label[['duration', 'dataset', 'target']],
test_with_label[['duration', 'dataset']]
])
combined_duration['target_label'] = combined_duration['target'].map({0: 'Repaid', 1: 'Not repaid'})
plt.figure(figsize=(8,6))
sns.boxplot(data=combined_duration, x='dataset', y='duration', palette='Set2')
sns.stripplot(data=combined_duration, x='dataset', y='duration', jitter=True, alpha=0.6, color='gray', size=3)
plt.title("Distribution of the loan duration (duration) - Train vs Test")
plt.xlabel("Dataset")
plt.ylabel("Loan duration (days)")
plt.tight_layout()
plt.show()
The loan duration distributions are similar across train and test. Values cluster around 200 days (roughly six months), and a few very long durations appear as rare outliers.
Total_Amount vs Total_Amount_to_Repay (train vs test)
import matplotlib.pyplot as plt
import seaborn as sns
# Plot style
sns.set(style='whitegrid')
# Figure with 2 rows (stacked vertically)
fig, axes = plt.subplots(2, 1, figsize=(10, 12), sharex=True, sharey=True)
# Plot 1: TRAIN data, coloured by target
sns.scatterplot(
data=train,
x='Total_Amount',
y='Total_Amount_to_Repay',
hue='target',
palette={0: 'green', 1: 'red'},
alpha=0.7,
ax=axes[0]
)
axes[0].plot(
[train['Total_Amount'].min(), train['Total_Amount'].max()],
[train['Total_Amount'].min(), train['Total_Amount'].max()],
color='blue', linestyle='--', label='equality line'
)
axes[0].set_title("Train: amount to repay vs amount disbursed")
axes[0].set_xlabel("")
axes[0].set_ylabel("Total_Amount_to_Repay (log)")
axes[0].set_xscale('log')
axes[0].set_yscale('log')
axes[0].legend(title='Status (target)', labels=['Repaid (0)', 'Not repaid (1)'])
# Plot 2: TEST data, without target
sns.scatterplot(
data=test,
x='Total_Amount',
y='Total_Amount_to_Repay',
color='gray',
alpha=0.4,
ax=axes[1]
)
axes[1].plot(
[test['Total_Amount'].min(), test['Total_Amount'].max()],
[test['Total_Amount'].min(), test['Total_Amount'].max()],
color='blue', linestyle='--', label='equality line'
)
axes[1].set_title("Test: amount to repay vs amount disbursed")
axes[1].set_xlabel("Total_Amount (log)")
axes[1].set_ylabel("Total_Amount_to_Repay (log)")
axes[1].set_xscale('log')
axes[1].set_yscale('log')
axes[1].legend(title='Test data', labels=['samples', 'equality line'])
plt.tight_layout()
plt.show()
Most points follow an upward trend close to the equality line (y = x), indicating that the amount to repay is generally close to the amount disbursed. The red points (not repaid) sit above the equality line, suggesting comparatively larger amounts to repay. In contrast, the green points (repaid) are generally scattered below the equality line, with concentrations around it. The same dispersion pattern is visible in the test sample.
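The same observation can be summarised with the ratio of the amount to repay over the amount disbursed (the repayment_ratio feature engineered later). The snippet below is an illustrative sketch that guards against division by zero, which the raw ratio would not.
# Illustrative: implied repayment ratio by target (Total_Amount == 0 treated as missing)
ratio = train['Total_Amount_to_Repay'] / train['Total_Amount'].replace(0, np.nan)
print(ratio.groupby(train['target']).median())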
Number of loans taken vs repaid per customer
# Count the number of loans per customer
loan_count_train = train.groupby('customer_id').size().reset_index(name='total_loans')
# Count the number of repaid loans per customer
repaid_loans = train[train['target'] == 0].groupby('customer_id').size().reset_index(name='repaid_loans')
# Merge the two (customers with no repaid loan get 0)
loan_stats = pd.merge(loan_count_train, repaid_loans, on='customer_id', how='left')
loan_stats['repaid_loans'] = loan_stats['repaid_loans'].fillna(0)
loan_stats['defaulted_loans'] = loan_stats['total_loans'] - loan_stats['repaid_loans']
# Add a column flagging whether the customer's last loan is in default
last_loan_status = train.groupby('customer_id')['target'].last().reset_index()
last_loan_status.columns = ['customer_id', 'last_loan_default']
loan_stats = pd.merge(loan_stats, last_loan_status, on='customer_id', how='inner')
plt.figure(figsize=(10,6))
scatter = sns.scatterplot(
data=loan_stats,
x='total_loans',
y='repaid_loans',
size='defaulted_loans',
sizes=(20, 300),
hue='last_loan_default',
palette='Set2',
alpha=0.7,
legend='brief'
)
plt.title("Nombre total de prets vs Prets rembourses\n(Taille = Nombre de defauts, Couleur = Defaut sur le dernier pret)")
plt.xlabel("Nombre (log) total de prets")
plt.xscale('log')
plt.yscale('log')
plt.ylabel("Nombre (log) de prets rembourses")
plt.grid(True)
plt.legend(title="Dernier pret", loc='lower right')
plt.tight_layout()
plt.show()
- The green points (last loan repaid) are mostly concentrated on or near the diagonal, indicating customers who generally managed their loans well and repaid most of them.
- The orange points (last loan not repaid) often sit below the diagonal, indicating customers who took out several loans but did not manage to repay all of them.
- Some of these points are also larger, meaning those customers have already had several defaults in the past.
There is a general tendency for customers who took out more loans to have also repaid more loans, suggesting a positive correlation between the total number of loans and the number of repaid loans.
Customers with a good repayment history (green points) generally follow a stable trajectory, with a number of repaid loans close to their total number of loans. Customers with many defaults (larger points) are more likely to exhibit risky behaviour, such as not repaying their most recent loan.
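To back this up numerically, the per-customer statistics built above can be summarised directly. This is an illustrative sketch based on the loan_stats frame.
# Illustrative: correlation between loans taken and loans repaid, and share of customers with at least one default
print(f"Correlation total_loans vs repaid_loans: {loan_stats['total_loans'].corr(loan_stats['repaid_loans']):.3f}")
print(f"Customers with at least one default: {(loan_stats['defaulted_loans'] > 0).mean() * 100:.1f}%")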
Feature Engineering¶
# Combine the datasets so features are processed consistently
data = pd.concat([train, test]).reset_index(drop=True)
# Convert date columns to datetime
data['disbursement_date'] = pd.to_datetime(data['disbursement_date'], errors='coerce')
data['due_date'] = pd.to_datetime(data['due_date'], errors='coerce')
# When the total amount to repay is 0, replace it with the lender portion to be repaid
zero_repay = data['Total_Amount_to_Repay'] == 0
data.loc[zero_repay, 'Total_Amount_to_Repay'] = data.loc[zero_repay, 'Lender_portion_to_be_repaid']
# Compute per-customer mean and median of the amount to repay
aggregates = data.groupby('customer_id')['Total_Amount_to_Repay'].agg(['mean', 'median']).reset_index()
aggregates.rename(columns={'mean': 'Mean_Total_Amount', 'median': 'Median_Total_Amount'}, inplace=True)
data = data.merge(aggregates, on='customer_id', how='left')
# Extract temporal features from the dates
date_cols = ['disbursement_date', 'due_date']
for col in date_cols:
    data[col] = pd.to_datetime(data[col])
    # Extract month, day and year
    data[col+'_month'] = data[col].dt.month
    data[col+'_day'] = data[col].dt.day
    data[col+'_year'] = data[col].dt.year
# Loan term in days and weekday features
data['loan_term_days'] = (data['due_date'] - data['disbursement_date']).dt.days
data['disbursement_weekday'] = data['disbursement_date'].dt.weekday
data['due_weekday'] = data['due_date'].dt.weekday
# Financial ratios and log transforms
data['repayment_ratio'] = data['Total_Amount_to_Repay'] / data['Total_Amount']
data['amount_due_per_day'] = (data['Total_Amount_to_Repay'] / data['duration'])
data['log_Total_Amount'] = np.log1p(data['Total_Amount'])
data['log_Total_Amount_to_Repay'] = np.log1p(data['Total_Amount_to_Repay'])
data['log_Amount_Funded_By_Lender'] = np.log1p(data['Amount_Funded_By_Lender'])
data['log_Lender_portion_to_be_repaid'] = np.log1p(data['Lender_portion_to_be_repaid'])
data['amount_to_repay_greater_than_average'] = data['Mean_Total_Amount'] - data['Total_Amount_to_Repay']
# Outliers were observed in the total amount and total amount to repay.
# We mitigate this by capping both at their 90th percentile.
q = 0.9
data['Total_Amount_to_Repay'] = np.where(data['Total_Amount_to_Repay'] >= data['Total_Amount_to_Repay'].quantile(q), data['Total_Amount_to_Repay'].quantile(q), data['Total_Amount_to_Repay'])
data['Total_Amount'] = np.where(data['Total_Amount'] >= data['Total_Amount'].quantile(q), data['Total_Amount'].quantile(q), data['Total_Amount'])
# Categorical variables
cat_cols = data.select_dtypes(include='object').columns
# Label-encode the categorical variables (except loan_type and ID)
le = LabelEncoder()
for col in [col for col in cat_cols if col not in ['loan_type', 'ID']]:
    data[col] = le.fit_transform(data[col])
# Split back into train and test
train_df = (data[data['ID'].isin(train['ID'].unique())]).fillna(0)
test_df = (data[data['ID'].isin(test['ID'].unique())]).fillna(0)
# Define the features used for modelling
features_for_modelling = [col for col in train_df.columns if col not in date_cols + ['ID', 'target', 'country_id', 'loan_type']]
print(f"La forme de train_df est : {train_df.shape}")
print(f"La forme de test_df est : {test_df.shape}")
print(f"La forme de train est : {train.shape}")
print(f"La forme de test est : {test.shape}")
print(f"Les caractéristiques pour la modélisation sont :\n{features_for_modelling}")
train_df shape: (68654, 34)
test_df shape: (18594, 34)
train shape: (68654, 16)
test shape: (18594, 15)
Features used for modelling:
['customer_id', 'tbl_loan_id', 'lender_id', 'Total_Amount', 'Total_Amount_to_Repay', 'duration', 'New_versus_Repeat', 'Amount_Funded_By_Lender', 'Lender_portion_Funded', 'Lender_portion_to_be_repaid', 'Mean_Total_Amount', 'Median_Total_Amount', 'disbursement_date_month', 'disbursement_date_day', 'disbursement_date_year', 'loan_term_days', 'disbursement_weekday', 'due_weekday', 'due_date_month', 'due_date_day', 'due_date_year', 'repayment_ratio', 'amount_due_per_day', 'log_Total_Amount', 'log_Total_Amount_to_Repay', 'log_Amount_Funded_By_Lender', 'log_Lender_portion_to_be_repaid', 'amount_to_repay_greater_than_average']
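Since repayment_ratio and amount_due_per_day involve divisions, a quick sanity check for NaNs or infinities left in the modelling features can help before training. This is an illustrative sketch, not part of the original notebook.
# Illustrative sanity check: NaN or infinite values remaining in the modelling features
feat_frame = train_df[features_for_modelling]
print(f"NaN values: {feat_frame.isna().sum().sum()}")
print(f"Infinite values: {np.isinf(feat_frame.to_numpy(dtype=float)).sum()}")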
cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=42)
Model training and optimisation¶
In this step we train and tune three machine learning models: XGBoost, LightGBM and CatBoost. The goal is to adjust their hyperparameters and then combine the three models in an ensemble.
For each model we run a hyperparameter search to select the values that maximise performance while limiting overfitting.
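Given the strong class imbalance, one possible variant (an assumption on my part, not what the grid searches below do) is to make the cross-validated scorer explicit, for instance F1 on the minority class. The parameter grid here is a deliberately small illustrative example.
# Illustrative variant: a grid search scored explicitly with F1 (the searches below use the default scorer)
from sklearn.model_selection import GridSearchCV

example_search = GridSearchCV(
    estimator=LGBMClassifier(random_state=42),
    param_grid={'n_estimators': [100, 200], 'max_depth': [6, 8]},  # hypothetical, reduced grid
    scoring='f1',   # optimise F1 on the positive (default) class
    cv=cv,
    n_jobs=-1,
)
# example_search.fit(X, y)  # X and y are defined in the next cell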
from sklearn.model_selection import GridSearchCV, train_test_split
# Random seed
seed = 42
# Data
X, y = train_df[features_for_modelling], train_df['target']
# Split into training and test sets (stratified because of the class imbalance)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42, stratify=y)
# XGBoost (classifier, consistent with the other models)
xgb_model = xgb.XGBClassifier(random_state=seed)
xgb_param_grid = {
'n_estimators': [100,200,250],
'max_depth': [5,6,7],
'learning_rate': [0.1, 0.09, 0.01],
'subsample': [0.91, 0.92, 0.93],
'colsample_bytree': [0.68, 0.69, 0.7],
'gamma': [0.01, 0.005, 0.015],
'min_child_weight': [1]
}
xgb_grid_search = GridSearchCV(estimator=xgb_model, param_grid=xgb_param_grid, cv=cv, verbose=1, n_jobs=-1)
xgb_grid_search.fit(X_train, y_train)
# LightGBM
lgb_model = LGBMClassifier(random_state=seed)
lgb_param_grid = {
'n_estimators': [100,200,250],
'max_depth': [6,7,8],
'learning_rate': [0.1, 0.075, 0.07,0.05],
'num_leaves': [70, 75, 80],
'feature_fraction': [0.66, 0.65, 0.67],
'bagging_fraction': [0.66, 0.65, 0.67],
'lambda_l1': [0.05, 0.06, 0.07],
'lambda_l2': [0.85, 0.86, 0.87]
}
lgb_grid_search = GridSearchCV(estimator=lgb_model, param_grid=lgb_param_grid, cv=cv, verbose=1, n_jobs=-1)
lgb_grid_search.fit(X_train, y_train)
# CatBoost - model and parameter grid
cat_model = CatBoostClassifier(silent=True, random_state=seed)
cat_param_grid = {
'iterations': [500],
'depth': [6,7,8],
'learning_rate': [0.1, 0.09, 0.01]
}
cat_grid_search = GridSearchCV(estimator=cat_model, param_grid=cat_param_grid, cv=cv, verbose=1, n_jobs=-1)
cat_grid_search.fit(X_train, y_train)
# Show the best parameters found
print("Best parameters for XGBoost: ", xgb_grid_search.best_params_)
print("Best parameters for LightGBM: ", lgb_grid_search.best_params_)
print("Best parameters for CatBoost: ", cat_grid_search.best_params_)
xgb_best_params= {'n_estimators': 250, 'max_depth': 7, 'learning_rate': 0.1, 'subsample': 0.91, 'colsample_bytree': 0.68, 'gamma': 0.01, 'min_child_weight': 1}
lgb_best_params= {'n_estimators': 250, 'max_depth': 8, 'learning_rate': 0.07, 'num_leaves': 70, 'feature_fraction': 0.66, 'bagging_fraction': 0.66, 'lambda_l1': 0.06, 'lambda_l2': 0.87}
cat_best_params= {'iterations': 500, 'depth': 7, 'learning_rate': 0.1}
import warnings
from sklearn.metrics import classification_report, f1_score, confusion_matrix, roc_curve, auc
from imblearn.over_sampling import BorderlineSMOTE
from scipy.stats import mode
from sklearn.metrics import roc_auc_score
warnings.filterwarnings('ignore', category=UserWarning, module='lightgbm')
# Cross-validation to evaluate model performance
cv_reports = []
f1_scores = []
roc_auc_scores = []
predictions = []
pred_prob = []
base_estimator = DecisionTreeClassifier()
# Loop over each cross-validation fold
for fold, (train_idx, valid_idx) in enumerate(cv.split(X, y), 1):
    # Training and validation data for this fold
    X_fold_train = X.iloc[train_idx]
    X_fold_valid = X.iloc[valid_idx]
    y_fold_train = y.iloc[train_idx]
    y_fold_valid = y.iloc[valid_idx]
    # SMOTE to generate synthetic minority-class samples
    smote = BorderlineSMOTE(sampling_strategy=0.45, random_state=seed)
    # Apply SMOTE only to the training data of this fold
    X_fold_train_smote, y_fold_train_smote = smote.fit_resample(X_fold_train, y_fold_train)
    # Compute scale_pos_weight to compensate for the class imbalance
    scale_pos_weight = len(y_fold_train_smote[y_fold_train_smote == 1]) * 0.75 / len(y_fold_train_smote[y_fold_train_smote == 0])
    # Initialise the models with the optimal parameters and silence warnings
    model1 = xgb.XGBClassifier(**xgb_best_params, scale_pos_weight=scale_pos_weight, random_state=seed, verbosity=0)
    model2 = LGBMClassifier(**lgb_best_params, scale_pos_weight=scale_pos_weight, random_state=seed, verbosity=-1)
    model3 = CatBoostClassifier(**cat_best_params, scale_pos_weight=scale_pos_weight, random_state=seed, silent=True)
    # Train the models on the SMOTE-resampled fold
    model1.fit(X_fold_train_smote, y_fold_train_smote)
    model2.fit(X_fold_train_smote, y_fold_train_smote)
    model3.fit(X_fold_train_smote, y_fold_train_smote)
    # Predict on the validation set
    pred_1 = model1.predict(X_fold_valid)
    pred_2 = model2.predict(X_fold_valid)
    pred_3 = model3.predict(X_fold_valid)
    # Combine the predictions by majority vote (mode)
    predictions = mode([pred_1, pred_2, pred_3], axis=0).mode.flatten()
    # Classification report for this fold
    report = classification_report(y_fold_valid, predictions, output_dict=True)
    cv_reports.append(report)
    # Compute and store the F1 score for this fold
    f1 = f1_score(y_fold_valid, predictions)
    f1_scores.append(f1)
    # Confusion matrix
    cm = confusion_matrix(y_fold_valid, predictions)
    # Percentages per true class (row-wise normalisation)
    cm_sum = cm.sum(axis=1, keepdims=True)
    cm_perc = (cm / cm_sum.astype(float)) * 100
    # Annotation matrix combining counts and percentages
    labels = np.empty_like(cm, dtype=object)
    for i in range(cm.shape[0]):
        for j in range(cm.shape[1]):
            labels[i, j] = f"{cm[i,j]}\n({cm_perc[i,j]:.1f}%)"
    # Figure with both plots on the same row
    fig, axes = plt.subplots(1, 2, figsize=(14, 5))  # 1 row, 2 columns
    # Confusion matrix heatmap
    sns.heatmap(cm, annot=labels, fmt="", cmap="Blues", ax=axes[0])
    axes[0].set_xlabel("Predicted class")
    axes[0].set_ylabel("True class")
    axes[0].set_title(f"Confusion matrix - fold {fold}")
    # ROC curve (LightGBM probabilities)
    fpr, tpr, thresholds = roc_curve(y_fold_valid, model2.predict_proba(X_fold_valid)[:, 1])
    roc_auc = auc(fpr, tpr)
    roc_auc_scores.append(roc_auc)
    axes[1].plot(fpr, tpr, color='darkorange', lw=2, label=f'ROC curve (AUC = {roc_auc:.2f})')
    axes[1].plot([0, 1], [0, 1], color='navy', lw=2, linestyle='--')
    axes[1].set_xlim([0.0, 1.0])
    axes[1].set_ylim([0.0, 1.05])
    axes[1].set_xlabel('False positive rate')
    axes[1].set_ylabel('True positive rate')
    axes[1].set_title(f'ROC curve - fold {fold}')
    axes[1].legend(loc='lower right')
    # Save the figure
    plt.tight_layout()
    plt.savefig(f"output/Fold_{fold}_Confusion_ROC.svg", format="svg")
    plt.show()
# Cross-validation summary
print(f"Mean F1 score across all folds: {np.mean(f1_scores):.4f}")
Mean F1 score across all folds: 0.8852
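The per-fold classification reports stored in cv_reports can also be aggregated to see how the voting ensemble behaves on the minority (default) class specifically. The snippet below is an illustrative sketch.
# Illustrative: average precision/recall/F1 for the default class (label '1') across folds
for metric in ['precision', 'recall', 'f1-score']:
    values = [report['1'][metric] for report in cv_reports]
    print(f"Class 1 {metric}: {np.mean(values):.3f} (+/- {np.std(values):.3f})")
print(f"Mean ROC AUC (LightGBM): {np.mean(roc_auc_scores):.3f}")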
Predicting on the test data with the models¶
X, y = train_df[features_for_modelling], train_df['target']
predictions = []
pred_prob = []
base_estimator = DecisionTreeClassifier()
# Loop over each cross-validation fold
for fold, (train_idx, valid_idx) in enumerate(cv.split(X, y), 1):
    # Training and validation data for this fold
    X_fold_train = X.iloc[train_idx]
    X_fold_valid = X.iloc[valid_idx]
    y_fold_train = y.iloc[train_idx]
    y_fold_valid = y.iloc[valid_idx]
    # SMOTE to generate synthetic minority-class samples
    smote = BorderlineSMOTE(sampling_strategy=0.45, random_state=seed)
    # Apply SMOTE only to the training data of this fold
    X_fold_train_smote, y_fold_train_smote = smote.fit_resample(X_fold_train, y_fold_train)
    # Compute scale_pos_weight to compensate for the class imbalance
    scale_pos_weight = len(y_fold_train_smote[y_fold_train_smote == 1]) * 0.75 / len(y_fold_train_smote[y_fold_train_smote == 0])
    # Initialise the models with the optimal parameters
    model1 = xgb.XGBClassifier(**xgb_best_params, scale_pos_weight=scale_pos_weight, random_state=seed)
    model2 = LGBMClassifier(**lgb_best_params, scale_pos_weight=scale_pos_weight, random_state=seed)
    model3 = CatBoostClassifier(**cat_best_params, scale_pos_weight=scale_pos_weight, random_state=seed, silent=True)  # silent=True keeps the CatBoost training log out of the output
    # Train the models on the SMOTE-resampled fold
    model1.fit(X_fold_train_smote, y_fold_train_smote)
    model2.fit(X_fold_train_smote, y_fold_train_smote)
    model3.fit(X_fold_train_smote, y_fold_train_smote)
    # Predict on the test set with each model
    pred_1 = model1.predict(test_df[features_for_modelling])
    pred_2 = model2.predict(test_df[features_for_modelling])
    pred_3 = model3.predict(test_df[features_for_modelling])
    pred_p_1 = model1.predict_proba(test_df[features_for_modelling])
    pred_p_2 = model2.predict_proba(test_df[features_for_modelling])
    pred_p_3 = model3.predict_proba(test_df[features_for_modelling])
    # Store the predictions
    predictions.append(pred_1)
    predictions.append(pred_2)
    predictions.append(pred_3)
    pred_prob.append(pred_p_1)
    pred_prob.append(pred_p_2)
    pred_prob.append(pred_p_3)
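The notebook output ends here. As a hedged illustration of how the stored predictions could be turned into a submission, one option (an assumption on my part, mirroring the in-fold majority vote used earlier) is to take the mode over the 12 stored prediction vectors; the file name is hypothetical.
# Illustrative only: majority vote over the 12 prediction vectors (4 folds x 3 models)
final_pred = mode(np.vstack(predictions), axis=0).mode.flatten()
submission = pd.DataFrame({'ID': test_df['ID'].values, 'target': final_pred})
submission.to_csv('submission.csv', index=False)  # hypothetical output file name
print(submission['target'].value_counts())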
learn: 0.0090901 total: 12.1s remaining: 36.7s 124: learn: 0.0090138 total: 12.2s remaining: 36.6s 125: learn: 0.0089878 total: 12.3s remaining: 36.4s 126: learn: 0.0089273 total: 12.4s remaining: 36.4s 127: learn: 0.0088942 total: 12.5s remaining: 36.2s 128: learn: 0.0088426 total: 12.5s remaining: 36s 129: learn: 0.0087965 total: 12.6s remaining: 35.9s 130: learn: 0.0087097 total: 12.7s remaining: 35.8s 131: learn: 0.0086649 total: 12.8s remaining: 35.7s 132: learn: 0.0086400 total: 12.9s remaining: 35.6s 133: learn: 0.0085938 total: 13s remaining: 35.4s 134: learn: 0.0085359 total: 13.1s remaining: 35.3s 135: learn: 0.0084542 total: 13.1s remaining: 35.2s 136: learn: 0.0083965 total: 13.2s remaining: 35.1s 137: learn: 0.0083098 total: 13.3s remaining: 35s 138: learn: 0.0082896 total: 13.4s remaining: 34.8s 139: learn: 0.0082240 total: 13.5s remaining: 34.7s 140: learn: 0.0081809 total: 13.6s remaining: 34.6s 141: learn: 0.0081384 total: 13.7s remaining: 34.6s 142: learn: 0.0081001 total: 13.9s remaining: 34.7s 143: learn: 0.0080430 total: 14s remaining: 34.5s 144: learn: 0.0079824 total: 14s remaining: 34.4s 145: learn: 0.0079395 total: 14.1s remaining: 34.3s 146: learn: 0.0078435 total: 14.2s remaining: 34.2s 147: learn: 0.0078269 total: 14.3s remaining: 34.1s 148: learn: 0.0077462 total: 14.4s remaining: 34s 149: learn: 0.0076923 total: 14.5s remaining: 33.9s 150: learn: 0.0076631 total: 14.6s remaining: 33.7s 151: learn: 0.0075777 total: 14.7s remaining: 33.6s 152: learn: 0.0075453 total: 14.8s remaining: 33.5s 153: learn: 0.0075101 total: 14.8s remaining: 33.4s 154: learn: 0.0074896 total: 14.9s remaining: 33.2s 155: learn: 0.0074311 total: 15s remaining: 33.1s 156: learn: 0.0074127 total: 15.1s remaining: 33s 157: learn: 0.0073579 total: 15.2s remaining: 32.9s 158: learn: 0.0073412 total: 15.3s remaining: 32.8s 159: learn: 0.0072688 total: 15.4s remaining: 32.7s 160: learn: 0.0071767 total: 15.5s remaining: 32.6s 161: learn: 0.0071383 total: 15.6s remaining: 32.5s 162: learn: 0.0071003 total: 15.6s remaining: 32.3s 163: learn: 0.0070643 total: 15.7s remaining: 32.2s 164: learn: 0.0069833 total: 15.8s remaining: 32.1s 165: learn: 0.0069562 total: 15.9s remaining: 32s 166: learn: 0.0069249 total: 16s remaining: 31.9s 167: learn: 0.0068592 total: 16.1s remaining: 31.8s 168: learn: 0.0067955 total: 16.1s remaining: 31.6s 169: learn: 0.0067663 total: 16.2s remaining: 31.5s 170: learn: 0.0066932 total: 16.3s remaining: 31.4s 171: learn: 0.0066540 total: 16.4s remaining: 31.3s 172: learn: 0.0066268 total: 16.5s remaining: 31.2s 173: learn: 0.0065820 total: 16.6s remaining: 31.1s 174: learn: 0.0065694 total: 16.7s remaining: 31s 175: learn: 0.0064827 total: 16.8s remaining: 30.8s 176: learn: 0.0064539 total: 16.9s remaining: 30.8s 177: learn: 0.0064354 total: 16.9s remaining: 30.6s 178: learn: 0.0064255 total: 17s remaining: 30.5s 179: learn: 0.0064149 total: 17.1s remaining: 30.4s 180: learn: 0.0063848 total: 17.2s remaining: 30.3s 181: learn: 0.0063562 total: 17.3s remaining: 30.2s 182: learn: 0.0063376 total: 17.4s remaining: 30.1s 183: learn: 0.0063083 total: 17.5s remaining: 30s 184: learn: 0.0062991 total: 17.5s remaining: 29.9s 185: learn: 0.0062605 total: 17.6s remaining: 29.8s 186: learn: 0.0062371 total: 17.7s remaining: 29.7s 187: learn: 0.0061951 total: 17.8s remaining: 29.6s 188: learn: 0.0061165 total: 17.9s remaining: 29.4s 189: learn: 0.0060968 total: 18s remaining: 29.4s 190: learn: 0.0060922 total: 18.1s remaining: 29.2s 191: learn: 0.0060607 total: 18.2s remaining: 29.1s 
192: learn: 0.0060126 total: 18.3s remaining: 29s 193: learn: 0.0059795 total: 18.4s remaining: 29.1s 194: learn: 0.0059347 total: 18.5s remaining: 29s 195: learn: 0.0059184 total: 18.7s remaining: 29s 196: learn: 0.0058968 total: 18.8s remaining: 28.9s 197: learn: 0.0058647 total: 18.9s remaining: 28.8s 198: learn: 0.0058459 total: 19s remaining: 28.7s 199: learn: 0.0058087 total: 19.1s remaining: 28.6s 200: learn: 0.0057874 total: 19.5s remaining: 29s 201: learn: 0.0057198 total: 19.9s remaining: 29.3s 202: learn: 0.0056991 total: 20s remaining: 29.2s 203: learn: 0.0056411 total: 20.1s remaining: 29.1s 204: learn: 0.0056280 total: 20.2s remaining: 29.1s 205: learn: 0.0056009 total: 20.4s remaining: 29.1s 206: learn: 0.0055877 total: 20.6s remaining: 29.1s 207: learn: 0.0055559 total: 20.7s remaining: 29s 208: learn: 0.0055030 total: 20.8s remaining: 28.9s 209: learn: 0.0054993 total: 20.9s remaining: 28.8s 210: learn: 0.0054837 total: 21s remaining: 28.7s 211: learn: 0.0054771 total: 21.1s remaining: 28.6s 212: learn: 0.0054709 total: 21.2s remaining: 28.5s 213: learn: 0.0054537 total: 21.3s remaining: 28.4s 214: learn: 0.0054320 total: 21.4s remaining: 28.3s 215: learn: 0.0053864 total: 21.5s remaining: 28.2s 216: learn: 0.0053479 total: 21.6s remaining: 28.2s 217: learn: 0.0053418 total: 21.7s remaining: 28.1s 218: learn: 0.0053121 total: 21.8s remaining: 28s 219: learn: 0.0052952 total: 21.9s remaining: 27.9s 220: learn: 0.0052788 total: 22s remaining: 27.8s 221: learn: 0.0052474 total: 22.1s remaining: 27.7s 222: learn: 0.0052080 total: 22.2s remaining: 27.6s 223: learn: 0.0051790 total: 22.3s remaining: 27.5s 224: learn: 0.0051666 total: 22.4s remaining: 27.4s 225: learn: 0.0051388 total: 22.5s remaining: 27.3s 226: learn: 0.0051299 total: 22.6s remaining: 27.2s 227: learn: 0.0051165 total: 22.7s remaining: 27.1s 228: learn: 0.0050850 total: 22.8s remaining: 27s 229: learn: 0.0050525 total: 22.9s remaining: 26.8s 230: learn: 0.0050309 total: 22.9s remaining: 26.7s 231: learn: 0.0049863 total: 23s remaining: 26.6s 232: learn: 0.0049576 total: 23.2s remaining: 26.6s 233: learn: 0.0049495 total: 23.3s remaining: 26.5s 234: learn: 0.0049366 total: 23.5s remaining: 26.5s 235: learn: 0.0049099 total: 23.7s remaining: 26.5s 236: learn: 0.0048737 total: 23.9s remaining: 26.6s 237: learn: 0.0048408 total: 24.3s remaining: 26.7s 238: learn: 0.0048307 total: 24.4s remaining: 26.7s 239: learn: 0.0047733 total: 24.7s remaining: 26.7s 240: learn: 0.0047401 total: 24.9s remaining: 26.8s 241: learn: 0.0047067 total: 25s remaining: 26.7s 242: learn: 0.0046901 total: 25.1s remaining: 26.6s 243: learn: 0.0046748 total: 25.2s remaining: 26.5s 244: learn: 0.0046565 total: 25.3s remaining: 26.3s 245: learn: 0.0046396 total: 25.4s remaining: 26.2s 246: learn: 0.0046317 total: 25.5s remaining: 26.1s 247: learn: 0.0046039 total: 25.7s remaining: 26.1s 248: learn: 0.0045882 total: 25.8s remaining: 26s 249: learn: 0.0045777 total: 25.9s remaining: 25.9s 250: learn: 0.0045592 total: 26.1s remaining: 25.9s 251: learn: 0.0045426 total: 26.3s remaining: 25.9s 252: learn: 0.0045134 total: 26.4s remaining: 25.8s 253: learn: 0.0044764 total: 26.6s remaining: 25.8s 254: learn: 0.0044418 total: 26.7s remaining: 25.7s 255: learn: 0.0044113 total: 26.9s remaining: 25.6s 256: learn: 0.0044057 total: 27s remaining: 25.5s 257: learn: 0.0043801 total: 27.1s remaining: 25.4s 258: learn: 0.0043332 total: 27.2s remaining: 25.3s 259: learn: 0.0043171 total: 27.3s remaining: 25.2s 260: learn: 0.0043096 total: 27.4s remaining: 
25.1s 261: learn: 0.0043051 total: 27.5s remaining: 25s 262: learn: 0.0042901 total: 27.6s remaining: 24.9s 263: learn: 0.0042602 total: 27.7s remaining: 24.7s 264: learn: 0.0042408 total: 27.8s remaining: 24.6s 265: learn: 0.0042150 total: 27.9s remaining: 24.5s 266: learn: 0.0042054 total: 27.9s remaining: 24.4s 267: learn: 0.0041775 total: 28s remaining: 24.3s 268: learn: 0.0041657 total: 28.1s remaining: 24.2s 269: learn: 0.0041568 total: 28.2s remaining: 24s 270: learn: 0.0041254 total: 28.3s remaining: 23.9s 271: learn: 0.0041115 total: 28.4s remaining: 23.8s 272: learn: 0.0041019 total: 28.5s remaining: 23.7s 273: learn: 0.0040978 total: 28.6s remaining: 23.6s 274: learn: 0.0040851 total: 28.7s remaining: 23.5s 275: learn: 0.0040785 total: 28.7s remaining: 23.3s 276: learn: 0.0040556 total: 28.8s remaining: 23.2s 277: learn: 0.0040372 total: 28.9s remaining: 23.1s 278: learn: 0.0040206 total: 29s remaining: 23s 279: learn: 0.0039899 total: 29.1s remaining: 22.9s 280: learn: 0.0039836 total: 29.2s remaining: 22.8s 281: learn: 0.0039776 total: 29.3s remaining: 22.6s 282: learn: 0.0039681 total: 29.4s remaining: 22.5s 283: learn: 0.0039585 total: 29.5s remaining: 22.4s 284: learn: 0.0039530 total: 29.5s remaining: 22.3s 285: learn: 0.0039268 total: 29.6s remaining: 22.2s 286: learn: 0.0039057 total: 29.7s remaining: 22.1s 287: learn: 0.0038953 total: 29.8s remaining: 21.9s 288: learn: 0.0038866 total: 29.9s remaining: 21.8s 289: learn: 0.0038775 total: 30s remaining: 21.7s 290: learn: 0.0038583 total: 30s remaining: 21.6s 291: learn: 0.0038537 total: 30.2s remaining: 21.5s 292: learn: 0.0038454 total: 30.2s remaining: 21.4s 293: learn: 0.0038208 total: 30.3s remaining: 21.2s 294: learn: 0.0038166 total: 30.4s remaining: 21.1s 295: learn: 0.0038142 total: 30.5s remaining: 21s 296: learn: 0.0038066 total: 30.6s remaining: 20.9s 297: learn: 0.0037952 total: 30.6s remaining: 20.8s 298: learn: 0.0037749 total: 30.7s remaining: 20.7s 299: learn: 0.0037748 total: 30.8s remaining: 20.5s 300: learn: 0.0037659 total: 30.9s remaining: 20.4s 301: learn: 0.0037623 total: 31s remaining: 20.3s 302: learn: 0.0037487 total: 31.1s remaining: 20.2s 303: learn: 0.0037206 total: 31.2s remaining: 20.1s 304: learn: 0.0036931 total: 31.3s remaining: 20s 305: learn: 0.0036910 total: 31.3s remaining: 19.9s 306: learn: 0.0036831 total: 31.4s remaining: 19.8s 307: learn: 0.0036573 total: 31.5s remaining: 19.6s 308: learn: 0.0036385 total: 31.6s remaining: 19.5s 309: learn: 0.0036266 total: 31.7s remaining: 19.4s 310: learn: 0.0035965 total: 31.8s remaining: 19.3s 311: learn: 0.0035800 total: 31.9s remaining: 19.2s 312: learn: 0.0035715 total: 31.9s remaining: 19.1s 313: learn: 0.0035634 total: 32s remaining: 19s 314: learn: 0.0035555 total: 32.1s remaining: 18.9s 315: learn: 0.0035254 total: 32.2s remaining: 18.8s 316: learn: 0.0035155 total: 32.3s remaining: 18.7s 317: learn: 0.0034926 total: 32.4s remaining: 18.5s 318: learn: 0.0034693 total: 32.5s remaining: 18.4s 319: learn: 0.0034576 total: 32.6s remaining: 18.3s 320: learn: 0.0034543 total: 32.6s remaining: 18.2s 321: learn: 0.0034399 total: 32.7s remaining: 18.1s 322: learn: 0.0034270 total: 32.8s remaining: 18s 323: learn: 0.0034206 total: 32.9s remaining: 17.9s 324: learn: 0.0033968 total: 33s remaining: 17.8s 325: learn: 0.0033734 total: 33.3s remaining: 17.8s 326: learn: 0.0033660 total: 33.5s remaining: 17.7s 327: learn: 0.0033534 total: 33.6s remaining: 17.6s 328: learn: 0.0033484 total: 33.7s remaining: 17.5s 329: learn: 0.0033424 total: 33.7s 
remaining: 17.4s 330: learn: 0.0033379 total: 33.8s remaining: 17.3s 331: learn: 0.0033231 total: 33.9s remaining: 17.2s 332: learn: 0.0033040 total: 34s remaining: 17s 333: learn: 0.0032963 total: 34.1s remaining: 16.9s 334: learn: 0.0032700 total: 34.2s remaining: 16.8s 335: learn: 0.0032666 total: 34.3s remaining: 16.7s 336: learn: 0.0032557 total: 34.4s remaining: 16.6s 337: learn: 0.0032430 total: 34.4s remaining: 16.5s 338: learn: 0.0032413 total: 34.5s remaining: 16.4s 339: learn: 0.0032232 total: 34.6s remaining: 16.3s 340: learn: 0.0031949 total: 34.7s remaining: 16.2s 341: learn: 0.0031896 total: 34.8s remaining: 16.1s 342: learn: 0.0031835 total: 34.9s remaining: 16s 343: learn: 0.0031752 total: 35s remaining: 15.9s 344: learn: 0.0031685 total: 35.1s remaining: 15.7s 345: learn: 0.0031671 total: 35.2s remaining: 15.6s 346: learn: 0.0031536 total: 35.2s remaining: 15.5s 347: learn: 0.0031460 total: 35.3s remaining: 15.4s 348: learn: 0.0031437 total: 35.4s remaining: 15.3s 349: learn: 0.0031383 total: 35.5s remaining: 15.2s 350: learn: 0.0031339 total: 35.6s remaining: 15.1s 351: learn: 0.0031276 total: 35.7s remaining: 15s 352: learn: 0.0031221 total: 35.7s remaining: 14.9s 353: learn: 0.0030955 total: 35.8s remaining: 14.8s 354: learn: 0.0030806 total: 35.9s remaining: 14.7s 355: learn: 0.0030747 total: 36s remaining: 14.6s 356: learn: 0.0030702 total: 36.1s remaining: 14.5s 357: learn: 0.0030552 total: 36.2s remaining: 14.4s 358: learn: 0.0030388 total: 36.3s remaining: 14.2s 359: learn: 0.0030322 total: 36.4s remaining: 14.1s 360: learn: 0.0030228 total: 36.5s remaining: 14.1s 361: learn: 0.0030040 total: 36.6s remaining: 14s 362: learn: 0.0029943 total: 36.7s remaining: 13.9s 363: learn: 0.0029734 total: 36.8s remaining: 13.8s 364: learn: 0.0029556 total: 36.9s remaining: 13.6s 365: learn: 0.0029421 total: 37s remaining: 13.5s 366: learn: 0.0029367 total: 37.1s remaining: 13.4s 367: learn: 0.0029296 total: 37.2s remaining: 13.3s 368: learn: 0.0029267 total: 37.3s remaining: 13.2s 369: learn: 0.0029184 total: 37.3s remaining: 13.1s 370: learn: 0.0029114 total: 37.4s remaining: 13s 371: learn: 0.0029114 total: 37.5s remaining: 12.9s 372: learn: 0.0029114 total: 37.6s remaining: 12.8s 373: learn: 0.0029016 total: 37.6s remaining: 12.7s 374: learn: 0.0029016 total: 37.7s remaining: 12.6s 375: learn: 0.0029016 total: 37.7s remaining: 12.4s 376: learn: 0.0029016 total: 37.8s remaining: 12.3s 377: learn: 0.0029016 total: 37.9s remaining: 12.2s 378: learn: 0.0029016 total: 38s remaining: 12.1s 379: learn: 0.0029016 total: 38s remaining: 12s 380: learn: 0.0029016 total: 38.1s remaining: 11.9s 381: learn: 0.0029016 total: 38.2s remaining: 11.8s 382: learn: 0.0029016 total: 38.2s remaining: 11.7s 383: learn: 0.0029016 total: 38.3s remaining: 11.6s 384: learn: 0.0029016 total: 38.4s remaining: 11.5s 385: learn: 0.0029016 total: 38.4s remaining: 11.3s 386: learn: 0.0029016 total: 38.5s remaining: 11.2s 387: learn: 0.0029016 total: 38.6s remaining: 11.1s 388: learn: 0.0029016 total: 38.6s remaining: 11s 389: learn: 0.0029016 total: 38.7s remaining: 10.9s 390: learn: 0.0029016 total: 38.8s remaining: 10.8s 391: learn: 0.0029016 total: 38.8s remaining: 10.7s 392: learn: 0.0029016 total: 38.9s remaining: 10.6s 393: learn: 0.0029016 total: 38.9s remaining: 10.5s 394: learn: 0.0029016 total: 39s remaining: 10.4s 395: learn: 0.0029016 total: 39.1s remaining: 10.3s 396: learn: 0.0029016 total: 39.2s remaining: 10.2s 397: learn: 0.0029016 total: 39.2s remaining: 10.1s 398: learn: 0.0029016 total: 
39.3s remaining: 9.95s 399: learn: 0.0029016 total: 39.4s remaining: 9.85s 400: learn: 0.0029016 total: 39.4s remaining: 9.74s 401: learn: 0.0029016 total: 39.6s remaining: 9.65s 402: learn: 0.0029016 total: 39.7s remaining: 9.55s 403: learn: 0.0029016 total: 39.9s remaining: 9.49s 404: learn: 0.0029016 total: 40.2s remaining: 9.42s 405: learn: 0.0029016 total: 40.4s remaining: 9.35s 406: learn: 0.0029016 total: 40.8s remaining: 9.32s 407: learn: 0.0029016 total: 41.1s remaining: 9.26s 408: learn: 0.0029016 total: 41.6s remaining: 9.26s 409: learn: 0.0029016 total: 41.9s remaining: 9.2s 410: learn: 0.0029016 total: 42.2s remaining: 9.14s 411: learn: 0.0029016 total: 42.3s remaining: 9.04s 412: learn: 0.0029016 total: 42.4s remaining: 8.93s 413: learn: 0.0029016 total: 42.5s remaining: 8.83s 414: learn: 0.0029016 total: 42.6s remaining: 8.72s 415: learn: 0.0029016 total: 42.6s remaining: 8.61s 416: learn: 0.0029016 total: 42.7s remaining: 8.5s 417: learn: 0.0029016 total: 42.8s remaining: 8.39s 418: learn: 0.0029016 total: 42.9s remaining: 8.29s 419: learn: 0.0029016 total: 42.9s remaining: 8.18s 420: learn: 0.0029016 total: 43s remaining: 8.07s 421: learn: 0.0029016 total: 43.1s remaining: 7.96s 422: learn: 0.0029016 total: 43.2s remaining: 7.86s 423: learn: 0.0029016 total: 43.2s remaining: 7.75s 424: learn: 0.0029016 total: 43.3s remaining: 7.64s 425: learn: 0.0029016 total: 43.4s remaining: 7.53s 426: learn: 0.0029016 total: 43.4s remaining: 7.43s 427: learn: 0.0029016 total: 43.5s remaining: 7.32s 428: learn: 0.0029016 total: 43.6s remaining: 7.21s 429: learn: 0.0029016 total: 43.7s remaining: 7.11s 430: learn: 0.0029016 total: 43.7s remaining: 7s 431: learn: 0.0029016 total: 43.8s remaining: 6.9s 432: learn: 0.0029016 total: 43.9s remaining: 6.79s 433: learn: 0.0029016 total: 44.2s remaining: 6.72s 434: learn: 0.0029016 total: 44.4s remaining: 6.63s 435: learn: 0.0029016 total: 44.7s remaining: 6.57s 436: learn: 0.0029016 total: 45.4s remaining: 6.54s 437: learn: 0.0029016 total: 46.1s remaining: 6.53s 438: learn: 0.0029016 total: 46.5s remaining: 6.46s 439: learn: 0.0029016 total: 46.8s remaining: 6.38s 440: learn: 0.0029016 total: 47s remaining: 6.29s 441: learn: 0.0029016 total: 47.1s remaining: 6.18s 442: learn: 0.0029016 total: 47.2s remaining: 6.07s 443: learn: 0.0029016 total: 47.2s remaining: 5.96s 444: learn: 0.0029016 total: 47.3s remaining: 5.85s 445: learn: 0.0029016 total: 47.4s remaining: 5.74s 446: learn: 0.0029016 total: 47.5s remaining: 5.63s 447: learn: 0.0029016 total: 47.6s remaining: 5.53s 448: learn: 0.0029016 total: 47.7s remaining: 5.42s 449: learn: 0.0029016 total: 47.8s remaining: 5.31s 450: learn: 0.0029016 total: 48.1s remaining: 5.23s 451: learn: 0.0029016 total: 48.4s remaining: 5.14s 452: learn: 0.0029016 total: 48.5s remaining: 5.04s 453: learn: 0.0029016 total: 48.6s remaining: 4.93s 454: learn: 0.0029016 total: 48.7s remaining: 4.82s 455: learn: 0.0029016 total: 48.8s remaining: 4.71s 456: learn: 0.0029016 total: 48.9s remaining: 4.61s 457: learn: 0.0029016 total: 49s remaining: 4.5s 458: learn: 0.0029016 total: 49.1s remaining: 4.38s 459: learn: 0.0029016 total: 49.2s remaining: 4.27s 460: learn: 0.0029016 total: 49.2s remaining: 4.16s 461: learn: 0.0029016 total: 49.3s remaining: 4.05s 462: learn: 0.0029016 total: 49.4s remaining: 3.94s 463: learn: 0.0029016 total: 49.4s remaining: 3.83s 464: learn: 0.0029016 total: 49.5s remaining: 3.73s 465: learn: 0.0029016 total: 49.6s remaining: 3.62s 466: learn: 0.0029016 total: 49.7s remaining: 3.51s 467: 
learn: 0.0029016 total: 49.9s remaining: 3.41s 468: learn: 0.0029016 total: 50.2s remaining: 3.32s 469: learn: 0.0029016 total: 50.4s remaining: 3.22s 470: learn: 0.0029016 total: 50.6s remaining: 3.11s 471: learn: 0.0029016 total: 50.8s remaining: 3.01s 472: learn: 0.0029016 total: 51s remaining: 2.91s 473: learn: 0.0029016 total: 51.2s remaining: 2.81s 474: learn: 0.0029016 total: 51.4s remaining: 2.71s 475: learn: 0.0029016 total: 51.6s remaining: 2.6s 476: learn: 0.0029016 total: 51.8s remaining: 2.5s 477: learn: 0.0029016 total: 51.9s remaining: 2.39s 478: learn: 0.0029016 total: 52.2s remaining: 2.29s 479: learn: 0.0029016 total: 52.3s remaining: 2.18s 480: learn: 0.0029016 total: 52.4s remaining: 2.07s 481: learn: 0.0029016 total: 52.6s remaining: 1.96s 482: learn: 0.0029016 total: 52.7s remaining: 1.86s 483: learn: 0.0029016 total: 52.9s remaining: 1.75s 484: learn: 0.0029016 total: 53s remaining: 1.64s 485: learn: 0.0029016 total: 53.2s remaining: 1.53s 486: learn: 0.0029016 total: 53.3s remaining: 1.42s 487: learn: 0.0029016 total: 53.6s remaining: 1.32s 488: learn: 0.0029016 total: 53.7s remaining: 1.21s 489: learn: 0.0029016 total: 53.7s remaining: 1.1s 490: learn: 0.0029016 total: 53.8s remaining: 986ms 491: learn: 0.0029016 total: 53.9s remaining: 876ms 492: learn: 0.0029016 total: 53.9s remaining: 766ms 493: learn: 0.0029016 total: 54s remaining: 656ms 494: learn: 0.0029016 total: 54.1s remaining: 546ms 495: learn: 0.0029016 total: 54.2s remaining: 437ms 496: learn: 0.0029016 total: 54.2s remaining: 327ms 497: learn: 0.0029016 total: 54.4s remaining: 218ms 498: learn: 0.0029016 total: 54.6s remaining: 109ms 499: learn: 0.0029016 total: 54.8s remaining: 0us 0: learn: 0.4116441 total: 163ms remaining: 1m 21s 1: learn: 0.2814952 total: 331ms remaining: 1m 22s 2: learn: 0.1939123 total: 630ms remaining: 1m 44s 3: learn: 0.1351963 total: 978ms remaining: 2m 1s 4: learn: 0.0998223 total: 1.14s remaining: 1m 52s 5: learn: 0.0807118 total: 1.31s remaining: 1m 47s 6: learn: 0.0684784 total: 1.44s remaining: 1m 41s 7: learn: 0.0590878 total: 1.58s remaining: 1m 37s 8: learn: 0.0543919 total: 1.74s remaining: 1m 34s 9: learn: 0.0502424 total: 2s remaining: 1m 38s 10: learn: 0.0472711 total: 2.2s remaining: 1m 37s 11: learn: 0.0456978 total: 2.5s remaining: 1m 41s 12: learn: 0.0427787 total: 2.7s remaining: 1m 41s 13: learn: 0.0404089 total: 2.9s remaining: 1m 40s 14: learn: 0.0384036 total: 3.06s remaining: 1m 38s 15: learn: 0.0366183 total: 3.29s remaining: 1m 39s 16: learn: 0.0352740 total: 3.63s remaining: 1m 43s 17: learn: 0.0341774 total: 3.83s remaining: 1m 42s 18: learn: 0.0330681 total: 4.14s remaining: 1m 44s 19: learn: 0.0324876 total: 4.34s remaining: 1m 44s 20: learn: 0.0312908 total: 4.57s remaining: 1m 44s 21: learn: 0.0304133 total: 4.72s remaining: 1m 42s 22: learn: 0.0295026 total: 4.88s remaining: 1m 41s 23: learn: 0.0288785 total: 5s remaining: 1m 39s 24: learn: 0.0282316 total: 5.16s remaining: 1m 38s 25: learn: 0.0277354 total: 5.27s remaining: 1m 36s 26: learn: 0.0272280 total: 5.41s remaining: 1m 34s 27: learn: 0.0263135 total: 5.51s remaining: 1m 32s 28: learn: 0.0257069 total: 5.6s remaining: 1m 30s 29: learn: 0.0252512 total: 5.69s remaining: 1m 29s 30: learn: 0.0249742 total: 5.78s remaining: 1m 27s 31: learn: 0.0245758 total: 5.86s remaining: 1m 25s 32: learn: 0.0243160 total: 5.95s remaining: 1m 24s 33: learn: 0.0237512 total: 6.04s remaining: 1m 22s 34: learn: 0.0233733 total: 6.13s remaining: 1m 21s 35: learn: 0.0230253 total: 6.21s remaining: 1m 20s 36: 
learn: 0.0226496 total: 6.31s remaining: 1m 18s 37: learn: 0.0222385 total: 6.39s remaining: 1m 17s 38: learn: 0.0220834 total: 6.47s remaining: 1m 16s 39: learn: 0.0215878 total: 6.56s remaining: 1m 15s 40: learn: 0.0214304 total: 6.65s remaining: 1m 14s 41: learn: 0.0209779 total: 6.74s remaining: 1m 13s 42: learn: 0.0208238 total: 6.81s remaining: 1m 12s 43: learn: 0.0203967 total: 6.9s remaining: 1m 11s 44: learn: 0.0201443 total: 6.99s remaining: 1m 10s 45: learn: 0.0194152 total: 7.08s remaining: 1m 9s 46: learn: 0.0191813 total: 7.17s remaining: 1m 9s 47: learn: 0.0190405 total: 7.25s remaining: 1m 8s 48: learn: 0.0186897 total: 7.33s remaining: 1m 7s 49: learn: 0.0185510 total: 7.4s remaining: 1m 6s 50: learn: 0.0183202 total: 7.5s remaining: 1m 6s 51: learn: 0.0181582 total: 7.58s remaining: 1m 5s 52: learn: 0.0179673 total: 7.66s remaining: 1m 4s 53: learn: 0.0178547 total: 7.75s remaining: 1m 3s 54: learn: 0.0177236 total: 7.82s remaining: 1m 3s 55: learn: 0.0174001 total: 7.9s remaining: 1m 2s 56: learn: 0.0173000 total: 7.98s remaining: 1m 2s 57: learn: 0.0171598 total: 8.07s remaining: 1m 1s 58: learn: 0.0169122 total: 8.15s remaining: 1m 59: learn: 0.0164178 total: 8.23s remaining: 1m 60: learn: 0.0162190 total: 8.33s remaining: 60s 61: learn: 0.0160141 total: 8.41s remaining: 59.4s 62: learn: 0.0158188 total: 8.48s remaining: 58.9s 63: learn: 0.0155145 total: 8.58s remaining: 58.5s 64: learn: 0.0153490 total: 8.66s remaining: 58s 65: learn: 0.0152620 total: 8.75s remaining: 57.5s 66: learn: 0.0150518 total: 8.83s remaining: 57.1s 67: learn: 0.0149081 total: 8.93s remaining: 56.7s 68: learn: 0.0148519 total: 9s remaining: 56.2s 69: learn: 0.0148042 total: 9.1s remaining: 55.9s 70: learn: 0.0146852 total: 9.34s remaining: 56.4s 71: learn: 0.0145193 total: 9.45s remaining: 56.2s 72: learn: 0.0144073 total: 9.62s remaining: 56.3s 73: learn: 0.0142477 total: 9.71s remaining: 55.9s 74: learn: 0.0141301 total: 9.79s remaining: 55.5s 75: learn: 0.0140284 total: 9.89s remaining: 55.2s 76: learn: 0.0139049 total: 10s remaining: 55s 77: learn: 0.0137139 total: 10.1s remaining: 54.8s 78: learn: 0.0136125 total: 10.3s remaining: 54.7s 79: learn: 0.0135317 total: 10.4s remaining: 54.8s 80: learn: 0.0134542 total: 10.7s remaining: 55.2s 81: learn: 0.0132939 total: 10.9s remaining: 55.6s 82: learn: 0.0131057 total: 11s remaining: 55.3s 83: learn: 0.0130658 total: 11.1s remaining: 54.9s 84: learn: 0.0129215 total: 11.2s remaining: 54.9s 85: learn: 0.0128431 total: 11.5s remaining: 55.3s 86: learn: 0.0126953 total: 11.8s remaining: 55.9s 87: learn: 0.0126368 total: 12.5s remaining: 58.5s 88: learn: 0.0125073 total: 13s remaining: 59.9s 89: learn: 0.0124461 total: 13.1s remaining: 59.8s 90: learn: 0.0123666 total: 13.4s remaining: 1m 91: learn: 0.0122259 total: 13.7s remaining: 1m 92: learn: 0.0121275 total: 13.9s remaining: 1m 1s 93: learn: 0.0119801 total: 14.6s remaining: 1m 3s 94: learn: 0.0119119 total: 15.2s remaining: 1m 4s 95: learn: 0.0117981 total: 15.5s remaining: 1m 5s 96: learn: 0.0117495 total: 15.6s remaining: 1m 4s 97: learn: 0.0117145 total: 15.7s remaining: 1m 4s 98: learn: 0.0116134 total: 15.8s remaining: 1m 4s 99: learn: 0.0114684 total: 15.9s remaining: 1m 3s 100: learn: 0.0113977 total: 16s remaining: 1m 3s 101: learn: 0.0113631 total: 16.1s remaining: 1m 2s 102: learn: 0.0112832 total: 16.1s remaining: 1m 2s 103: learn: 0.0111978 total: 16.2s remaining: 1m 1s 104: learn: 0.0111506 total: 16.3s remaining: 1m 1s 105: learn: 0.0110142 total: 16.4s remaining: 1m 106: 
learn: 0.0109307 total: 16.5s remaining: 1m 107: learn: 0.0108719 total: 16.5s remaining: 1m 108: learn: 0.0108119 total: 16.6s remaining: 59.5s 109: learn: 0.0107600 total: 16.7s remaining: 59.1s 110: learn: 0.0106635 total: 16.7s remaining: 58.6s 111: learn: 0.0106256 total: 16.8s remaining: 58.2s 112: learn: 0.0105434 total: 16.9s remaining: 57.8s 113: learn: 0.0104276 total: 16.9s remaining: 57.3s 114: learn: 0.0103443 total: 17s remaining: 56.9s 115: learn: 0.0102336 total: 17.1s remaining: 56.5s 116: learn: 0.0101821 total: 17.1s remaining: 56.1s 117: learn: 0.0100824 total: 17.2s remaining: 55.7s 118: learn: 0.0099833 total: 17.3s remaining: 55.3s 119: learn: 0.0099190 total: 17.3s remaining: 54.9s 120: learn: 0.0098497 total: 17.4s remaining: 54.6s 121: learn: 0.0098109 total: 17.5s remaining: 54.3s 122: learn: 0.0097864 total: 17.6s remaining: 53.9s 123: learn: 0.0096722 total: 17.7s remaining: 53.6s 124: learn: 0.0095950 total: 17.8s remaining: 53.3s 125: learn: 0.0095607 total: 17.9s remaining: 53.1s 126: learn: 0.0094704 total: 18.1s remaining: 53.2s 127: learn: 0.0093708 total: 18.4s remaining: 53.4s 128: learn: 0.0093061 total: 18.7s remaining: 53.7s 129: learn: 0.0092728 total: 18.8s remaining: 53.5s 130: learn: 0.0091948 total: 19s remaining: 53.6s 131: learn: 0.0091665 total: 19.1s remaining: 53.3s 132: learn: 0.0090909 total: 19.2s remaining: 52.9s 133: learn: 0.0090284 total: 19.3s remaining: 52.7s 134: learn: 0.0089823 total: 19.4s remaining: 52.4s 135: learn: 0.0088995 total: 19.6s remaining: 52.4s 136: learn: 0.0088187 total: 19.7s remaining: 52.1s 137: learn: 0.0087279 total: 19.7s remaining: 51.8s 138: learn: 0.0087097 total: 19.8s remaining: 51.5s 139: learn: 0.0086815 total: 19.9s remaining: 51.1s 140: learn: 0.0086378 total: 19.9s remaining: 50.8s 141: learn: 0.0085466 total: 20s remaining: 50.5s 142: learn: 0.0084766 total: 20.1s remaining: 50.2s 143: learn: 0.0083624 total: 20.2s remaining: 49.9s 144: learn: 0.0083310 total: 20.3s remaining: 49.6s 145: learn: 0.0082974 total: 20.4s remaining: 49.4s 146: learn: 0.0082525 total: 20.4s remaining: 49.1s 147: learn: 0.0082097 total: 20.5s remaining: 48.8s 148: learn: 0.0081548 total: 20.6s remaining: 48.6s 149: learn: 0.0080906 total: 20.8s remaining: 48.6s 150: learn: 0.0080406 total: 21.1s remaining: 48.8s 151: learn: 0.0079617 total: 21.4s remaining: 48.9s 152: learn: 0.0079045 total: 21.5s remaining: 48.7s 153: learn: 0.0078178 total: 21.7s remaining: 48.8s 154: learn: 0.0077267 total: 22s remaining: 49s 155: learn: 0.0076692 total: 22.1s remaining: 48.7s 156: learn: 0.0076284 total: 22.2s remaining: 48.5s 157: learn: 0.0075387 total: 22.3s remaining: 48.3s 158: learn: 0.0075126 total: 22.4s remaining: 48s 159: learn: 0.0074278 total: 22.5s remaining: 47.8s 160: learn: 0.0073633 total: 22.6s remaining: 47.5s 161: learn: 0.0073082 total: 22.6s remaining: 47.2s 162: learn: 0.0072656 total: 22.7s remaining: 46.9s 163: learn: 0.0072176 total: 22.8s remaining: 46.6s 164: learn: 0.0071814 total: 22.8s remaining: 46.4s 165: learn: 0.0071164 total: 22.9s remaining: 46.1s 166: learn: 0.0070867 total: 23s remaining: 45.8s 167: learn: 0.0070262 total: 23s remaining: 45.5s 168: learn: 0.0069919 total: 23.1s remaining: 45.3s 169: learn: 0.0069614 total: 23.2s remaining: 45s 170: learn: 0.0069231 total: 23.2s remaining: 44.7s 171: learn: 0.0068703 total: 23.3s remaining: 44.4s 172: learn: 0.0068445 total: 23.4s remaining: 44.2s 173: learn: 0.0068133 total: 23.4s remaining: 43.9s 174: learn: 0.0067364 total: 23.5s remaining: 
43.7s 175: learn: 0.0067051 total: 23.6s remaining: 43.4s 176: learn: 0.0066719 total: 23.6s remaining: 43.1s 177: learn: 0.0066172 total: 23.7s remaining: 42.9s 178: learn: 0.0065937 total: 23.8s remaining: 42.6s 179: learn: 0.0065515 total: 23.9s remaining: 42.4s 180: learn: 0.0065120 total: 24s remaining: 42.2s 181: learn: 0.0065063 total: 24.1s remaining: 42.1s 182: learn: 0.0064358 total: 24.3s remaining: 42.2s 183: learn: 0.0063866 total: 24.6s remaining: 42.2s 184: learn: 0.0063642 total: 24.9s remaining: 42.4s 185: learn: 0.0063350 total: 25s remaining: 42.3s 186: learn: 0.0063189 total: 25.8s remaining: 43.1s 187: learn: 0.0062777 total: 26s remaining: 43.1s 188: learn: 0.0062463 total: 26.1s remaining: 42.9s 189: learn: 0.0062197 total: 26.1s remaining: 42.6s 190: learn: 0.0061708 total: 26.2s remaining: 42.4s 191: learn: 0.0061323 total: 26.3s remaining: 42.1s 192: learn: 0.0060999 total: 26.3s remaining: 41.9s 193: learn: 0.0060489 total: 26.4s remaining: 41.6s 194: learn: 0.0060143 total: 26.5s remaining: 41.4s 195: learn: 0.0059986 total: 26.6s remaining: 41.2s 196: learn: 0.0059863 total: 26.6s remaining: 41s 197: learn: 0.0059701 total: 26.8s remaining: 40.9s 198: learn: 0.0059198 total: 27s remaining: 40.8s 199: learn: 0.0058903 total: 27.1s remaining: 40.7s 200: learn: 0.0058414 total: 27.2s remaining: 40.5s 201: learn: 0.0057974 total: 27.5s remaining: 40.5s 202: learn: 0.0057705 total: 27.5s remaining: 40.3s 203: learn: 0.0057134 total: 27.7s remaining: 40.2s 204: learn: 0.0056967 total: 27.8s remaining: 40s 205: learn: 0.0056557 total: 28s remaining: 40s 206: learn: 0.0056431 total: 28.1s remaining: 39.8s 207: learn: 0.0056377 total: 28.2s remaining: 39.6s 208: learn: 0.0056245 total: 28.3s remaining: 39.3s 209: learn: 0.0055928 total: 28.3s remaining: 39.1s 210: learn: 0.0055624 total: 28.4s remaining: 38.9s 211: learn: 0.0055502 total: 28.5s remaining: 38.7s 212: learn: 0.0055289 total: 28.5s remaining: 38.4s 213: learn: 0.0055090 total: 28.6s remaining: 38.2s 214: learn: 0.0054835 total: 28.7s remaining: 38s 215: learn: 0.0054564 total: 28.8s remaining: 37.8s 216: learn: 0.0054445 total: 29s remaining: 37.8s 217: learn: 0.0054335 total: 29.1s remaining: 37.7s 218: learn: 0.0054067 total: 29.3s remaining: 37.6s 219: learn: 0.0053794 total: 29.5s remaining: 37.5s 220: learn: 0.0053446 total: 29.6s remaining: 37.4s 221: learn: 0.0053215 total: 29.8s remaining: 37.3s 222: learn: 0.0052913 total: 29.9s remaining: 37.2s 223: learn: 0.0052813 total: 30.2s remaining: 37.2s 224: learn: 0.0052548 total: 30.3s remaining: 37.1s 225: learn: 0.0052252 total: 30.5s remaining: 37s 226: learn: 0.0052016 total: 30.6s remaining: 36.8s 227: learn: 0.0051678 total: 30.7s remaining: 36.6s 228: learn: 0.0051415 total: 30.8s remaining: 36.4s 229: learn: 0.0051205 total: 31s remaining: 36.4s 230: learn: 0.0051166 total: 31.1s remaining: 36.2s 231: learn: 0.0051115 total: 31.3s remaining: 36.1s 232: learn: 0.0051115 total: 31.6s remaining: 36.2s 233: learn: 0.0051115 total: 31.8s remaining: 36.2s 234: learn: 0.0051054 total: 32s remaining: 36.1s 235: learn: 0.0050850 total: 32.1s remaining: 35.9s 236: learn: 0.0050370 total: 32.1s remaining: 35.7s 237: learn: 0.0050036 total: 32.2s remaining: 35.5s 238: learn: 0.0049931 total: 32.3s remaining: 35.3s 239: learn: 0.0049547 total: 32.4s remaining: 35.1s 240: learn: 0.0049301 total: 32.4s remaining: 34.8s 241: learn: 0.0049167 total: 32.5s remaining: 34.6s 242: learn: 0.0049119 total: 32.6s remaining: 34.4s 243: learn: 0.0048839 total: 32.6s 
remaining: 34.2s 244: learn: 0.0048617 total: 32.7s remaining: 34s 245: learn: 0.0048617 total: 32.8s remaining: 33.8s 246: learn: 0.0048314 total: 32.8s remaining: 33.6s 247: learn: 0.0048288 total: 32.9s remaining: 33.5s 248: learn: 0.0048070 total: 33s remaining: 33.3s 249: learn: 0.0048070 total: 33.1s remaining: 33.1s 250: learn: 0.0047768 total: 33.2s remaining: 32.9s 251: learn: 0.0047420 total: 33.2s remaining: 32.7s 252: learn: 0.0047232 total: 33.3s remaining: 32.5s 253: learn: 0.0046990 total: 33.4s remaining: 32.4s 254: learn: 0.0046956 total: 33.6s remaining: 32.3s 255: learn: 0.0046931 total: 34.1s remaining: 32.5s 256: learn: 0.0046664 total: 34.2s remaining: 32.3s 257: learn: 0.0046664 total: 34.4s remaining: 32.3s 258: learn: 0.0046625 total: 34.7s remaining: 32.3s 259: learn: 0.0046430 total: 34.8s remaining: 32.1s 260: learn: 0.0046286 total: 34.8s remaining: 31.9s 261: learn: 0.0046032 total: 35s remaining: 31.8s 262: learn: 0.0045731 total: 35.2s remaining: 31.7s 263: learn: 0.0045652 total: 36.1s remaining: 32.2s 264: learn: 0.0045651 total: 36.7s remaining: 32.5s 265: learn: 0.0045651 total: 37.4s remaining: 32.9s 266: learn: 0.0045651 total: 38.3s remaining: 33.4s 267: learn: 0.0045651 total: 38.7s remaining: 33.5s 268: learn: 0.0045651 total: 38.8s remaining: 33.3s 269: learn: 0.0045651 total: 38.9s remaining: 33.1s 270: learn: 0.0045651 total: 38.9s remaining: 32.9s 271: learn: 0.0045651 total: 39s remaining: 32.7s 272: learn: 0.0045651 total: 39s remaining: 32.5s 273: learn: 0.0045651 total: 39.1s remaining: 32.3s 274: learn: 0.0045651 total: 39.2s remaining: 32.1s 275: learn: 0.0045647 total: 39.5s remaining: 32.1s 276: learn: 0.0045646 total: 39.6s remaining: 31.9s 277: learn: 0.0045646 total: 39.9s remaining: 31.9s 278: learn: 0.0045646 total: 40.2s remaining: 31.8s 279: learn: 0.0045646 total: 40.3s remaining: 31.7s 280: learn: 0.0045646 total: 40.5s remaining: 31.5s 281: learn: 0.0045646 total: 40.5s remaining: 31.3s 282: learn: 0.0045646 total: 40.6s remaining: 31.1s 283: learn: 0.0045645 total: 40.7s remaining: 30.9s 284: learn: 0.0045445 total: 40.8s remaining: 30.8s 285: learn: 0.0045314 total: 40.9s remaining: 30.6s 286: learn: 0.0045134 total: 40.9s remaining: 30.4s 287: learn: 0.0044928 total: 41s remaining: 30.2s 288: learn: 0.0044928 total: 41.1s remaining: 30s 289: learn: 0.0044873 total: 41.2s remaining: 29.8s 290: learn: 0.0044872 total: 41.2s remaining: 29.6s 291: learn: 0.0044872 total: 41.3s remaining: 29.4s 292: learn: 0.0044538 total: 41.4s remaining: 29.2s 293: learn: 0.0044288 total: 41.4s remaining: 29s 294: learn: 0.0043951 total: 41.5s remaining: 28.8s 295: learn: 0.0043736 total: 41.6s remaining: 28.6s 296: learn: 0.0043589 total: 41.6s remaining: 28.5s 297: learn: 0.0043389 total: 41.7s remaining: 28.3s 298: learn: 0.0043124 total: 41.8s remaining: 28.1s 299: learn: 0.0043124 total: 41.8s remaining: 27.9s 300: learn: 0.0042998 total: 41.9s remaining: 27.7s 301: learn: 0.0042906 total: 42s remaining: 27.5s 302: learn: 0.0042906 total: 42s remaining: 27.3s 303: learn: 0.0042850 total: 42.1s remaining: 27.1s 304: learn: 0.0042850 total: 42.1s remaining: 26.9s 305: learn: 0.0042850 total: 42.1s remaining: 26.7s 306: learn: 0.0042850 total: 42.2s remaining: 26.5s 307: learn: 0.0042850 total: 42.2s remaining: 26.3s 308: learn: 0.0042850 total: 42.3s remaining: 26.2s 309: learn: 0.0042850 total: 42.5s remaining: 26s 310: learn: 0.0042849 total: 42.6s remaining: 25.9s 311: learn: 0.0042849 total: 42.6s remaining: 25.7s 312: learn: 0.0042849 
total: 42.6s remaining: 25.5s 313: learn: 0.0042849 total: 42.7s remaining: 25.3s 314: learn: 0.0042849 total: 42.7s remaining: 25.1s 315: learn: 0.0042848 total: 42.7s remaining: 24.9s 316: learn: 0.0042848 total: 42.8s remaining: 24.7s 317: learn: 0.0042848 total: 42.8s remaining: 24.5s 318: learn: 0.0042848 total: 42.9s remaining: 24.3s 319: learn: 0.0042848 total: 42.9s remaining: 24.1s 320: learn: 0.0042848 total: 43s remaining: 24s 321: learn: 0.0042848 total: 43s remaining: 23.8s 322: learn: 0.0042848 total: 43s remaining: 23.6s 323: learn: 0.0042848 total: 43.1s remaining: 23.4s 324: learn: 0.0042848 total: 43.1s remaining: 23.2s 325: learn: 0.0042847 total: 43.2s remaining: 23s 326: learn: 0.0042847 total: 43.2s remaining: 22.9s 327: learn: 0.0042847 total: 43.3s remaining: 22.7s 328: learn: 0.0042847 total: 43.3s remaining: 22.5s 329: learn: 0.0042847 total: 43.5s remaining: 22.4s 330: learn: 0.0042847 total: 43.6s remaining: 22.3s 331: learn: 0.0042847 total: 43.7s remaining: 22.1s 332: learn: 0.0042847 total: 43.9s remaining: 22s 333: learn: 0.0042847 total: 43.9s remaining: 21.8s 334: learn: 0.0042847 total: 44.1s remaining: 21.7s 335: learn: 0.0042847 total: 44.2s remaining: 21.6s 336: learn: 0.0042847 total: 44.3s remaining: 21.4s 337: learn: 0.0042847 total: 44.4s remaining: 21.3s 338: learn: 0.0042691 total: 44.8s remaining: 21.3s 339: learn: 0.0042490 total: 45.2s remaining: 21.3s 340: learn: 0.0042490 total: 45.5s remaining: 21.2s 341: learn: 0.0042490 total: 45.8s remaining: 21.1s 342: learn: 0.0042490 total: 46.2s remaining: 21.1s 343: learn: 0.0042490 total: 46.3s remaining: 21s 344: learn: 0.0042490 total: 46.4s remaining: 20.8s 345: learn: 0.0042490 total: 46.6s remaining: 20.7s 346: learn: 0.0042490 total: 46.6s remaining: 20.6s 347: learn: 0.0042490 total: 46.7s remaining: 20.4s 348: learn: 0.0042490 total: 46.8s remaining: 20.3s 349: learn: 0.0042490 total: 46.9s remaining: 20.1s 350: learn: 0.0042490 total: 47s remaining: 20s 351: learn: 0.0042490 total: 47.1s remaining: 19.8s 352: learn: 0.0042490 total: 47.2s remaining: 19.7s 353: learn: 0.0042490 total: 47.4s remaining: 19.5s 354: learn: 0.0042490 total: 47.5s remaining: 19.4s 355: learn: 0.0042490 total: 47.6s remaining: 19.3s 356: learn: 0.0042489 total: 47.7s remaining: 19.1s 357: learn: 0.0042489 total: 47.8s remaining: 19s 358: learn: 0.0042489 total: 48s remaining: 18.9s 359: learn: 0.0042489 total: 48.1s remaining: 18.7s 360: learn: 0.0042489 total: 48.3s remaining: 18.6s 361: learn: 0.0042489 total: 48.4s remaining: 18.4s 362: learn: 0.0042489 total: 48.4s remaining: 18.3s 363: learn: 0.0042489 total: 48.5s remaining: 18.1s 364: learn: 0.0042489 total: 48.7s remaining: 18s 365: learn: 0.0042489 total: 49.2s remaining: 18s 366: learn: 0.0042489 total: 49.3s remaining: 17.9s 367: learn: 0.0042488 total: 49.5s remaining: 17.8s 368: learn: 0.0042488 total: 49.7s remaining: 17.6s 369: learn: 0.0042488 total: 49.7s remaining: 17.5s 370: learn: 0.0042488 total: 49.8s remaining: 17.3s 371: learn: 0.0042488 total: 50s remaining: 17.2s 372: learn: 0.0042488 total: 50s remaining: 17s 373: learn: 0.0042488 total: 50.1s remaining: 16.9s 374: learn: 0.0042488 total: 50.1s remaining: 16.7s 375: learn: 0.0042488 total: 50.2s remaining: 16.5s 376: learn: 0.0042488 total: 50.2s remaining: 16.4s 377: learn: 0.0042488 total: 50.3s remaining: 16.2s 378: learn: 0.0042488 total: 50.3s remaining: 16.1s 379: learn: 0.0042487 total: 50.4s remaining: 15.9s 380: learn: 0.0042488 total: 50.4s remaining: 15.7s 381: learn: 
0.0042488 total: 50.5s remaining: 15.6s 382: learn: 0.0042487 total: 50.5s remaining: 15.4s 383: learn: 0.0042487 total: 50.6s remaining: 15.3s 384: learn: 0.0042487 total: 50.6s remaining: 15.1s 385: learn: 0.0042487 total: 50.6s remaining: 15s 386: learn: 0.0042487 total: 50.7s remaining: 14.8s 387: learn: 0.0042487 total: 50.7s remaining: 14.6s 388: learn: 0.0042487 total: 50.7s remaining: 14.5s 389: learn: 0.0042487 total: 50.8s remaining: 14.3s 390: learn: 0.0042487 total: 50.8s remaining: 14.2s 391: learn: 0.0042487 total: 50.9s remaining: 14s 392: learn: 0.0042487 total: 50.9s remaining: 13.9s 393: learn: 0.0042228 total: 51s remaining: 13.7s 394: learn: 0.0042048 total: 51.1s remaining: 13.6s 395: learn: 0.0042048 total: 51.1s remaining: 13.4s 396: learn: 0.0042047 total: 51.1s remaining: 13.3s 397: learn: 0.0042047 total: 51.2s remaining: 13.1s 398: learn: 0.0042047 total: 51.2s remaining: 13s 399: learn: 0.0042047 total: 51.2s remaining: 12.8s 400: learn: 0.0042047 total: 51.3s remaining: 12.7s 401: learn: 0.0042047 total: 51.3s remaining: 12.5s 402: learn: 0.0041921 total: 51.4s remaining: 12.4s 403: learn: 0.0041921 total: 51.5s remaining: 12.2s 404: learn: 0.0041921 total: 51.5s remaining: 12.1s 405: learn: 0.0041921 total: 51.6s remaining: 11.9s 406: learn: 0.0041921 total: 51.6s remaining: 11.8s 407: learn: 0.0041921 total: 51.7s remaining: 11.7s 408: learn: 0.0041921 total: 51.7s remaining: 11.5s 409: learn: 0.0041921 total: 51.8s remaining: 11.4s 410: learn: 0.0041921 total: 51.8s remaining: 11.2s 411: learn: 0.0041921 total: 51.9s remaining: 11.1s 412: learn: 0.0041921 total: 51.9s remaining: 10.9s 413: learn: 0.0041920 total: 52s remaining: 10.8s 414: learn: 0.0041920 total: 52s remaining: 10.7s 415: learn: 0.0041920 total: 52.1s remaining: 10.5s 416: learn: 0.0041920 total: 52.1s remaining: 10.4s 417: learn: 0.0041920 total: 52.2s remaining: 10.2s 418: learn: 0.0041920 total: 52.2s remaining: 10.1s 419: learn: 0.0041920 total: 52.3s remaining: 9.97s 420: learn: 0.0041920 total: 52.4s remaining: 9.84s 421: learn: 0.0041920 total: 52.5s remaining: 9.7s 422: learn: 0.0041920 total: 52.6s remaining: 9.57s 423: learn: 0.0041920 total: 52.8s remaining: 9.46s 424: learn: 0.0041920 total: 53s remaining: 9.35s 425: learn: 0.0041920 total: 53.1s remaining: 9.22s 426: learn: 0.0041920 total: 53.2s remaining: 9.1s 427: learn: 0.0041920 total: 53.3s remaining: 8.96s 428: learn: 0.0041920 total: 53.3s remaining: 8.83s 429: learn: 0.0041920 total: 53.4s remaining: 8.69s 430: learn: 0.0041920 total: 53.4s remaining: 8.55s 431: learn: 0.0041920 total: 53.4s remaining: 8.41s 432: learn: 0.0041920 total: 53.5s remaining: 8.28s 433: learn: 0.0041920 total: 53.6s remaining: 8.14s 434: learn: 0.0041919 total: 53.6s remaining: 8.01s 435: learn: 0.0041919 total: 53.6s remaining: 7.87s 436: learn: 0.0041919 total: 53.7s remaining: 7.74s 437: learn: 0.0041919 total: 53.7s remaining: 7.6s 438: learn: 0.0041919 total: 53.8s remaining: 7.47s 439: learn: 0.0041919 total: 53.8s remaining: 7.33s 440: learn: 0.0041919 total: 53.8s remaining: 7.2s 441: learn: 0.0041919 total: 53.9s remaining: 7.07s 442: learn: 0.0041919 total: 53.9s remaining: 6.93s 443: learn: 0.0041919 total: 53.9s remaining: 6.8s 444: learn: 0.0041919 total: 54s remaining: 6.67s 445: learn: 0.0041919 total: 54s remaining: 6.54s 446: learn: 0.0041919 total: 54.1s remaining: 6.41s 447: learn: 0.0041919 total: 54.1s remaining: 6.28s 448: learn: 0.0041919 total: 54.1s remaining: 6.15s 449: learn: 0.0041919 total: 54.2s remaining: 6.02s 
450: learn: 0.0041919 total: 54.2s remaining: 5.89s 451: learn: 0.0041919 total: 54.2s remaining: 5.76s 452: learn: 0.0041919 total: 54.3s remaining: 5.63s 453: learn: 0.0041919 total: 54.3s remaining: 5.5s 454: learn: 0.0041919 total: 54.3s remaining: 5.37s 455: learn: 0.0041919 total: 54.4s remaining: 5.25s 456: learn: 0.0041919 total: 54.4s remaining: 5.12s 457: learn: 0.0041919 total: 54.5s remaining: 4.99s 458: learn: 0.0041919 total: 54.5s remaining: 4.87s 459: learn: 0.0041919 total: 54.5s remaining: 4.74s 460: learn: 0.0041919 total: 54.6s remaining: 4.62s 461: learn: 0.0041919 total: 54.6s remaining: 4.49s 462: learn: 0.0041919 total: 54.6s remaining: 4.37s 463: learn: 0.0041919 total: 54.7s remaining: 4.24s 464: learn: 0.0041919 total: 54.7s remaining: 4.12s 465: learn: 0.0041919 total: 54.8s remaining: 3.99s 466: learn: 0.0041919 total: 54.8s remaining: 3.87s 467: learn: 0.0041919 total: 54.8s remaining: 3.75s 468: learn: 0.0041919 total: 54.9s remaining: 3.63s 469: learn: 0.0041919 total: 54.9s remaining: 3.5s 470: learn: 0.0041919 total: 54.9s remaining: 3.38s 471: learn: 0.0041919 total: 55s remaining: 3.26s 472: learn: 0.0041919 total: 55s remaining: 3.14s 473: learn: 0.0041919 total: 55s remaining: 3.02s 474: learn: 0.0041919 total: 55.1s remaining: 2.9s 475: learn: 0.0041919 total: 55.1s remaining: 2.78s 476: learn: 0.0041919 total: 55.2s remaining: 2.66s 477: learn: 0.0041919 total: 55.2s remaining: 2.54s 478: learn: 0.0041919 total: 55.2s remaining: 2.42s 479: learn: 0.0041919 total: 55.3s remaining: 2.3s 480: learn: 0.0041919 total: 55.3s remaining: 2.19s 481: learn: 0.0041919 total: 55.4s remaining: 2.07s 482: learn: 0.0041919 total: 55.4s remaining: 1.95s 483: learn: 0.0041919 total: 55.4s remaining: 1.83s 484: learn: 0.0041918 total: 55.5s remaining: 1.72s 485: learn: 0.0041918 total: 55.5s remaining: 1.6s 486: learn: 0.0041918 total: 55.5s remaining: 1.48s 487: learn: 0.0041918 total: 55.6s remaining: 1.37s 488: learn: 0.0041918 total: 55.6s remaining: 1.25s 489: learn: 0.0041918 total: 55.7s remaining: 1.14s 490: learn: 0.0041918 total: 55.7s remaining: 1.02s 491: learn: 0.0041918 total: 55.8s remaining: 908ms 492: learn: 0.0041918 total: 55.9s remaining: 793ms 493: learn: 0.0041918 total: 56s remaining: 680ms 494: learn: 0.0041918 total: 56s remaining: 566ms 495: learn: 0.0041918 total: 56s remaining: 452ms 496: learn: 0.0041918 total: 56.1s remaining: 339ms 497: learn: 0.0041918 total: 56.2s remaining: 226ms 498: learn: 0.0041918 total: 56.3s remaining: 113ms 499: learn: 0.0041918 total: 56.3s remaining: 0us 0: learn: 0.4146511 total: 140ms remaining: 1m 9s 1: learn: 0.2670537 total: 274ms remaining: 1m 8s 2: learn: 0.1706554 total: 418ms remaining: 1m 9s 3: learn: 0.1262981 total: 687ms remaining: 1m 25s 4: learn: 0.0963916 total: 807ms remaining: 1m 19s 5: learn: 0.0762957 total: 915ms remaining: 1m 15s 6: learn: 0.0634311 total: 1.02s remaining: 1m 11s 7: learn: 0.0555778 total: 1.12s remaining: 1m 8s 8: learn: 0.0513606 total: 1.26s remaining: 1m 8s 9: learn: 0.0466234 total: 1.55s remaining: 1m 15s 10: learn: 0.0430544 total: 1.89s remaining: 1m 24s 11: learn: 0.0415717 total: 2.01s remaining: 1m 21s 12: learn: 0.0394707 total: 2.12s remaining: 1m 19s 13: learn: 0.0379522 total: 2.22s remaining: 1m 16s 14: learn: 0.0370596 total: 2.29s remaining: 1m 14s 15: learn: 0.0354812 total: 2.37s remaining: 1m 11s 16: learn: 0.0337175 total: 2.45s remaining: 1m 9s 17: learn: 0.0324110 total: 2.54s remaining: 1m 7s 18: learn: 0.0309966 total: 2.63s remaining: 1m 6s 19: 
learn: 0.0301186 total: 2.7s remaining: 1m 4s 20: learn: 0.0288737 total: 2.78s remaining: 1m 3s 21: learn: 0.0281964 total: 2.86s remaining: 1m 2s 22: learn: 0.0275099 total: 2.93s remaining: 1m 23: learn: 0.0269172 total: 3s remaining: 59.6s 24: learn: 0.0264462 total: 3.1s remaining: 59s 25: learn: 0.0259340 total: 3.22s remaining: 58.6s 26: learn: 0.0252994 total: 3.41s remaining: 59.8s 27: learn: 0.0246580 total: 4.41s remaining: 1m 14s 28: learn: 0.0241392 total: 5.07s remaining: 1m 22s 29: learn: 0.0234606 total: 5.17s remaining: 1m 20s 30: learn: 0.0230583 total: 5.25s remaining: 1m 19s 31: learn: 0.0226447 total: 5.45s remaining: 1m 19s 32: learn: 0.0223355 total: 5.54s remaining: 1m 18s 33: learn: 0.0219177 total: 5.73s remaining: 1m 18s 34: learn: 0.0216591 total: 6.02s remaining: 1m 20s 35: learn: 0.0213153 total: 6.24s remaining: 1m 20s 36: learn: 0.0210225 total: 6.58s remaining: 1m 22s 37: learn: 0.0207173 total: 6.88s remaining: 1m 23s 38: learn: 0.0205133 total: 7.06s remaining: 1m 23s 39: learn: 0.0202107 total: 7.2s remaining: 1m 22s 40: learn: 0.0198574 total: 7.3s remaining: 1m 21s 41: learn: 0.0194174 total: 7.39s remaining: 1m 20s 42: learn: 0.0192319 total: 7.5s remaining: 1m 19s 43: learn: 0.0190029 total: 7.6s remaining: 1m 18s 44: learn: 0.0186653 total: 7.7s remaining: 1m 17s 45: learn: 0.0183089 total: 7.83s remaining: 1m 17s 46: learn: 0.0182027 total: 7.92s remaining: 1m 16s 47: learn: 0.0179632 total: 8.01s remaining: 1m 15s 48: learn: 0.0175367 total: 8.12s remaining: 1m 14s 49: learn: 0.0173394 total: 8.22s remaining: 1m 14s 50: learn: 0.0171094 total: 8.32s remaining: 1m 13s 51: learn: 0.0169047 total: 8.44s remaining: 1m 12s 52: learn: 0.0167405 total: 8.52s remaining: 1m 11s 53: learn: 0.0165482 total: 8.61s remaining: 1m 11s 54: learn: 0.0164111 total: 8.74s remaining: 1m 10s 55: learn: 0.0162054 total: 8.85s remaining: 1m 10s 56: learn: 0.0160083 total: 8.94s remaining: 1m 9s 57: learn: 0.0159446 total: 9.05s remaining: 1m 8s 58: learn: 0.0156990 total: 9.14s remaining: 1m 8s 59: learn: 0.0152540 total: 9.22s remaining: 1m 7s 60: learn: 0.0150484 total: 9.32s remaining: 1m 7s 61: learn: 0.0149970 total: 9.39s remaining: 1m 6s 62: learn: 0.0147589 total: 9.47s remaining: 1m 5s 63: learn: 0.0146388 total: 9.63s remaining: 1m 5s 64: learn: 0.0145184 total: 9.8s remaining: 1m 5s 65: learn: 0.0143826 total: 9.98s remaining: 1m 5s 66: learn: 0.0142229 total: 10.1s remaining: 1m 5s 67: learn: 0.0141326 total: 10.3s remaining: 1m 5s 68: learn: 0.0140276 total: 10.4s remaining: 1m 4s 69: learn: 0.0138889 total: 10.5s remaining: 1m 4s 70: learn: 0.0137873 total: 10.6s remaining: 1m 4s 71: learn: 0.0136795 total: 10.7s remaining: 1m 3s 72: learn: 0.0135947 total: 10.8s remaining: 1m 3s 73: learn: 0.0134535 total: 10.9s remaining: 1m 3s 74: learn: 0.0133089 total: 11s remaining: 1m 2s 75: learn: 0.0132052 total: 11.1s remaining: 1m 2s 76: learn: 0.0131028 total: 11.2s remaining: 1m 1s 77: learn: 0.0130165 total: 11.3s remaining: 1m 1s 78: learn: 0.0129445 total: 11.4s remaining: 1m 79: learn: 0.0128264 total: 11.7s remaining: 1m 1s 80: learn: 0.0127645 total: 11.8s remaining: 1m 81: learn: 0.0125789 total: 11.9s remaining: 1m 82: learn: 0.0124012 total: 12s remaining: 1m 83: learn: 0.0122606 total: 12.1s remaining: 59.8s 84: learn: 0.0121192 total: 12.2s remaining: 59.4s 85: learn: 0.0119850 total: 12.3s remaining: 59.2s 86: learn: 0.0118496 total: 12.4s remaining: 58.8s 87: learn: 0.0118016 total: 12.5s remaining: 58.4s 88: learn: 0.0117075 total: 12.6s 
[CatBoost verbose training log truncated: 500 iterations, training loss ("learn") converging to ≈ 0.00305, total fit time ≈ 1 min 6 s]
# Majority vote over the stacked predictions (axis 0) for the final labels,
# and mean of the stacked class probabilities for the soft scores
final_predictions = mode(predictions, axis=0).mode.flatten()
final_pred_proba = np.mean(pred_prob, axis=0)

test_df['target'] = final_predictions
test_df[['proba', 'proba2']] = final_pred_proba
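To illustrate the voting step above: `scipy.stats.mode` taken along axis 0 returns, for each sample, the label predicted by the majority of the stacked models. The sketch below uses toy values (the array is invented purely for illustration):
import numpy as np
from scipy.stats import mode

# Toy illustration (invented values): 3 models (rows) x 5 samples (columns)
toy_preds = np.array([[0, 1, 1, 0, 1],
                      [0, 1, 0, 0, 1],
                      [1, 1, 1, 0, 0]])
mode(toy_preds, axis=0).mode.flatten()   # -> array([0, 1, 1, 0, 1])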
Model interpretation¶
import joblib
# load models
model1 = joblib.load('Model/model1.pkl')
model2 = joblib.load('Model/model2.pkl')
model3 = joblib.load('Model/model3.pkl')
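For reference, the three models are assumed to have been persisted earlier with `joblib.dump`; a minimal sketch of that step, with file names taken from the loading cell above and hypothetical variable names for the fitted estimators:
import os
import joblib

os.makedirs('Model', exist_ok=True)
# `xgb_model`, `lgbm_model`, `cat_model` are hypothetical names for the fitted estimators
joblib.dump(xgb_model, 'Model/model1.pkl')
joblib.dump(lgbm_model, 'Model/model2.pkl')
joblib.dump(cat_model, 'Model/model3.pkl')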
import shap
import matplotlib.pyplot as plt
# Sample of the data (to speed up the SHAP computation)
X_sample = shap.utils.sample(X_train, 5000)

# Helper returning the SHAP values of one model on a dataset
def get_shap_values(model, X):
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X)
    # For binary classification returned as a list, keep the positive class
    if isinstance(shap_values, list):
        return shap_values[1]
    return shap_values
# Compute the SHAP values for each model
shap_values1 = get_shap_values(model1, X_sample)
shap_values2 = get_shap_values(model2, X_sample)
shap_values3 = get_shap_values(model3, X_sample)
# Figure with 3 vertically stacked subplots
fig, axes = plt.subplots(3, 1, figsize=(14, 18))

# Generic helper drawing a beeswarm plot with a custom font size
def plot_shap_with_font(ax, shap_values, title, feature_name_size=10):
    plt.sca(ax)
    shap.summary_plot(shap_values, X_sample, show=False)
    ax.set_title(title, fontsize=14)
    # Shrink the font of the feature names (Y axis)
    yticklabels = ax.get_yticklabels()
    if yticklabels:
        for label in yticklabels:
            label.set_fontsize(feature_name_size)
# Draw the three beeswarm plots
plot_shap_with_font(axes[0], shap_values1, "Model 1 - XGBoost (SHAP)", feature_name_size=8)
plot_shap_with_font(axes[1], shap_values2, "Model 2 - LightGBM (SHAP)", feature_name_size=8)
plot_shap_with_font(axes[2], shap_values3, "Model 3 - CatBoost (SHAP)", feature_name_size=8)

# Adjust the layout
plt.tight_layout()
plt.subplots_adjust(hspace=0.5)
plt.show()
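To back the visual reading of the beeswarm plots with a single ranking, one can average the mean absolute SHAP value of each feature over the three models. This is a minimal sketch, not part of the original notebook: equal weighting of the models and the use of `X_sample.columns` as feature names are assumptions.
# Average per-feature mean(|SHAP|) over the three models (equal weights assumed)
mean_abs_shap = np.mean(
    [np.abs(sv).mean(axis=0) for sv in (shap_values1, shap_values2, shap_values3)],
    axis=0,
)
shap_ranking = (
    pd.DataFrame({'feature': X_sample.columns, 'mean_abs_shap': mean_abs_shap})
    .sort_values('mean_abs_shap', ascending=False)
    .reset_index(drop=True)
)
shap_ranking.head(10)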
- repayment_ratio is by far the most influential variable in all three models:
  - High values (pink dots on the right) strongly push the prediction towards default.
  - Low values (blue dots on the left) point to a higher probability of repayment.
- tbl_loan_id and lender_portion_to_be_repaid consistently rank 2nd and 3rd:
  - A specific loan identifier (tbl_loan_id) probably captures a recurring structure (amounts or duration) associated with risk.
  - The portion owed to the lender ("portion_to_be_repaid") also raises the risk when it is large.
- amount_to_repay_greater_than_average (gap relative to the mean) appears in the top 5 of every model:
  - When the amount to repay is far above the average, the risk of default rises sharply.
- XGBoost ranks amount_due_per_day slightly higher, while LightGBM and CatBoost give a little more weight to due_date_day and disbursement_date_day (temporal structure).