machine_learning
Hyperparameter Tuning with HyperOpt.ipynb
hayleyhell
2022. 11. 18. 19:08
Bayesian Optimization
- Grid Search
- Random Search
- Bayesian Optimization
- Within a fixed time budget, Random Search tends to perform well, but neither Grid nor Random Search can use the results of previous trials.
- Bayesian Optimization carries the results of earlier trials forward, so it reaches good hyperparameters in fewer iterations.
In [25]:
!pip install hyperopt
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: hyperopt in /usr/local/lib/python3.7/dist-packages (0.1.2)
In [26]:
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
import warnings
warnings.filterwarnings('ignore')
import pandas as pd
import numpy as np
cancer = load_breast_cancer()
cancer_df = pd.DataFrame(data=cancer.data, columns=cancer.feature_names)
cancer_df['target'] = cancer.target
In [27]:
y = cancer_df['target']
X = cancer_df.drop('target', axis=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=156)
# Split X_train, y_train again into training and validation sets
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, test_size=0.1, random_state=156)
In [28]:
from hyperopt import hp
# Define the hyperparameter search space:
# max_depth from 5 to 20 in steps of 1, min_child_weight from 1 to 2 in steps of 1,
# colsample_bytree uniformly in [0.5, 1], learning_rate uniformly in [0.01, 0.2]
xgb_search_space = {'max_depth': hp.quniform('max_depth', 5, 20, 1),              # integer-valued: hp.quniform()
                    'min_child_weight': hp.quniform('min_child_weight', 1, 2, 1),  # integer-valued: hp.quniform()
                    'learning_rate': hp.uniform('learning_rate', 0.01, 0.2),
                    'colsample_bytree': hp.uniform('colsample_bytree', 0.5, 1)}
In [29]:
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier
from hyperopt import STATUS_OK
# fmin() passes every value sampled from search_space as a float,
# so XGBClassifier's integer hyperparameters must be cast back to int.
# Higher accuracy is better, so multiply by -1 to turn maximization into minimization.
def objective_func(search_space):
    # n_estimators reduced to 100 to save time
    xgb_clf = XGBClassifier(n_estimators=100,
                            max_depth=int(search_space['max_depth']),
                            min_child_weight=int(search_space['min_child_weight']),
                            learning_rate=search_space['learning_rate'],
                            colsample_bytree=search_space['colsample_bytree'],
                            eval_metric='logloss')
    accuracy = cross_val_score(xgb_clf, X_train, y_train, scoring='accuracy', cv=3)
    # accuracy holds one score per fold (cv=3); return the negated mean as the loss
    return {'loss': -1 * np.mean(accuracy), 'status': STATUS_OK}
In [30]:
from hyperopt import fmin, tpe, Trials
trial_val = Trials()
best = fmin(fn=objective_func,
            space=xgb_search_space,
            algo=tpe.suggest,
            max_evals=50,  # maximum number of evaluations
            trials=trial_val)
print('best:', best)
100%|██████████| 50/50 [00:20<00:00, 2.46it/s, best loss: -0.9670616939700244]
best: {'colsample_bytree': 0.5757257559069644, 'learning_rate': 0.0990140383398648, 'max_depth': 16.0, 'min_child_weight': 2.0}
In [31]:
# Retrain with n_estimators increased to 1000, using the best hyperparameters found above
xgb_clf = XGBClassifier(n_estimators=1000,
                        learning_rate=round(best['learning_rate'], 5),
                        max_depth=int(best['max_depth']),
                        colsample_bytree=round(best['colsample_bytree'], 5),
                        min_child_weight=int(best['min_child_weight']))

evals = [(X_tr, y_tr), (X_val, y_val)]
xgb_clf.fit(X_tr, y_tr, early_stopping_rounds=50, eval_metric='logloss',
            eval_set=evals, verbose=True)  # verbose prints the eval metric at each round

pred = xgb_clf.predict(X_test)
pred_proba = xgb_clf.predict_proba(X_test)[:, 1]
[0]	validation_0-logloss:0.612745	validation_1-logloss:0.645531
Multiple eval metrics have been passed: 'validation_1-logloss' will be used for early stopping.
Will train until validation_1-logloss hasn't improved in 50 rounds.
...
[270]	validation_0-logloss:0.013604	validation_1-logloss:0.238689
Stopping. Best iteration:
[220]	validation_0-logloss:0.014605	validation_1-logloss:0.237793
In [32]:
# Classification evaluation metrics
from sklearn.metrics import accuracy_score, precision_score, recall_score, confusion_matrix
from sklearn.metrics import f1_score, roc_auc_score

def get_clf_eval(y_test, predict=None, predict_proba=None):
    confusion = confusion_matrix(y_test, predict)   # confusion matrix
    accuracy = accuracy_score(y_test, predict)      # accuracy
    precision = precision_score(y_test, predict)    # precision
    recall = recall_score(y_test, predict)          # recall
    f1 = f1_score(y_test, predict)                  # F1 score
    roc_auc = roc_auc_score(y_test, predict_proba)  # ROC-AUC score
    print('Confusion matrix')
    print(confusion)
    print('Accuracy: {0:.4f}, Precision: {1:.4f}, Recall: {2:.4f}, F1: {3:.4f}, AUC: {4:.4f}'.format(
        accuracy, precision, recall, f1, roc_auc))
In [33]:
get_clf_eval(y_test, pred, pred_proba)
Confusion matrix
[[34  3]
 [ 2 75]]
Accuracy: 0.9561, Precision: 0.9615, Recall: 0.9740, F1: 0.9677, AUC: 0.9944
In [34]:
from xgboost import plot_importance
import matplotlib.pyplot as plt
fig, ax = plt.subplots(1,1,figsize=(10,8))
plot_importance(xgb_clf, ax=ax, max_num_features=20, height=0.4)
Out[34]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fde746de250>
Training and Tuning an XGBoost Model
- Classification exercise: Kaggle Santander Customer Satisfaction
In [35]:
!pip install kaggle
from google.colab import files
files.upload()
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: kaggle in /usr/local/lib/python3.7/dist-packages (1.5.12)
Saving kaggle.json to kaggle (1).json
Out[35]:
{'kaggle.json': b'{"username":"minaahayley","key":"<redacted>"}'}
In [37]:
ls -1ha kaggle.json
kaggle.json
In [38]:
!mkdir -p ~/.kaggle               # create the .kaggle folder
!cp kaggle.json ~/.kaggle         # copy kaggle.json into it
!chmod 600 ~/.kaggle/kaggle.json  # restrict permissions to avoid the API warning
In [39]:
# Verify the setup: ls lists the files at a given path
%ls ~/.kaggle
competitions/ kaggle.json
In [40]:
# Copy the API command from Kaggle and download the dataset
!kaggle competitions download -c santander-customer-satisfaction
# Unzip the archive
!unzip santander-customer-satisfaction.zip
santander-customer-satisfaction.zip: Skipping, found more recently modified local copy (use --force to force download)
Archive:  santander-customer-satisfaction.zip
replace sample_submission.csv? [y]es, [n]o, [A]ll, [N]one, [r]ename: y
  inflating: sample_submission.csv
replace test.csv? [y]es, [n]o, [A]ll, [N]one, [r]ename: y
  inflating: test.csv
replace train.csv? [y]es, [n]o, [A]ll, [N]one, [r]ename: y
  inflating: train.csv
In [41]:
!mkdir -p ~/.kaggle/competitions/santander-customer-satisfaction  # create the competition folder
!cp sample_submission.csv ~/.kaggle/competitions/santander-customer-satisfaction  # copy the files into it
!cp test.csv ~/.kaggle/competitions/santander-customer-satisfaction
!cp train.csv ~/.kaggle/competitions/santander-customer-satisfaction
In [42]:
# Verify that the files are in place
%ls ~/.kaggle/competitions/santander-customer-satisfaction
sample_submission.csv test.csv train.csv
In [43]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import warnings
warnings.filterwarnings('ignore')
submission = pd.read_csv('~/.kaggle/competitions/santander-customer-satisfaction/sample_submission.csv')
test = pd.read_csv('~/.kaggle/competitions/santander-customer-satisfaction/test.csv')
train = pd.read_csv('~/.kaggle/competitions/santander-customer-satisfaction/train.csv', encoding='latin-1')
In [44]:
train.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 76020 entries, 0 to 76019
Columns: 371 entries, ID to TARGET
dtypes: float64(111), int64(260)
memory usage: 215.2 MB
In [45]:
train['TARGET'].value_counts()
Out[45]:
0    73012
1     3008
Name: TARGET, dtype: int64
In [46]:
train.describe()
Out[46]:
|  | ID | var3 | var15 | imp_ent_var16_ult1 | imp_op_var39_comer_ult1 | imp_op_var39_comer_ult3 | imp_op_var40_comer_ult1 | imp_op_var40_comer_ult3 | imp_op_var40_efect_ult1 | imp_op_var40_efect_ult3 | ... | saldo_medio_var33_hace2 | saldo_medio_var33_hace3 | saldo_medio_var33_ult1 | saldo_medio_var33_ult3 | saldo_medio_var44_hace2 | saldo_medio_var44_hace3 | saldo_medio_var44_ult1 | saldo_medio_var44_ult3 | var38 | TARGET |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | ... | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 76020.000000 | 7.602000e+04 | 76020.000000 |
| mean | 75964.050723 | -1523.199277 | 33.212865 | 86.208265 | 72.363067 | 119.529632 | 3.559130 | 6.472698 | 0.412946 | 0.567352 | ... | 7.935824 | 1.365146 | 12.215580 | 8.784074 | 31.505324 | 1.858575 | 76.026165 | 56.614351 | 1.172358e+05 | 0.039569 |
| std | 43781.947379 | 39033.462364 | 12.956486 | 1614.757313 | 339.315831 | 546.266294 | 93.155749 | 153.737066 | 30.604864 | 36.513513 | ... | 455.887218 | 113.959637 | 783.207399 | 538.439211 | 2013.125393 | 147.786584 | 4040.337842 | 2852.579397 | 1.826646e+05 | 0.194945 |
| min | 1.000000 | -999999.000000 | 5.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 5.163750e+03 | 0.000000 |
| 25% | 38104.750000 | 2.000000 | 23.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 6.787061e+04 | 0.000000 |
| 50% | 76043.000000 | 2.000000 | 28.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.064092e+05 | 0.000000 |
| 75% | 113748.750000 | 2.000000 | 40.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.187563e+05 | 0.000000 |
| max | 151838.000000 | 238.000000 | 105.000000 | 210000.000000 | 12888.030000 | 21024.810000 | 8237.820000 | 11073.570000 | 6600.000000 | 6600.000000 | ... | 50003.880000 | 20385.720000 | 138831.630000 | 91778.730000 | 438329.220000 | 24650.010000 | 681462.900000 | 397884.300000 | 2.203474e+07 | 1.000000 |
8 rows × 371 columns
In [47]:
# Preprocessing: replace the -999999 sentinel in var3 and drop the ID column
train['var3'].replace(-999999, 2, inplace=True)
train.drop('ID', axis=1, inplace=True)
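var3 uses -999999 as a missing-value sentinel, and leaving it in place would badly distort the feature's scale (it dominates the column's min and std in the describe() output above); replacing it with 2, the column's most frequent value, is a simple imputation. The same idea on a made-up series:

```python
import pandas as pd

# Hypothetical column that marks missing entries with a -999999 sentinel
s = pd.Series([2, 2, -999999, 5, 2])

# Replace the sentinel with the column's most frequent value (its mode),
# mirroring the var3 replacement above
s = s.replace(-999999, s.mode()[0])
print(s.tolist())  # [2, 2, 2, 5, 2]
```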
In [48]:
from sklearn.model_selection import train_test_split
X = train.iloc[:, :-1]
y = train.iloc[:, -1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# Split X_train, y_train again into training and validation sets,
# so the validation set can drive XGBoost's early stopping
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, test_size=0.3, random_state=0)
In [55]:
from xgboost import XGBClassifier
from sklearn.metrics import roc_auc_score
xgb_clf = XGBClassifier(n_estimators=500,
                        learning_rate=0.05,
                        random_state=156)

# Train with AUC as the evaluation metric and early stopping after 100 rounds without improvement
xgb_clf.fit(X_tr, y_tr, early_stopping_rounds=100, eval_metric='auc',
            eval_set=[(X_tr, y_tr), (X_val, y_val)])
[0]	validation_0-auc:0.803388	validation_1-auc:0.797852
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 100 rounds.
[1]	validation_0-auc:0.812619	validation_1-auc:0.796774
[2]	validation_0-auc:0.811648	validation_1-auc:0.798697
...
[470]	validation_0-auc:0.877893	validation_1-auc:0.833898
[471]	validation_0-auc:0.877909	validation_1-auc:0.833876
Stopping. Best iteration:
[371]	validation_0-auc:0.872055	validation_1-auc:0.834814
Out[55]:
XGBClassifier(learning_rate=0.05, n_estimators=500, random_state=156)
In [50]:
from sklearn.metrics import roc_auc_score

# ROC AUC on the held-out test set with the untuned model
y_test_pred_proba = xgb_clf.predict_proba(X_test)[:, 1]
xgb_roc_score = roc_auc_score(y_test, y_test_pred_proba)
print('roc auc: {0:.4f}'.format(xgb_roc_score))
roc auc: 0.8431
In [51]:
# Hyperparameter tuning: define the search space
from hyperopt import hp

xgb_search_space = {'max_depth': hp.quniform('max_depth', 5, 15, 1),
                    'min_child_weight': hp.quniform('min_child_weight', 1, 6, 1),
                    'colsample_bytree': hp.uniform('colsample_bytree', 0.5, 0.95),
                    'learning_rate': hp.uniform('learning_rate', 0.01, 0.2)}
In [52]:
from sklearn.model_selection import KFold
from sklearn.metrics import roc_auc_score
# Called by fmin() with values drawn from search_space; trains an XGBClassifier
# with cross validation and returns -1 * mean roc_auc
def objective_func(search_space):
    xgb_clf = XGBClassifier(n_estimators=100,
                            max_depth=int(search_space['max_depth']),
                            min_child_weight=int(search_space['min_child_weight']),
                            colsample_bytree=search_space['colsample_bytree'],
                            learning_rate=search_space['learning_rate'])
    # List collecting the roc_auc scores evaluated on each of the 3 folds
    roc_auc_list = []
    # 3-fold cross validation
    kf = KFold(n_splits=3)
    # Split X_train again into training and validation folds
    for tr_index, val_index in kf.split(X_train):
        # Build the fold's train/validation sets from the indices returned by kf.split(X_train)
        X_tr, y_tr = X_train.iloc[tr_index], y_train.iloc[tr_index]
        X_val, y_val = X_train.iloc[val_index], y_train.iloc[val_index]
        # Train the XGBClassifier on this fold with early stopping set to 30 rounds
        xgb_clf.fit(X_tr, y_tr, early_stopping_rounds=30, eval_metric='auc',
                    eval_set=[(X_tr, y_tr), (X_val, y_val)])
        # Predicted probability of class 1 -> roc auc; append it for averaging
        score = roc_auc_score(y_val, xgb_clf.predict_proba(X_val)[:, 1])
        roc_auc_list.append(score)
    # Return the mean roc_auc over the 3 folds; HyperOpt searches for the input that
    # minimizes the objective, so multiply by -1 before returning
    return -1 * np.mean(roc_auc_list)
In [53]:
from hyperopt import fmin, tpe, Trials

trials = Trials()

# Call fmin(): iterate up to max_evals times, then extract the input values that
# minimize the objective function
best = fmin(fn=objective_func,
            space=xgb_search_space,
            algo=tpe.suggest,
            max_evals=50,  # maximum number of evaluations
            trials=trials)
print('best:', best)
(The streaming output was long, so the last 5000 lines were truncated.)
[5]	validation_0-auc:0.844761	validation_1-auc:0.823674
[6]	validation_0-auc:0.846786	validation_1-auc:0.824798
...
[89]	validation_0-auc:0.90832	validation_1-auc:0.835698
[90]	validation_0-auc:0.908874	validation_1-auc:0.835564
Stopping. Best iteration:
[60]	validation_0-auc:0.899998	validation_1-auc:0.837954
[0] validation_0-auc:0.827244 validation_1-auc:0.811061
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.833479 validation_1-auc:0.817054
[2] validation_0-auc:0.837955 validation_1-auc:0.820786
[3] validation_0-auc:0.841131 validation_1-auc:0.822749
[4] validation_0-auc:0.843609 validation_1-auc:0.824993
[5] validation_0-auc:0.845896 validation_1-auc:0.824342
[6] validation_0-auc:0.849302 validation_1-auc:0.82588
[7] validation_0-auc:0.849304 validation_1-auc:0.825169
[8] validation_0-auc:0.850556 validation_1-auc:0.825211
[9] validation_0-auc:0.852338 validation_1-auc:0.826174
[10] validation_0-auc:0.853634 validation_1-auc:0.826038
[11] validation_0-auc:0.856791 validation_1-auc:0.827607
[12] validation_0-auc:0.858396 validation_1-auc:0.828683
[13] validation_0-auc:0.862641 validation_1-auc:0.830328
[14] validation_0-auc:0.864888 validation_1-auc:0.831358
[15] validation_0-auc:0.866708 validation_1-auc:0.831402
[16] validation_0-auc:0.867904 validation_1-auc:0.831799
[17] validation_0-auc:0.868732 validation_1-auc:0.832296
[18] validation_0-auc:0.869915 validation_1-auc:0.832445
[19] validation_0-auc:0.87133 validation_1-auc:0.832595
[20] validation_0-auc:0.872629 validation_1-auc:0.833495
[21] validation_0-auc:0.874911 validation_1-auc:0.833097
[22] validation_0-auc:0.875891 validation_1-auc:0.833838
[23] validation_0-auc:0.877883 validation_1-auc:0.834578
[24] validation_0-auc:0.879529 validation_1-auc:0.835194
[25] validation_0-auc:0.881009 validation_1-auc:0.835803
[26] validation_0-auc:0.882204 validation_1-auc:0.835961
[27] validation_0-auc:0.883149 validation_1-auc:0.835979
[28] validation_0-auc:0.88408 validation_1-auc:0.835402
[29] validation_0-auc:0.885038 validation_1-auc:0.834688
[30] validation_0-auc:0.886261 validation_1-auc:0.835438
[31] validation_0-auc:0.88673 validation_1-auc:0.835781
[32] validation_0-auc:0.887439 validation_1-auc:0.835575
[33] validation_0-auc:0.888385 validation_1-auc:0.835901
[34] validation_0-auc:0.889368 validation_1-auc:0.836145
[35] validation_0-auc:0.890384 validation_1-auc:0.836216
[36] validation_0-auc:0.890996 validation_1-auc:0.836154
[37] validation_0-auc:0.891588 validation_1-auc:0.836108
[38] validation_0-auc:0.891954 validation_1-auc:0.8364
[39] validation_0-auc:0.892335 validation_1-auc:0.836417
[40] validation_0-auc:0.892897 validation_1-auc:0.836359
[41] validation_0-auc:0.893471 validation_1-auc:0.83601
[42] validation_0-auc:0.893676 validation_1-auc:0.836067
[43] validation_0-auc:0.894178 validation_1-auc:0.836326
[44] validation_0-auc:0.894467 validation_1-auc:0.836763
[45] validation_0-auc:0.894779 validation_1-auc:0.836533
[46] validation_0-auc:0.894984 validation_1-auc:0.836707
[47] validation_0-auc:0.895842 validation_1-auc:0.836916
[48] validation_0-auc:0.896077 validation_1-auc:0.837355
[49] validation_0-auc:0.896569 validation_1-auc:0.837673
[50] validation_0-auc:0.896969 validation_1-auc:0.837501
[51] validation_0-auc:0.897245 validation_1-auc:0.837479
[52] validation_0-auc:0.89744 validation_1-auc:0.838035
[53] validation_0-auc:0.89776 validation_1-auc:0.837956
[54] validation_0-auc:0.898187 validation_1-auc:0.838158
[55] validation_0-auc:0.898696 validation_1-auc:0.838196
[56] validation_0-auc:0.899239 validation_1-auc:0.838065
[57] validation_0-auc:0.899824 validation_1-auc:0.838005
[58] validation_0-auc:0.900432 validation_1-auc:0.838033
[59] validation_0-auc:0.900849 validation_1-auc:0.837565
[60] validation_0-auc:0.900913 validation_1-auc:0.837535
[61] validation_0-auc:0.901195 validation_1-auc:0.837327
[62] validation_0-auc:0.901396 validation_1-auc:0.837326
[63] validation_0-auc:0.901437 validation_1-auc:0.837232
[64] validation_0-auc:0.901675 validation_1-auc:0.837265
[65] validation_0-auc:0.902622 validation_1-auc:0.837186
[66] validation_0-auc:0.902782 validation_1-auc:0.837188
[67] validation_0-auc:0.903688 validation_1-auc:0.83707
[68] validation_0-auc:0.903876 validation_1-auc:0.836932
[69] validation_0-auc:0.904128 validation_1-auc:0.836408
[70] validation_0-auc:0.904261 validation_1-auc:0.836316
[71] validation_0-auc:0.904834 validation_1-auc:0.836387
[72] validation_0-auc:0.905076 validation_1-auc:0.836451
[73] validation_0-auc:0.905202 validation_1-auc:0.836482
[74] validation_0-auc:0.905533 validation_1-auc:0.836426
[75] validation_0-auc:0.90568 validation_1-auc:0.836307
[76] validation_0-auc:0.906236 validation_1-auc:0.836157
[77] validation_0-auc:0.906424 validation_1-auc:0.835976
[78] validation_0-auc:0.906601 validation_1-auc:0.835989
[79] validation_0-auc:0.90682 validation_1-auc:0.836074
[80] validation_0-auc:0.906922 validation_1-auc:0.835785
[81] validation_0-auc:0.907395 validation_1-auc:0.835761
[82] validation_0-auc:0.907772 validation_1-auc:0.835613
[83] validation_0-auc:0.908316 validation_1-auc:0.835638
[84] validation_0-auc:0.908532 validation_1-auc:0.83555
[85] validation_0-auc:0.909119 validation_1-auc:0.835531
Stopping. Best iteration:
[55] validation_0-auc:0.898696 validation_1-auc:0.838196
[0] validation_0-auc:0.735123 validation_1-auc:0.701625
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.82453 validation_1-auc:0.795614
[2] validation_0-auc:0.836272 validation_1-auc:0.80545
[3] validation_0-auc:0.841375 validation_1-auc:0.810318
[4] validation_0-auc:0.844529 validation_1-auc:0.812044
[5] validation_0-auc:0.843094 validation_1-auc:0.808222
[6] validation_0-auc:0.84535 validation_1-auc:0.808288
[7] validation_0-auc:0.850093 validation_1-auc:0.811925
[8] validation_0-auc:0.853986 validation_1-auc:0.815246
[9] validation_0-auc:0.856091 validation_1-auc:0.814765
[10] validation_0-auc:0.858829 validation_1-auc:0.816087
[11] validation_0-auc:0.863489 validation_1-auc:0.818724
[12] validation_0-auc:0.868038 validation_1-auc:0.820138
[13] validation_0-auc:0.870004 validation_1-auc:0.819354
[14] validation_0-auc:0.87009 validation_1-auc:0.817825
[15] validation_0-auc:0.873392 validation_1-auc:0.820986
[16] validation_0-auc:0.876423 validation_1-auc:0.821574
[17] validation_0-auc:0.878205 validation_1-auc:0.823969
[18] validation_0-auc:0.880712 validation_1-auc:0.825
[19] validation_0-auc:0.880556 validation_1-auc:0.822947
[20] validation_0-auc:0.883266 validation_1-auc:0.824155
[21] validation_0-auc:0.885019 validation_1-auc:0.823439
[22] validation_0-auc:0.884711 validation_1-auc:0.822155
[23] validation_0-auc:0.886694 validation_1-auc:0.824618
[24] validation_0-auc:0.887844 validation_1-auc:0.825787
[25] validation_0-auc:0.890074 validation_1-auc:0.824169
[26] validation_0-auc:0.891569 validation_1-auc:0.82339
[27] validation_0-auc:0.892917 validation_1-auc:0.82554
[28] validation_0-auc:0.895152 validation_1-auc:0.827165
[29] validation_0-auc:0.895692 validation_1-auc:0.82537
[30] validation_0-auc:0.896758 validation_1-auc:0.826913
[31] validation_0-auc:0.897887 validation_1-auc:0.827931
[32] validation_0-auc:0.898873 validation_1-auc:0.828115
[33] validation_0-auc:0.899659 validation_1-auc:0.827954
[34] validation_0-auc:0.899788 validation_1-auc:0.827501
[35] validation_0-auc:0.902046 validation_1-auc:0.828494
[36] validation_0-auc:0.904444 validation_1-auc:0.82969
[37] validation_0-auc:0.904612 validation_1-auc:0.829229
[38] validation_0-auc:0.9056 validation_1-auc:0.827899
[39] validation_0-auc:0.906908 validation_1-auc:0.827446
[40] validation_0-auc:0.907452 validation_1-auc:0.826793
[41] validation_0-auc:0.907646 validation_1-auc:0.825607
[42] validation_0-auc:0.909369 validation_1-auc:0.826443
[43] validation_0-auc:0.909582 validation_1-auc:0.825197
[44] validation_0-auc:0.911459 validation_1-auc:0.826444
[45] validation_0-auc:0.911971 validation_1-auc:0.826189
[46] validation_0-auc:0.914034 validation_1-auc:0.827109
[47] validation_0-auc:0.915356 validation_1-auc:0.828735
[48] validation_0-auc:0.916317 validation_1-auc:0.827775
[49] validation_0-auc:0.917391 validation_1-auc:0.827323
[50] validation_0-auc:0.917449 validation_1-auc:0.826477
[51] validation_0-auc:0.918834 validation_1-auc:0.827441
[52] validation_0-auc:0.919873 validation_1-auc:0.828689
[53] validation_0-auc:0.920581 validation_1-auc:0.829042
[54] validation_0-auc:0.921505 validation_1-auc:0.829455
[55] validation_0-auc:0.922113 validation_1-auc:0.829587
[56] validation_0-auc:0.92272 validation_1-auc:0.83033
[57] validation_0-auc:0.923349 validation_1-auc:0.830657
[58] validation_0-auc:0.923503 validation_1-auc:0.830353
[59] validation_0-auc:0.923696 validation_1-auc:0.83041
[60] validation_0-auc:0.924238 validation_1-auc:0.830555
[61] validation_0-auc:0.924435 validation_1-auc:0.83099
[62] validation_0-auc:0.924497 validation_1-auc:0.831242
[63] validation_0-auc:0.925051 validation_1-auc:0.831767
[64] validation_0-auc:0.92531 validation_1-auc:0.831817
[65] validation_0-auc:0.925783 validation_1-auc:0.832053
[66] validation_0-auc:0.926255 validation_1-auc:0.831896
[67] validation_0-auc:0.926693 validation_1-auc:0.832054
[68] validation_0-auc:0.926992 validation_1-auc:0.831983
[69] validation_0-auc:0.927018 validation_1-auc:0.831906
[70] validation_0-auc:0.927097 validation_1-auc:0.831975
[71] validation_0-auc:0.927276 validation_1-auc:0.831917
[72] validation_0-auc:0.927535 validation_1-auc:0.831677
[73] validation_0-auc:0.927676 validation_1-auc:0.83141
[74] validation_0-auc:0.928156 validation_1-auc:0.831453
[75] validation_0-auc:0.928332 validation_1-auc:0.831681
[76] validation_0-auc:0.928484 validation_1-auc:0.831611
[77] validation_0-auc:0.928661 validation_1-auc:0.831571
[78] validation_0-auc:0.929031 validation_1-auc:0.831178
[79] validation_0-auc:0.929154 validation_1-auc:0.830911
[80] validation_0-auc:0.92929 validation_1-auc:0.83082
[81] validation_0-auc:0.929647 validation_1-auc:0.830547
[82] validation_0-auc:0.929713 validation_1-auc:0.830445
[83] validation_0-auc:0.929893 validation_1-auc:0.830351
[84] validation_0-auc:0.930278 validation_1-auc:0.830393
[85] validation_0-auc:0.930359 validation_1-auc:0.830243
[86] validation_0-auc:0.930464 validation_1-auc:0.830102
[87] validation_0-auc:0.930553 validation_1-auc:0.830107
[88] validation_0-auc:0.930784 validation_1-auc:0.830289
[89] validation_0-auc:0.931201 validation_1-auc:0.830294
[90] validation_0-auc:0.931468 validation_1-auc:0.830064
[91] validation_0-auc:0.931594 validation_1-auc:0.829799
[92] validation_0-auc:0.931761 validation_1-auc:0.829806
[93] validation_0-auc:0.931832 validation_1-auc:0.8296
[94] validation_0-auc:0.931909 validation_1-auc:0.829501
[95] validation_0-auc:0.932216 validation_1-auc:0.829369
[96] validation_0-auc:0.932457 validation_1-auc:0.82943
[97] validation_0-auc:0.932634 validation_1-auc:0.829406
Stopping. Best iteration:
[67] validation_0-auc:0.926693 validation_1-auc:0.832054
[0] validation_0-auc:0.72562 validation_1-auc:0.711465
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.822999 validation_1-auc:0.801637
[2] validation_0-auc:0.829877 validation_1-auc:0.809608
[3] validation_0-auc:0.841917 validation_1-auc:0.819012
[4] validation_0-auc:0.849037 validation_1-auc:0.824772
[5] validation_0-auc:0.846038 validation_1-auc:0.818046
[6] validation_0-auc:0.849397 validation_1-auc:0.816351
[7] validation_0-auc:0.854445 validation_1-auc:0.820403
[8] validation_0-auc:0.860167 validation_1-auc:0.823322
[9] validation_0-auc:0.863289 validation_1-auc:0.822184
[10] validation_0-auc:0.869248 validation_1-auc:0.826534
[11] validation_0-auc:0.873342 validation_1-auc:0.829333
[12] validation_0-auc:0.875992 validation_1-auc:0.830452
[13] validation_0-auc:0.876516 validation_1-auc:0.827987
[14] validation_0-auc:0.876883 validation_1-auc:0.825679
[15] validation_0-auc:0.878048 validation_1-auc:0.826207
[16] validation_0-auc:0.880954 validation_1-auc:0.82684
[17] validation_0-auc:0.883938 validation_1-auc:0.827935
[18] validation_0-auc:0.886659 validation_1-auc:0.82884
[19] validation_0-auc:0.886323 validation_1-auc:0.827247
[20] validation_0-auc:0.888107 validation_1-auc:0.828335
[21] validation_0-auc:0.888445 validation_1-auc:0.827847
[22] validation_0-auc:0.888603 validation_1-auc:0.826498
[23] validation_0-auc:0.890659 validation_1-auc:0.828347
[24] validation_0-auc:0.891371 validation_1-auc:0.828835
[25] validation_0-auc:0.892732 validation_1-auc:0.827073
[26] validation_0-auc:0.894084 validation_1-auc:0.826168
[27] validation_0-auc:0.894911 validation_1-auc:0.827474
[28] validation_0-auc:0.89726 validation_1-auc:0.828546
[29] validation_0-auc:0.898078 validation_1-auc:0.827612
[30] validation_0-auc:0.899624 validation_1-auc:0.828491
[31] validation_0-auc:0.900309 validation_1-auc:0.828834
[32] validation_0-auc:0.901712 validation_1-auc:0.828596
[33] validation_0-auc:0.902735 validation_1-auc:0.827447
[34] validation_0-auc:0.903173 validation_1-auc:0.826543
[35] validation_0-auc:0.90616 validation_1-auc:0.826591
[36] validation_0-auc:0.907783 validation_1-auc:0.827108
[37] validation_0-auc:0.908508 validation_1-auc:0.827003
[38] validation_0-auc:0.909137 validation_1-auc:0.826247
[39] validation_0-auc:0.909877 validation_1-auc:0.825332
[40] validation_0-auc:0.910736 validation_1-auc:0.824815
[41] validation_0-auc:0.911593 validation_1-auc:0.823663
[42] validation_0-auc:0.913374 validation_1-auc:0.825257
Stopping. Best iteration:
[12] validation_0-auc:0.875992 validation_1-auc:0.830452
[0] validation_0-auc:0.744457 validation_1-auc:0.742295
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.830575 validation_1-auc:0.811487
[2] validation_0-auc:0.838942 validation_1-auc:0.81779
[3] validation_0-auc:0.843652 validation_1-auc:0.819411
[4] validation_0-auc:0.845548 validation_1-auc:0.82076
[5] validation_0-auc:0.849804 validation_1-auc:0.820052
[6] validation_0-auc:0.850113 validation_1-auc:0.81911
[7] validation_0-auc:0.856777 validation_1-auc:0.8213
[8] validation_0-auc:0.860528 validation_1-auc:0.821438
[9] validation_0-auc:0.861781 validation_1-auc:0.82019
[10] validation_0-auc:0.865905 validation_1-auc:0.822184
[11] validation_0-auc:0.869638 validation_1-auc:0.822502
[12] validation_0-auc:0.872602 validation_1-auc:0.82545
[13] validation_0-auc:0.875025 validation_1-auc:0.825362
[14] validation_0-auc:0.87529 validation_1-auc:0.824577
[15] validation_0-auc:0.877272 validation_1-auc:0.826105
[16] validation_0-auc:0.880088 validation_1-auc:0.827441
[17] validation_0-auc:0.883186 validation_1-auc:0.828028
[18] validation_0-auc:0.885643 validation_1-auc:0.829052
[19] validation_0-auc:0.885391 validation_1-auc:0.828104
[20] validation_0-auc:0.886988 validation_1-auc:0.828679
[21] validation_0-auc:0.88811 validation_1-auc:0.827733
[22] validation_0-auc:0.88821 validation_1-auc:0.826234
[23] validation_0-auc:0.89059 validation_1-auc:0.826397
[24] validation_0-auc:0.891439 validation_1-auc:0.827193
[25] validation_0-auc:0.89241 validation_1-auc:0.825984
[26] validation_0-auc:0.893779 validation_1-auc:0.824858
[27] validation_0-auc:0.894889 validation_1-auc:0.825657
[28] validation_0-auc:0.89682 validation_1-auc:0.826467
[29] validation_0-auc:0.898293 validation_1-auc:0.826243
[30] validation_0-auc:0.899792 validation_1-auc:0.827073
[31] validation_0-auc:0.900735 validation_1-auc:0.828171
[32] validation_0-auc:0.902347 validation_1-auc:0.826954
[33] validation_0-auc:0.903473 validation_1-auc:0.826014
[34] validation_0-auc:0.904094 validation_1-auc:0.825194
[35] validation_0-auc:0.906495 validation_1-auc:0.826854
[36] validation_0-auc:0.908309 validation_1-auc:0.827339
[37] validation_0-auc:0.90879 validation_1-auc:0.826502
[38] validation_0-auc:0.909989 validation_1-auc:0.82589
[39] validation_0-auc:0.91065 validation_1-auc:0.825267
[40] validation_0-auc:0.911214 validation_1-auc:0.825082
[41] validation_0-auc:0.911864 validation_1-auc:0.824367
[42] validation_0-auc:0.913491 validation_1-auc:0.825435
[43] validation_0-auc:0.913847 validation_1-auc:0.824289
[44] validation_0-auc:0.915323 validation_1-auc:0.825778
[45] validation_0-auc:0.915592 validation_1-auc:0.825329
[46] validation_0-auc:0.917346 validation_1-auc:0.826029
[47] validation_0-auc:0.9187 validation_1-auc:0.827302
[48] validation_0-auc:0.919293 validation_1-auc:0.826964
Stopping. Best iteration:
[18] validation_0-auc:0.885643 validation_1-auc:0.829052
[0] validation_0-auc:0.819189 validation_1-auc:0.793865
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.832924 validation_1-auc:0.808825
[2] validation_0-auc:0.837789 validation_1-auc:0.811917
[3] validation_0-auc:0.838753 validation_1-auc:0.811248
[4] validation_0-auc:0.84053 validation_1-auc:0.81187
[5] validation_0-auc:0.843976 validation_1-auc:0.813096
[6] validation_0-auc:0.845535 validation_1-auc:0.814107
[7] validation_0-auc:0.847304 validation_1-auc:0.813298
[8] validation_0-auc:0.850965 validation_1-auc:0.816172
[9] validation_0-auc:0.852314 validation_1-auc:0.817105
[10] validation_0-auc:0.854759 validation_1-auc:0.8188
[11] validation_0-auc:0.85564 validation_1-auc:0.820234
[12] validation_0-auc:0.858399 validation_1-auc:0.82124
[13] validation_0-auc:0.859604 validation_1-auc:0.824121
[14] validation_0-auc:0.861894 validation_1-auc:0.8263
[15] validation_0-auc:0.863468 validation_1-auc:0.827057
[16] validation_0-auc:0.865557 validation_1-auc:0.827845
[17] validation_0-auc:0.866987 validation_1-auc:0.82816
[18] validation_0-auc:0.868554 validation_1-auc:0.828349
[19] validation_0-auc:0.869814 validation_1-auc:0.829147
[20] validation_0-auc:0.87084 validation_1-auc:0.829368
[21] validation_0-auc:0.872276 validation_1-auc:0.830942
[22] validation_0-auc:0.873335 validation_1-auc:0.831149
[23] validation_0-auc:0.874306 validation_1-auc:0.832258
[24] validation_0-auc:0.875536 validation_1-auc:0.831303
[25] validation_0-auc:0.876488 validation_1-auc:0.831599
[26] validation_0-auc:0.877678 validation_1-auc:0.831988
[27] validation_0-auc:0.879157 validation_1-auc:0.831979
[28] validation_0-auc:0.880255 validation_1-auc:0.83275
[29] validation_0-auc:0.881749 validation_1-auc:0.832676
[30] validation_0-auc:0.882941 validation_1-auc:0.832487
[31] validation_0-auc:0.883815 validation_1-auc:0.832443
[32] validation_0-auc:0.884465 validation_1-auc:0.832122
[33] validation_0-auc:0.884918 validation_1-auc:0.832326
[34] validation_0-auc:0.885202 validation_1-auc:0.832198
[35] validation_0-auc:0.88566 validation_1-auc:0.832364
[36] validation_0-auc:0.885971 validation_1-auc:0.832496
[37] validation_0-auc:0.886406 validation_1-auc:0.832694
[38] validation_0-auc:0.887087 validation_1-auc:0.832929
[39] validation_0-auc:0.887389 validation_1-auc:0.832736
[40] validation_0-auc:0.88793 validation_1-auc:0.83258
[41] validation_0-auc:0.888785 validation_1-auc:0.83336
[42] validation_0-auc:0.889101 validation_1-auc:0.83317
[43] validation_0-auc:0.889952 validation_1-auc:0.833001
[44] validation_0-auc:0.890337 validation_1-auc:0.833102
[45] validation_0-auc:0.890482 validation_1-auc:0.832705
[46] validation_0-auc:0.89134 validation_1-auc:0.832712
[47] validation_0-auc:0.891413 validation_1-auc:0.832872
[48] validation_0-auc:0.891521 validation_1-auc:0.833002
[49] validation_0-auc:0.892206 validation_1-auc:0.832805
[50] validation_0-auc:0.892751 validation_1-auc:0.832806
[51] validation_0-auc:0.89324 validation_1-auc:0.832745
[52] validation_0-auc:0.893428 validation_1-auc:0.832558
[53] validation_0-auc:0.893885 validation_1-auc:0.832533
[54] validation_0-auc:0.894051 validation_1-auc:0.832532
[55] validation_0-auc:0.894337 validation_1-auc:0.832715
[56] validation_0-auc:0.894964 validation_1-auc:0.832104
[57] validation_0-auc:0.895443 validation_1-auc:0.831774
[58] validation_0-auc:0.895928 validation_1-auc:0.831332
[59] validation_0-auc:0.896005 validation_1-auc:0.831106
[60] validation_0-auc:0.896085 validation_1-auc:0.831005
[61] validation_0-auc:0.896716 validation_1-auc:0.83082
[62] validation_0-auc:0.897514 validation_1-auc:0.830175
[63] validation_0-auc:0.897609 validation_1-auc:0.83011
[64] validation_0-auc:0.897786 validation_1-auc:0.829621
[65] validation_0-auc:0.897989 validation_1-auc:0.829583
[66] validation_0-auc:0.898126 validation_1-auc:0.829532
[67] validation_0-auc:0.898506 validation_1-auc:0.829269
[68] validation_0-auc:0.898779 validation_1-auc:0.829183
[69] validation_0-auc:0.899047 validation_1-auc:0.82891
[70] validation_0-auc:0.89942 validation_1-auc:0.828873
[71] validation_0-auc:0.899994 validation_1-auc:0.828703
Stopping. Best iteration:
[41] validation_0-auc:0.888785 validation_1-auc:0.83336
[0] validation_0-auc:0.817693 validation_1-auc:0.804624
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.829691 validation_1-auc:0.81884
[2] validation_0-auc:0.836452 validation_1-auc:0.82381
[3] validation_0-auc:0.838278 validation_1-auc:0.824704
[4] validation_0-auc:0.840853 validation_1-auc:0.825516
[5] validation_0-auc:0.843714 validation_1-auc:0.825694
[6] validation_0-auc:0.844768 validation_1-auc:0.826193
[7] validation_0-auc:0.849383 validation_1-auc:0.829014
[8] validation_0-auc:0.850459 validation_1-auc:0.829909
[9] validation_0-auc:0.855088 validation_1-auc:0.829964
[10] validation_0-auc:0.857148 validation_1-auc:0.82977
[11] validation_0-auc:0.859962 validation_1-auc:0.830713
[12] validation_0-auc:0.862014 validation_1-auc:0.831201
[13] validation_0-auc:0.863786 validation_1-auc:0.830813
[14] validation_0-auc:0.865286 validation_1-auc:0.831725
[15] validation_0-auc:0.866239 validation_1-auc:0.832582
[16] validation_0-auc:0.86767 validation_1-auc:0.832533
[17] validation_0-auc:0.869164 validation_1-auc:0.833527
[18] validation_0-auc:0.870181 validation_1-auc:0.833384
[19] validation_0-auc:0.871738 validation_1-auc:0.833594
[20] validation_0-auc:0.873353 validation_1-auc:0.832858
[21] validation_0-auc:0.874792 validation_1-auc:0.832547
[22] validation_0-auc:0.875576 validation_1-auc:0.833703
[23] validation_0-auc:0.876631 validation_1-auc:0.834056
[24] validation_0-auc:0.877495 validation_1-auc:0.834363
[25] validation_0-auc:0.878978 validation_1-auc:0.835312
[26] validation_0-auc:0.879626 validation_1-auc:0.83536
[27] validation_0-auc:0.880433 validation_1-auc:0.835172
[28] validation_0-auc:0.881127 validation_1-auc:0.835262
[29] validation_0-auc:0.882978 validation_1-auc:0.834654
[30] validation_0-auc:0.883564 validation_1-auc:0.834999
[31] validation_0-auc:0.884268 validation_1-auc:0.835424
[32] validation_0-auc:0.884668 validation_1-auc:0.835693
[33] validation_0-auc:0.88536 validation_1-auc:0.835759
[34] validation_0-auc:0.885828 validation_1-auc:0.836336
[35] validation_0-auc:0.886385 validation_1-auc:0.836132
[36] validation_0-auc:0.88673 validation_1-auc:0.835902
[37] validation_0-auc:0.887109 validation_1-auc:0.835796
[38] validation_0-auc:0.887585 validation_1-auc:0.83604
[39] validation_0-auc:0.887881 validation_1-auc:0.835899
[40] validation_0-auc:0.88857 validation_1-auc:0.836218
[41] validation_0-auc:0.889091 validation_1-auc:0.836167
[42] validation_0-auc:0.889354 validation_1-auc:0.836124
[43] validation_0-auc:0.890646 validation_1-auc:0.836346
[44] validation_0-auc:0.89126 validation_1-auc:0.836305
[45] validation_0-auc:0.892267 validation_1-auc:0.836604
[46] validation_0-auc:0.893538 validation_1-auc:0.836587
[47] validation_0-auc:0.893793 validation_1-auc:0.836525
[48] validation_0-auc:0.893901 validation_1-auc:0.836579
[49] validation_0-auc:0.894954 validation_1-auc:0.836602
[50] validation_0-auc:0.895379 validation_1-auc:0.83661
[51] validation_0-auc:0.895976 validation_1-auc:0.836667
[52] validation_0-auc:0.896812 validation_1-auc:0.836573
[53] validation_0-auc:0.89696 validation_1-auc:0.836317
[54] validation_0-auc:0.897259 validation_1-auc:0.836305
[55] validation_0-auc:0.897475 validation_1-auc:0.836082
[56] validation_0-auc:0.897752 validation_1-auc:0.835912
[57] validation_0-auc:0.898092 validation_1-auc:0.835802
[58] validation_0-auc:0.898213 validation_1-auc:0.835673
[59] validation_0-auc:0.898404 validation_1-auc:0.835682
[60] validation_0-auc:0.898503 validation_1-auc:0.83567
[61] validation_0-auc:0.898989 validation_1-auc:0.835689
[62] validation_0-auc:0.899109 validation_1-auc:0.835621
[63] validation_0-auc:0.899736 validation_1-auc:0.835447
[64] validation_0-auc:0.899994 validation_1-auc:0.835328
[65] validation_0-auc:0.900099 validation_1-auc:0.835346
[66] validation_0-auc:0.900191 validation_1-auc:0.83528
[67] validation_0-auc:0.900368 validation_1-auc:0.835146
[68] validation_0-auc:0.900474 validation_1-auc:0.835142
[69] validation_0-auc:0.900702 validation_1-auc:0.835328
[70] validation_0-auc:0.900959 validation_1-auc:0.835398
[71] validation_0-auc:0.901225 validation_1-auc:0.835614
[72] validation_0-auc:0.901683 validation_1-auc:0.835341
[73] validation_0-auc:0.901799 validation_1-auc:0.835294
[74] validation_0-auc:0.901906 validation_1-auc:0.835241
[75] validation_0-auc:0.902366 validation_1-auc:0.835023
[76] validation_0-auc:0.902718 validation_1-auc:0.834816
[77] validation_0-auc:0.902812 validation_1-auc:0.834748
[78] validation_0-auc:0.90322 validation_1-auc:0.834717
[79] validation_0-auc:0.903898 validation_1-auc:0.83453
[80] validation_0-auc:0.904052 validation_1-auc:0.834384
[81] validation_0-auc:0.904148 validation_1-auc:0.834438
Stopping. Best iteration:
[51] validation_0-auc:0.895976 validation_1-auc:0.836667
[0] validation_0-auc:0.825908 validation_1-auc:0.810741
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.832362 validation_1-auc:0.816373
[2] validation_0-auc:0.836986 validation_1-auc:0.817859
[3] validation_0-auc:0.840528 validation_1-auc:0.820146
[4] validation_0-auc:0.844641 validation_1-auc:0.825198
[5] validation_0-auc:0.845835 validation_1-auc:0.826339
[6] validation_0-auc:0.846311 validation_1-auc:0.826481
[7] validation_0-auc:0.850981 validation_1-auc:0.828733
[8] validation_0-auc:0.851941 validation_1-auc:0.828147
[9] validation_0-auc:0.852977 validation_1-auc:0.828607
[10] validation_0-auc:0.855552 validation_1-auc:0.829512
[11] validation_0-auc:0.859165 validation_1-auc:0.829046
[12] validation_0-auc:0.861049 validation_1-auc:0.829579
[13] validation_0-auc:0.864198 validation_1-auc:0.831408
[14] validation_0-auc:0.865405 validation_1-auc:0.831986
[15] validation_0-auc:0.866403 validation_1-auc:0.832344
[16] validation_0-auc:0.867046 validation_1-auc:0.833164
[17] validation_0-auc:0.867954 validation_1-auc:0.833405
[18] validation_0-auc:0.86915 validation_1-auc:0.833822
[19] validation_0-auc:0.870803 validation_1-auc:0.83369
[20] validation_0-auc:0.87246 validation_1-auc:0.833753
[21] validation_0-auc:0.873865 validation_1-auc:0.833287
[22] validation_0-auc:0.875524 validation_1-auc:0.834087
[23] validation_0-auc:0.876561 validation_1-auc:0.835469
[24] validation_0-auc:0.877423 validation_1-auc:0.83554
[25] validation_0-auc:0.87871 validation_1-auc:0.835997
[26] validation_0-auc:0.879459 validation_1-auc:0.836084
[27] validation_0-auc:0.88064 validation_1-auc:0.836356
[28] validation_0-auc:0.881118 validation_1-auc:0.836536
[29] validation_0-auc:0.882715 validation_1-auc:0.835944
[30] validation_0-auc:0.883589 validation_1-auc:0.836712
[31] validation_0-auc:0.88437 validation_1-auc:0.83699
[32] validation_0-auc:0.884928 validation_1-auc:0.837323
[33] validation_0-auc:0.885357 validation_1-auc:0.837332
[34] validation_0-auc:0.886058 validation_1-auc:0.837579
[35] validation_0-auc:0.886968 validation_1-auc:0.837429
[36] validation_0-auc:0.887714 validation_1-auc:0.837528
[37] validation_0-auc:0.888299 validation_1-auc:0.837892
[38] validation_0-auc:0.888495 validation_1-auc:0.837959
[39] validation_0-auc:0.889106 validation_1-auc:0.837856
[40] validation_0-auc:0.889557 validation_1-auc:0.83809
[41] validation_0-auc:0.89033 validation_1-auc:0.838486
[42] validation_0-auc:0.891095 validation_1-auc:0.838564
[43] validation_0-auc:0.89133 validation_1-auc:0.838863
[44] validation_0-auc:0.892696 validation_1-auc:0.839018
[45] validation_0-auc:0.893381 validation_1-auc:0.839012
[46] validation_0-auc:0.894229 validation_1-auc:0.83904
[47] validation_0-auc:0.894791 validation_1-auc:0.838765
[48] validation_0-auc:0.895064 validation_1-auc:0.8389
[49] validation_0-auc:0.895427 validation_1-auc:0.838804
[50] validation_0-auc:0.896155 validation_1-auc:0.838647
[51] validation_0-auc:0.896823 validation_1-auc:0.838678
[52] validation_0-auc:0.897061 validation_1-auc:0.838649
[53] validation_0-auc:0.897457 validation_1-auc:0.838734
[54] validation_0-auc:0.898161 validation_1-auc:0.838587
[55] validation_0-auc:0.898455 validation_1-auc:0.83862
[56] validation_0-auc:0.898773 validation_1-auc:0.8386
[57] validation_0-auc:0.899413 validation_1-auc:0.838558
[58] validation_0-auc:0.899754 validation_1-auc:0.83854
[59] validation_0-auc:0.900122 validation_1-auc:0.838454
[60] validation_0-auc:0.90035 validation_1-auc:0.838547
[61] validation_0-auc:0.900733 validation_1-auc:0.838268
[62] validation_0-auc:0.901486 validation_1-auc:0.838081
[63] validation_0-auc:0.90169 validation_1-auc:0.837852
[64] validation_0-auc:0.901854 validation_1-auc:0.837661
[65] validation_0-auc:0.903085 validation_1-auc:0.837761
[66] validation_0-auc:0.90336 validation_1-auc:0.837883
[67] validation_0-auc:0.904243 validation_1-auc:0.837811
[68] validation_0-auc:0.904714 validation_1-auc:0.837651
[69] validation_0-auc:0.904906 validation_1-auc:0.837506
[70] validation_0-auc:0.9054 validation_1-auc:0.837545
[71] validation_0-auc:0.90619 validation_1-auc:0.837199
[72] validation_0-auc:0.906327 validation_1-auc:0.837195
[73] validation_0-auc:0.906472 validation_1-auc:0.837101
[74] validation_0-auc:0.906695 validation_1-auc:0.836873
[75] validation_0-auc:0.907333 validation_1-auc:0.836993
[76] validation_0-auc:0.907628 validation_1-auc:0.836866
Stopping. Best iteration:
[46] validation_0-auc:0.894229 validation_1-auc:0.83904
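The logs above all follow the same early-stopping rule: XGBoost keeps training until the monitored metric (here `validation_1-auc`) has failed to improve for 30 consecutive rounds, then reports the best round. A minimal, illustrative sketch of that rule in plain Python (the `early_stop` helper and its toy AUC values are hypothetical, not from the notebook):

```python
# Sketch of the patience-based early-stopping rule seen in the logs above:
# stop once the monitored metric has not improved for `patience` rounds,
# and keep the best round ("Stopping. Best iteration: [i]").
def early_stop(val_aucs, patience=30):
    best_iter, best_auc, wait = 0, float('-inf'), 0
    for i, auc in enumerate(val_aucs):
        if auc > best_auc:                 # improvement: record it, reset patience
            best_iter, best_auc, wait = i, auc, 0
        else:                              # no improvement: burn one patience round
            wait += 1
            if wait >= patience:
                break
    return best_iter, best_auc

# Toy example (hypothetical values): AUC peaks at round 2, then plateaus,
# so training stops 30 rounds later and round 2 is reported as best.
scores = [0.80, 0.82, 0.84] + [0.83] * 30
print(early_stop(scores, patience=30))  # -> (2, 0.84)
```

This is why the verbose logs can be safely truncated to their "Best iteration" lines: everything after the best round only serves to confirm that the metric stopped improving.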
[0] validation_0-auc:0.815876 validation_1-auc:0.791026
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (output truncated) ...
[99] validation_0-auc:0.873169 validation_1-auc:0.834411
[0] validation_0-auc:0.814517 validation_1-auc:0.806113
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (output truncated) ...
[99] validation_0-auc:0.873832 validation_1-auc:0.837867
[0] validation_0-auc:0.820564 validation_1-auc:0.812281
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (output truncated) ...
[99] validation_0-auc:0.873451 validation_1-auc:0.840684
[0] validation_0-auc:0.826116 validation_1-auc:0.798941
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (output truncated) ...
[99] validation_0-auc:0.894611 validation_1-auc:0.834417
[0] validation_0-auc:0.820551 validation_1-auc:0.806943
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (output truncated) ...
[99] validation_0-auc:0.89666 validation_1-auc:0.83677
[0] validation_0-auc:0.828946 validation_1-auc:0.81023
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (output truncated) ...
[99] validation_0-auc:0.898217 validation_1-auc:0.835894
[0] validation_0-auc:0.735649 validation_1-auc:0.700029
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations [1]–[94] omitted) ...
Stopping. Best iteration:
[64] validation_0-auc:0.917023 validation_1-auc:0.83222
[0] validation_0-auc:0.740203 validation_1-auc:0.720568
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations [1]–[98] omitted) ...
[99] validation_0-auc:0.929596 validation_1-auc:0.837123
[0] validation_0-auc:0.723726 validation_1-auc:0.728885
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations [1]–[98] omitted) ...
[99] validation_0-auc:0.929964 validation_1-auc:0.833968
[0] validation_0-auc:0.734013 validation_1-auc:0.699917
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations [1]–[98] omitted) ...
[99] validation_0-auc:0.86887 validation_1-auc:0.830274
[0] validation_0-auc:0.736393 validation_1-auc:0.721274
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations [1]–[61] omitted) ...
Stopping. Best iteration:
[31] validation_0-auc:0.854052 validation_1-auc:0.827135
[0] validation_0-auc:0.723172 validation_1-auc:0.72646
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations [1]–[92] omitted) ...
[93] validation_0-auc:0.86879 validation_1-auc:0.828657
[94] validation_0-auc:0.869002 validation_1-auc:0.828981
[95] validation_0-auc:0.86939 validation_1-auc:0.829528
[96] validation_0-auc:0.869917 validation_1-auc:0.829692
[97] validation_0-auc:0.870228 validation_1-auc:0.830126
[98] validation_0-auc:0.870017 validation_1-auc:0.829752
[99] validation_0-auc:0.870439 validation_1-auc:0.830403
[0] validation_0-auc:0.821072 validation_1-auc:0.798407
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.828467 validation_1-auc:0.80402
...
[73] validation_0-auc:0.890699 validation_1-auc:0.83275
Stopping. Best iteration:
[43] validation_0-auc:0.878163 validation_1-auc:0.833699
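Each block of log lines above is one early-stopping run: `validation_0-auc` is the AUC on the training fold, `validation_1-auc` the AUC on the held-out fold, and training halts once the validation metric has not improved for 30 rounds, reporting the best iteration. As a minimal, hedged sketch of the same mechanism, the snippet below uses scikit-learn's `GradientBoostingClassifier` as a stand-in for XGBoost (the `n_estimators=400` cap and `n_iter_no_change=30` patience are illustrative assumptions, not the notebook's actual settings):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.1, random_state=156)

# validation_fraction carves an internal validation set out of the training data;
# n_iter_no_change=30 mirrors the "hasn't improved in 30 rounds" rule in the logs.
gbm = GradientBoostingClassifier(
    n_estimators=400,
    n_iter_no_change=30,
    validation_fraction=0.1,
    random_state=156,
)
gbm.fit(X_tr, y_tr)

# n_estimators_ is the number of boosting stages actually fit;
# early stopping typically halts well before the 400-stage cap.
print(gbm.n_estimators_, gbm.score(X_val, y_val))
```

In XGBoost's own API the equivalent knobs are `eval_set=[(X_tr, y_tr), (X_val, y_val)]`, `eval_metric='auc'`, and `early_stopping_rounds=30` passed to `fit()`, which is what produces the two-column log shown here.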
[0] validation_0-auc:0.81749 validation_1-auc:0.805596
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.825504 validation_1-auc:0.810327
...
[77] validation_0-auc:0.892556 validation_1-auc:0.836683
Stopping. Best iteration:
[47] validation_0-auc:0.881945 validation_1-auc:0.83708
[0] validation_0-auc:0.826237 validation_1-auc:0.809786
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.829787 validation_1-auc:0.814664
...
[99] validation_0-auc:0.898391 validation_1-auc:0.837614
[0] validation_0-auc:0.736603 validation_1-auc:0.700164
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.826268 validation_1-auc:0.793071
...
[67] validation_0-auc:0.932132 validation_1-auc:0.828711
Stopping. Best iteration:
[37] validation_0-auc:0.903754 validation_1-auc:0.82922
[0] validation_0-auc:0.740647 validation_1-auc:0.720913
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.826544 validation_1-auc:0.801956
...
[74] validation_0-auc:0.938053 validation_1-auc:0.83056
Stopping. Best iteration:
[44] validation_0-auc:0.916067 validation_1-auc:0.83308
[0] validation_0-auc:0.760365 validation_1-auc:0.75236
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.833385 validation_1-auc:0.809459
...
[99] validation_0-auc:0.944835 validation_1-auc:0.832455
[0] validation_0-auc:0.83027 validation_1-auc:0.799616
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
[1] validation_0-auc:0.841155 validation_1-auc:0.804615
...
[38] validation_0-auc:0.926676 validation_1-auc:0.827869
[39] validation_0-auc:0.927631 validation_1-auc:0.827229
[40] validation_0-auc:0.928409 validation_1-auc:0.827497
[41] validation_0-auc:0.929474 validation_1-auc:0.827375
[42] validation_0-auc:0.92984 validation_1-auc:0.827616
[43] validation_0-auc:0.930282 validation_1-auc:0.827805
[44] validation_0-auc:0.930624 validation_1-auc:0.827272
[45] validation_0-auc:0.931175 validation_1-auc:0.827168
[46] validation_0-auc:0.93172 validation_1-auc:0.826702
[47] validation_0-auc:0.932362 validation_1-auc:0.826331
[48] validation_0-auc:0.932834 validation_1-auc:0.826293
[49] validation_0-auc:0.933276 validation_1-auc:0.826665
[50] validation_0-auc:0.93389 validation_1-auc:0.826701
[51] validation_0-auc:0.934417 validation_1-auc:0.826647
[52] validation_0-auc:0.934879 validation_1-auc:0.826548
[53] validation_0-auc:0.935264 validation_1-auc:0.826101
[54] validation_0-auc:0.93585 validation_1-auc:0.82563
[55] validation_0-auc:0.936309 validation_1-auc:0.825584
[56] validation_0-auc:0.93654 validation_1-auc:0.82554
[57] validation_0-auc:0.937018 validation_1-auc:0.825093
[58] validation_0-auc:0.937948 validation_1-auc:0.824666
Stopping. Best iteration:
[28] validation_0-auc:0.91618 validation_1-auc:0.829515
[0] validation_0-auc:0.82391 validation_1-auc:0.805276
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations 1-46 omitted)
Stopping. Best iteration:
[16] validation_0-auc:0.894463 validation_1-auc:0.833177
[0] validation_0-auc:0.830664 validation_1-auc:0.81045
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 30 rounds.
... (iterations 1-43 omitted)
Stopping. Best iteration:
[13] validation_0-auc:0.890193 validation_1-auc:0.829744
100%|██████████| 50/50 [2:49:45<00:00, 203.72s/it, best loss: -0.8378303965626338]
best: {'colsample_bytree': 0.6915358325358198, 'learning_rate': 0.06117968759867564, 'max_depth': 5.0, 'min_child_weight': 6.0}
In [56]:
# Retrain and predict with the best hyperparameters found, after raising n_estimators to 500
xgb_clf = XGBClassifier(n_estimators=500,
                        learning_rate=round(best['learning_rate'], 5),
                        max_depth=int(best['max_depth']),
                        min_child_weight=int(best['min_child_weight']),
                        colsample_bytree=round(best['colsample_bytree'], 5))

# Train with AUC as the evaluation metric and early stopping set to 100 rounds
xgb_clf.fit(X_tr, y_tr, early_stopping_rounds=100,
            eval_metric='auc', eval_set=[(X_tr, y_tr), (X_val, y_val)])

xgb_roc_score = roc_auc_score(y_test, xgb_clf.predict_proba(X_test)[:, 1])
print('roc auc: {0:.4f}'.format(xgb_roc_score))
[0] validation_0-auc:0.815894 validation_1-auc:0.802468
Multiple eval metrics have been passed: 'validation_1-auc' will be used for early stopping.
Will train until validation_1-auc hasn't improved in 100 rounds.
... (iterations 1-271 omitted)
Stopping. Best iteration:
[171] validation_0-auc:0.884918 validation_1-auc:0.837894
roc auc: 0.8454
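The final score comes from `roc_auc_score` applied to the positive-class probabilities on the held-out test set. A self-contained sketch of that evaluation pattern, using toy data and a stand-in `LogisticRegression` rather than the tuned XGBoost model:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy binary classification data (not the Santander set)
X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# roc_auc_score needs the positive-class probability,
# i.e. column 1 of predict_proba's output, not hard predict() labels
score = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print('roc auc: {0:.4f}'.format(score))
```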
In [58]:
from xgboost import plot_importance
import matplotlib.pyplot as plt
fig, ax = plt.subplots(1, 1, figsize=(10,8))
plot_importance(xgb_clf, ax=ax, max_num_features=20, height=0.4)
Out[58]:
<matplotlib.axes._subplots.AxesSubplot at 0x7fde745f9310>
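The same top-20 ranking that `plot_importance` draws can also be read off numerically from the fitted model's `feature_importances_` attribute. A sketch of that approach on the breast-cancer data, using sklearn's `GradientBoostingClassifier` as a stand-in so the snippet runs without XGBoost (a fitted `XGBClassifier` such as `xgb_clf` exposes the same attribute):

```python
import pandas as pd
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

cancer = load_breast_cancer()
clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
clf.fit(cancer.data, cancer.target)

# Map importances to feature names and take the top 20, descending
importances = pd.Series(clf.feature_importances_, index=cancer.feature_names)
top20 = importances.sort_values(ascending=False)[:20]
print(top20)
```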
LightGBM model training and hyperparameter tuning¶
- Classification practice: Kaggle Santander Customer Satisfaction prediction
In [60]:
from lightgbm import LGBMClassifier

lgbm_clf = LGBMClassifier(n_estimators=500)

# To give LightGBM's early stopping a validation set,
# X_train, y_train were split again into training (X_tr) and validation (X_val) sets
eval_set = [(X_tr, y_tr), (X_val, y_val)]
lgbm_clf.fit(X_tr, y_tr, early_stopping_rounds=100, eval_metric='auc', eval_set=eval_set)

# Evaluate the model on the held-out test set
lgbm_roc_score = roc_auc_score(y_test, lgbm_clf.predict_proba(X_test)[:, 1])
print('roc auc: {0:.4f}'.format(lgbm_roc_score))
[1] training's auc: 0.82625 training's binary_logloss: 0.15523 valid_1's auc: 0.809814 valid_1's binary_logloss: 0.15774
Training until validation scores don't improve for 100 rounds.
... (per-iteration output omitted)
[80] training's auc: 0.929257 training's binary_logloss: 0.10353 valid_1's auc: 0.827635
valid_1's binary_logloss: 0.136582 [81] training's auc: 0.9296 training's binary_logloss: 0.103401 valid_1's auc: 0.82765 valid_1's binary_logloss: 0.136607 [82] training's auc: 0.929861 training's binary_logloss: 0.103208 valid_1's auc: 0.827799 valid_1's binary_logloss: 0.13662 [83] training's auc: 0.930341 training's binary_logloss: 0.102946 valid_1's auc: 0.827819 valid_1's binary_logloss: 0.136616 [84] training's auc: 0.93069 training's binary_logloss: 0.102746 valid_1's auc: 0.827718 valid_1's binary_logloss: 0.13667 [85] training's auc: 0.930919 training's binary_logloss: 0.102569 valid_1's auc: 0.827399 valid_1's binary_logloss: 0.136735 [86] training's auc: 0.931353 training's binary_logloss: 0.102328 valid_1's auc: 0.827153 valid_1's binary_logloss: 0.136782 [87] training's auc: 0.931539 training's binary_logloss: 0.102163 valid_1's auc: 0.827326 valid_1's binary_logloss: 0.136779 [88] training's auc: 0.931778 training's binary_logloss: 0.102002 valid_1's auc: 0.827304 valid_1's binary_logloss: 0.136815 [89] training's auc: 0.932172 training's binary_logloss: 0.101798 valid_1's auc: 0.827257 valid_1's binary_logloss: 0.136826 [90] training's auc: 0.932293 training's binary_logloss: 0.101681 valid_1's auc: 0.827444 valid_1's binary_logloss: 0.136805 [91] training's auc: 0.932758 training's binary_logloss: 0.101442 valid_1's auc: 0.827623 valid_1's binary_logloss: 0.136811 [92] training's auc: 0.933181 training's binary_logloss: 0.101315 valid_1's auc: 0.827609 valid_1's binary_logloss: 0.13684 [93] training's auc: 0.933336 training's binary_logloss: 0.101188 valid_1's auc: 0.827331 valid_1's binary_logloss: 0.136928 [94] training's auc: 0.933463 training's binary_logloss: 0.10109 valid_1's auc: 0.827063 valid_1's binary_logloss: 0.136991 [95] training's auc: 0.933686 training's binary_logloss: 0.100922 valid_1's auc: 0.826949 valid_1's binary_logloss: 0.137009 [96] training's auc: 0.933964 training's binary_logloss: 0.100752 valid_1's auc: 0.826753 
valid_1's binary_logloss: 0.13706 [97] training's auc: 0.934074 training's binary_logloss: 0.10064 valid_1's auc: 0.826527 valid_1's binary_logloss: 0.137101 [98] training's auc: 0.934571 training's binary_logloss: 0.100382 valid_1's auc: 0.826545 valid_1's binary_logloss: 0.137089 [99] training's auc: 0.934753 training's binary_logloss: 0.100252 valid_1's auc: 0.826425 valid_1's binary_logloss: 0.137116 [100] training's auc: 0.935004 training's binary_logloss: 0.100091 valid_1's auc: 0.826284 valid_1's binary_logloss: 0.137148 [101] training's auc: 0.935318 training's binary_logloss: 0.0999751 valid_1's auc: 0.82633 valid_1's binary_logloss: 0.137165 [102] training's auc: 0.935438 training's binary_logloss: 0.0998615 valid_1's auc: 0.826078 valid_1's binary_logloss: 0.137206 [103] training's auc: 0.935534 training's binary_logloss: 0.0997669 valid_1's auc: 0.826106 valid_1's binary_logloss: 0.137235 [104] training's auc: 0.935853 training's binary_logloss: 0.0996092 valid_1's auc: 0.825948 valid_1's binary_logloss: 0.137293 [105] training's auc: 0.936184 training's binary_logloss: 0.0994004 valid_1's auc: 0.826048 valid_1's binary_logloss: 0.1373 [106] training's auc: 0.936547 training's binary_logloss: 0.0992208 valid_1's auc: 0.826359 valid_1's binary_logloss: 0.137259 [107] training's auc: 0.936846 training's binary_logloss: 0.0990304 valid_1's auc: 0.826353 valid_1's binary_logloss: 0.137282 [108] training's auc: 0.937006 training's binary_logloss: 0.0988933 valid_1's auc: 0.826189 valid_1's binary_logloss: 0.137327 [109] training's auc: 0.937152 training's binary_logloss: 0.0987827 valid_1's auc: 0.826255 valid_1's binary_logloss: 0.137325 [110] training's auc: 0.937245 training's binary_logloss: 0.0986977 valid_1's auc: 0.8262 valid_1's binary_logloss: 0.137343 [111] training's auc: 0.937305 training's binary_logloss: 0.0986243 valid_1's auc: 0.826248 valid_1's binary_logloss: 0.137352 [112] training's auc: 0.937666 training's binary_logloss: 0.0984418 
valid_1's auc: 0.82597 valid_1's binary_logloss: 0.137417 [113] training's auc: 0.938102 training's binary_logloss: 0.0982409 valid_1's auc: 0.826053 valid_1's binary_logloss: 0.137455 [114] training's auc: 0.938354 training's binary_logloss: 0.0980808 valid_1's auc: 0.826104 valid_1's binary_logloss: 0.137448 [115] training's auc: 0.938703 training's binary_logloss: 0.0978838 valid_1's auc: 0.826029 valid_1's binary_logloss: 0.137478 [116] training's auc: 0.938934 training's binary_logloss: 0.0977793 valid_1's auc: 0.826014 valid_1's binary_logloss: 0.137515 [117] training's auc: 0.939166 training's binary_logloss: 0.097632 valid_1's auc: 0.825926 valid_1's binary_logloss: 0.137547 [118] training's auc: 0.939695 training's binary_logloss: 0.0974914 valid_1's auc: 0.826264 valid_1's binary_logloss: 0.137488 [119] training's auc: 0.939859 training's binary_logloss: 0.0973563 valid_1's auc: 0.826174 valid_1's binary_logloss: 0.137525 [120] training's auc: 0.93992 training's binary_logloss: 0.0972722 valid_1's auc: 0.826248 valid_1's binary_logloss: 0.137533 [121] training's auc: 0.940061 training's binary_logloss: 0.097149 valid_1's auc: 0.826211 valid_1's binary_logloss: 0.137572 [122] training's auc: 0.940195 training's binary_logloss: 0.0970275 valid_1's auc: 0.826157 valid_1's binary_logloss: 0.137581 [123] training's auc: 0.940249 training's binary_logloss: 0.096942 valid_1's auc: 0.826171 valid_1's binary_logloss: 0.137587 [124] training's auc: 0.940331 training's binary_logloss: 0.0968476 valid_1's auc: 0.826036 valid_1's binary_logloss: 0.137621 [125] training's auc: 0.940438 training's binary_logloss: 0.0967391 valid_1's auc: 0.825829 valid_1's binary_logloss: 0.13768 [126] training's auc: 0.940764 training's binary_logloss: 0.0965667 valid_1's auc: 0.825712 valid_1's binary_logloss: 0.137745 [127] training's auc: 0.940971 training's binary_logloss: 0.0964585 valid_1's auc: 0.825579 valid_1's binary_logloss: 0.137812 [128] training's auc: 0.941089 training's 
binary_logloss: 0.0963486 valid_1's auc: 0.825555 valid_1's binary_logloss: 0.137846 [129] training's auc: 0.941314 training's binary_logloss: 0.0961895 valid_1's auc: 0.825735 valid_1's binary_logloss: 0.13782 [130] training's auc: 0.941521 training's binary_logloss: 0.0960485 valid_1's auc: 0.825674 valid_1's binary_logloss: 0.13787 [131] training's auc: 0.941919 training's binary_logloss: 0.0958162 valid_1's auc: 0.825686 valid_1's binary_logloss: 0.137846 [132] training's auc: 0.941973 training's binary_logloss: 0.0957536 valid_1's auc: 0.825406 valid_1's binary_logloss: 0.13791 [133] training's auc: 0.942172 training's binary_logloss: 0.0956215 valid_1's auc: 0.825463 valid_1's binary_logloss: 0.137934 [134] training's auc: 0.942253 training's binary_logloss: 0.0955278 valid_1's auc: 0.825323 valid_1's binary_logloss: 0.137962 [135] training's auc: 0.942364 training's binary_logloss: 0.0954322 valid_1's auc: 0.825133 valid_1's binary_logloss: 0.138019 [136] training's auc: 0.942501 training's binary_logloss: 0.0953015 valid_1's auc: 0.824859 valid_1's binary_logloss: 0.138098 [137] training's auc: 0.942573 training's binary_logloss: 0.0952128 valid_1's auc: 0.824764 valid_1's binary_logloss: 0.138157 Early stopping, best iteration is: [37] training's auc: 0.907215 training's binary_logloss: 0.113956 valid_1's auc: 0.830227 valid_1's binary_logloss: 0.135748 roc auc: 0.8389
In [61]:
from hyperopt import hp

lgbm_search_space = {'num_leaves': hp.quniform('num_leaves', 32, 64, 1),
                     'max_depth': hp.quniform('max_depth', 100, 160, 1),
                     'min_child_samples': hp.quniform('min_child_samples', 60, 100, 1),
                     'subsample': hp.uniform('subsample', 0.7, 1),
                     'learning_rate': hp.uniform('learning_rate', 0.01, 0.2)
                     }
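Note that `hp.quniform` draws values on a discrete grid but still returns them as floats, which is why the objective function casts them with `int()`. A minimal pure-Python sketch of the documented sampling formula, `round(uniform(low, high) / q) * q` (an illustration of the behavior, not hyperopt's actual implementation):

```python
import random

def quniform_sample(low, high, q, rng=random.Random(42)):
    # hyperopt docs: quniform(label, low, high, q) draws
    # round(uniform(low, high) / q) * q -- a float landing on the q-grid
    return round(rng.uniform(low, high) / q) * q

samples = [quniform_sample(32, 64, 1) for _ in range(100)]
# every sample lies on the grid inside [32, 64], but hyperopt hands it
# back as a float, hence the int() casts in the objective function
assert all(32 <= s <= 64 and s == int(s) for s in samples)
```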
In [67]:
from lightgbm import LGBMClassifier
from sklearn.model_selection import KFold
from sklearn.metrics import roc_auc_score

def objective_func(search_space):
    # hp.quniform returns floats, so integer hyperparameters must be cast with int()
    lgbm_clf = LGBMClassifier(n_estimators=100,
                              num_leaves=int(search_space['num_leaves']),
                              max_depth=int(search_space['max_depth']),
                              min_child_samples=int(search_space['min_child_samples']),
                              subsample=search_space['subsample'],
                              learning_rate=search_space['learning_rate']
                              )
    # List to collect the roc_auc scores from 3-fold cross-validation
    roc_auc_list = []
    # Apply 3-fold cross-validation
    kf = KFold(n_splits=3)
    # Split X_train again into training and validation sets
    for tr_index, val_index in kf.split(X_train):
        # Build the fold's train/validation sets from the indices returned by kf.split(X_train)
        X_tr, y_tr = X_train.iloc[tr_index], y_train.iloc[tr_index]
        X_val, y_val = X_train.iloc[val_index], y_train.iloc[val_index]
        # Train the LGBMClassifier on the fold, with early stopping set to 30 rounds
        lgbm_clf.fit(X_tr, y_tr, early_stopping_rounds=30, eval_metric='auc',
                     eval_set=[(X_tr, y_tr), (X_val, y_val)])
        # Extract the predicted probability of class 1, compute roc_auc,
        # and collect it for averaging
        score = roc_auc_score(y_val, lgbm_clf.predict_proba(X_val)[:, 1])
        roc_auc_list.append(score)
    # Return the mean roc_auc over the 3 folds; HyperOpt searches for the input
    # that minimizes the objective, so multiply by -1 before returning
    return -1 * np.mean(roc_auc_list)
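`fmin` only minimizes, so returning `-1 * np.mean(roc_auc_list)` turns the maximization of ROC-AUC into a minimization problem: the input that minimizes the negated score is exactly the one that maximizes the score. A tiny sketch with hypothetical scores makes the equivalence concrete:

```python
# Hypothetical learning_rate -> mean ROC-AUC scores (illustration only)
scores = {0.01: 0.812, 0.05: 0.841, 0.10: 0.833}

# Minimizing the negated score ...
best_by_min = min(scores, key=lambda lr: -1 * scores[lr])
# ... picks the same value as maximizing the score directly
best_by_max = max(scores, key=lambda lr: scores[lr])
assert best_by_min == best_by_max == 0.05
```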
In [68]:
from hyperopt import fmin, tpe, Trials

trials = Trials()

# Call fmin(): repeat up to max_evals times, then extract the input values
# that minimize the objective function
best = fmin(fn=objective_func,
            space=lgbm_search_space,
            algo=tpe.suggest,
            max_evals=50,  # maximum number of evaluations
            trials=trials)
print('best:', best)
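`fmin` returns `best` as a dict of floats, even for the `quniform`-sampled integer hyperparameters, so the same `int()` casting is needed before retraining a final model. A sketch with a hypothetical `best` dict (the actual values depend on the search run):

```python
# Hypothetical 'best' dict, in the shape fmin returns (all values are floats)
best = {'num_leaves': 44.0, 'max_depth': 125.0, 'min_child_samples': 84.0,
        'subsample': 0.88, 'learning_rate': 0.053}

int_params = ('num_leaves', 'max_depth', 'min_child_samples')
final_params = {k: int(v) if k in int_params else v for k, v in best.items()}
# final_params can now be passed on, e.g.
# LGBMClassifier(n_estimators=100, **final_params)
assert final_params['num_leaves'] == 44
```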
Streaming output was too long, so it was truncated (Colab keeps only the last 5,000 lines).
[26] training's auc: 0.893943 training's binary_logloss: 0.118538 valid_1's auc: 0.828387 valid_1's binary_logloss: 0.136273
[27] training's auc: 0.895348 training's binary_logloss: 0.118083 valid_1's auc: 0.828389 valid_1's binary_logloss: 0.13629
[28] training's auc: 0.897215 training's binary_logloss: 0.117652 valid_1's auc: 0.828829 valid_1's binary_logloss: 0.136258
[29] training's auc: 0.898446 training's binary_logloss: 0.11718 valid_1's auc: 0.828693 valid_1's binary_logloss: 0.1363
[30] training's auc: 0.899462 training's binary_logloss: 0.116727 valid_1's auc: 0.828671 valid_1's binary_logloss: 0.136359
[31] training's auc: 0.90079 training's binary_logloss: 0.116276 valid_1's auc: 0.828776 valid_1's binary_logloss: 0.136377
[32] training's auc: 0.902014 training's binary_logloss: 0.115821 valid_1's auc: 0.828672 valid_1's binary_logloss: 0.136427
[33] training's auc: 0.902993 training's binary_logloss: 0.11541 valid_1's auc: 0.829017 valid_1's binary_logloss: 0.136413
[34] training's auc: 0.903825 training's binary_logloss: 0.115064 valid_1's auc: 0.828407 valid_1's binary_logloss: 0.136542
[35] training's auc: 0.904942 training's binary_logloss: 0.114685 valid_1's auc: 0.827871 valid_1's binary_logloss: 0.136651
[36] training's auc: 0.905538 training's binary_logloss: 0.114426 valid_1's auc: 0.827279 valid_1's binary_logloss: 0.136785
[37] training's auc: 0.906151 training's binary_logloss: 0.114126 valid_1's auc: 0.826696 valid_1's binary_logloss: 0.136918
[38] training's auc: 0.906912 training's binary_logloss: 0.113783 valid_1's auc: 0.826664 valid_1's binary_logloss: 0.136977
[39] training's auc: 0.907909 training's binary_logloss: 0.11348 valid_1's auc: 0.826491 valid_1's binary_logloss: 0.137029
[40] training's auc: 0.908619 training's binary_logloss: 0.113121 valid_1's auc: 0.826029 valid_1's binary_logloss: 0.137166
[41] training's auc: 0.909205 training's binary_logloss: 0.112766 valid_1's auc: 0.825728 valid_1's binary_logloss: 0.137284
[42] training's auc: 0.90977 training's binary_logloss: 0.11245 valid_1's auc: 0.825247 valid_1's binary_logloss: 0.137417
[43] training's auc: 0.910884 training's binary_logloss: 0.112077 valid_1's auc: 0.824829 valid_1's binary_logloss: 0.13751
[44] training's auc: 0.911851 training's binary_logloss: 0.111685 valid_1's auc: 0.824376 valid_1's binary_logloss: 0.137586
[45] training's auc: 0.912736 training's binary_logloss: 0.111298 valid_1's auc: 0.824272 valid_1's binary_logloss: 0.137673
[46] training's auc: 0.913436 training's binary_logloss: 0.110934 valid_1's auc: 0.823938 valid_1's binary_logloss: 0.137749
[47] training's auc: 0.914264 training's binary_logloss: 0.110566 valid_1's auc: 0.823527 valid_1's binary_logloss: 0.137887
[48] training's auc: 0.914716 training's binary_logloss: 0.110275 valid_1's auc: 0.822898 valid_1's binary_logloss: 0.138041
Early stopping, best iteration is:
[18] training's auc: 0.882763 training's binary_logloss: 0.122803 valid_1's auc: 0.829078 valid_1's binary_logloss: 0.136389
[1] training's auc: 0.826103 training's binary_logloss: 0.155963 valid_1's auc: 0.813834 valid_1's binary_logloss: 0.151558
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.835675 training's binary_logloss: 0.149801 valid_1's auc: 0.821421 valid_1's binary_logloss: 0.146514
[3] training's auc: 0.842676 training's binary_logloss: 0.145555 valid_1's auc: 0.824966 valid_1's binary_logloss: 0.142942
[4] training's auc: 0.845444 training's binary_logloss: 0.142325 valid_1's auc: 0.82711 valid_1's binary_logloss: 0.14044
[5] training's auc: 0.850933 training's binary_logloss: 0.139611 valid_1's auc: 0.831152 valid_1's binary_logloss: 0.138608
[6] training's auc: 0.853522 training's binary_logloss: 0.137472 valid_1's auc: 0.834374 valid_1's binary_logloss: 0.137003
[7] training's auc: 0.858214 training's binary_logloss: 0.135628 valid_1's auc: 0.83481 valid_1's binary_logloss: 0.135841
[8] training's auc: 0.862843 training's binary_logloss: 0.133961 valid_1's auc: 0.834364 valid_1's binary_logloss: 0.135003
[9] training's auc: 0.865776 training's binary_logloss: 0.132583 valid_1's auc: 0.83514 valid_1's binary_logloss: 0.134154
[10] training's auc: 0.867456 training's binary_logloss: 0.131389 valid_1's auc: 0.835112 valid_1's binary_logloss: 0.133554
[11] training's auc: 0.86938 training's binary_logloss: 0.130357 valid_1's auc: 0.834496 valid_1's binary_logloss: 0.13305
[12] training's auc: 0.871001 training's binary_logloss: 0.1293 valid_1's auc: 0.833799 valid_1's binary_logloss: 0.132681
[13] training's auc: 0.873806 training's binary_logloss: 0.128321 valid_1's auc: 0.83387 valid_1's binary_logloss: 0.132387
[14] training's auc: 0.875751 training's binary_logloss: 0.127477 valid_1's auc: 0.834548 valid_1's binary_logloss: 0.132079
[15] training's auc: 0.877419 training's binary_logloss: 0.126633 valid_1's auc: 0.834649 valid_1's binary_logloss: 0.131873
[16] training's auc: 0.878987 training's binary_logloss: 0.125881 valid_1's auc: 0.834401 valid_1's binary_logloss: 0.131661
[17] training's auc: 0.880354 training's binary_logloss: 0.12521 valid_1's auc: 0.834204 valid_1's binary_logloss: 0.131495
[18] training's auc: 0.881525 training's binary_logloss: 0.124621 valid_1's auc: 0.834656 valid_1's binary_logloss: 0.131345
[19] training's auc: 0.884134 training's binary_logloss: 0.123937 valid_1's auc: 0.834725 valid_1's binary_logloss: 0.131242
[20] training's auc: 0.885136 training's binary_logloss: 0.123397 valid_1's auc: 0.83405 valid_1's binary_logloss: 0.131238
[21] training's auc: 0.886301 training's binary_logloss: 0.122856 valid_1's auc: 0.834245 valid_1's binary_logloss: 0.13116
[22] training's auc: 0.887861 training's binary_logloss: 0.122356 valid_1's auc: 0.834853 valid_1's binary_logloss: 0.130988
[23] training's auc: 0.889085 training's binary_logloss: 0.121831 valid_1's auc: 0.834729 valid_1's binary_logloss: 0.131
[24] training's auc: 0.890874 training's binary_logloss: 0.121321 valid_1's auc: 0.835093 valid_1's binary_logloss: 0.130939
[25] training's auc: 0.892239 training's binary_logloss: 0.120861 valid_1's auc: 0.83536 valid_1's binary_logloss: 0.130877
[26] training's auc: 0.893592 training's binary_logloss: 0.120423 valid_1's auc: 0.834835 valid_1's binary_logloss: 0.130928
[27] training's auc: 0.89523 training's binary_logloss: 0.119999 valid_1's auc: 0.834678 valid_1's binary_logloss: 0.130932
[28] training's auc: 0.896671 training's binary_logloss: 0.119586 valid_1's auc: 0.834484 valid_1's binary_logloss: 0.130971
[29] training's auc: 0.897796 training's binary_logloss: 0.119133 valid_1's auc: 0.834388 valid_1's binary_logloss: 0.130987
[30] training's auc: 0.899261 training's binary_logloss: 0.118678 valid_1's auc: 0.834472 valid_1's binary_logloss: 0.130992
[31] training's auc: 0.900404 training's binary_logloss: 0.118198 valid_1's auc: 0.834108 valid_1's binary_logloss: 0.131119
[32] training's auc: 0.901276 training's binary_logloss: 0.117779 valid_1's auc: 0.83372 valid_1's binary_logloss: 0.131185
[33] training's auc: 0.902184 training's binary_logloss: 0.11738 valid_1's auc: 0.834135 valid_1's binary_logloss: 0.13116
[34] training's auc: 0.903378 training's binary_logloss: 0.116938 valid_1's auc: 0.833526 valid_1's binary_logloss: 0.131293
[35] training's auc: 0.90394 training's binary_logloss: 0.116651 valid_1's auc: 0.833493 valid_1's binary_logloss: 0.131293
[36] training's auc: 0.905022 training's binary_logloss: 0.116281 valid_1's auc: 0.833218 valid_1's binary_logloss: 0.131355
[37] training's auc: 0.905688 training's binary_logloss: 0.115925 valid_1's auc: 0.833205 valid_1's binary_logloss: 0.131404
[38] training's auc: 0.906329 training's binary_logloss: 0.115601 valid_1's auc: 0.832817 valid_1's binary_logloss: 0.131479
[39] training's auc: 0.906987 training's binary_logloss: 0.115242 valid_1's auc: 0.832719 valid_1's binary_logloss: 0.131473
[40] training's auc: 0.90761 training's binary_logloss: 0.114954 valid_1's auc: 0.832476 valid_1's binary_logloss: 0.131554
[41] training's auc: 0.908331 training's binary_logloss: 0.114617 valid_1's auc: 0.832245 valid_1's binary_logloss: 0.131573
[42] training's auc: 0.908831 training's binary_logloss: 0.114299 valid_1's auc: 0.832061 valid_1's binary_logloss: 0.131615
[43] training's auc: 0.909421 training's binary_logloss: 0.11406 valid_1's auc: 0.831961 valid_1's binary_logloss: 0.13166
[44] training's auc: 0.910112 training's binary_logloss: 0.113786 valid_1's auc: 0.832201 valid_1's binary_logloss: 0.131626
[45] training's auc: 0.910811 training's binary_logloss: 0.113428 valid_1's auc: 0.83206 valid_1's binary_logloss: 0.131654
[46] training's auc: 0.911409 training's binary_logloss: 0.113122 valid_1's auc: 0.831873 valid_1's binary_logloss: 0.131695
[47] training's auc: 0.912197 training's binary_logloss: 0.11278 valid_1's auc: 0.832059 valid_1's binary_logloss: 0.131706
[48] training's auc: 0.912619 training's binary_logloss: 0.112525 valid_1's auc: 0.832046 valid_1's binary_logloss: 0.13171
[49] training's auc: 0.91328 training's binary_logloss: 0.112203 valid_1's auc: 0.831994 valid_1's binary_logloss: 0.131755
[50] training's auc: 0.913879 training's binary_logloss: 0.111934 valid_1's auc: 0.832078 valid_1's binary_logloss: 0.131746
[51] training's auc: 0.915429 training's binary_logloss: 0.111492 valid_1's auc: 0.831934 valid_1's binary_logloss: 0.131796
[52] training's auc: 0.915803 training's binary_logloss: 0.111273 valid_1's auc: 0.831535 valid_1's binary_logloss: 0.131896
[53] training's auc: 0.916148 training's binary_logloss: 0.111058 valid_1's auc: 0.831451 valid_1's binary_logloss: 0.131891
[54] training's auc: 0.916488 training's binary_logloss: 0.110826 valid_1's auc: 0.831117 valid_1's binary_logloss: 0.131968
[55] training's auc: 0.917115 training's binary_logloss: 0.110535 valid_1's auc: 0.831014 valid_1's binary_logloss: 0.131953
Early stopping, best iteration is:
[25] training's auc: 0.892239 training's binary_logloss: 0.120861 valid_1's auc: 0.83536 valid_1's binary_logloss: 0.130877
[1] training's auc: 0.826693 training's binary_logloss: 0.152513 valid_1's auc: 0.806983 valid_1's binary_logloss: 0.158812
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.837299 training's binary_logloss: 0.146498 valid_1's auc: 0.815637 valid_1's binary_logloss: 0.153238
[3] training's auc: 0.845156 training's binary_logloss: 0.142275 valid_1's auc: 0.823589 valid_1's binary_logloss: 0.149536
[4] training's auc: 0.848999 training's binary_logloss: 0.138932 valid_1's auc: 0.827002 valid_1's binary_logloss: 0.146717
[5] training's auc: 0.851518 training's binary_logloss: 0.136391 valid_1's auc: 0.828307 valid_1's binary_logloss: 0.144731
[6] training's auc: 0.853266 training's binary_logloss: 0.134254 valid_1's auc: 0.829161 valid_1's binary_logloss: 0.143167
[7] training's auc: 0.855495 training's binary_logloss: 0.132441 valid_1's auc: 0.830027 valid_1's binary_logloss: 0.141978
[8] training's auc: 0.859876 training's binary_logloss: 0.130992 valid_1's auc: 0.831239 valid_1's binary_logloss: 0.140913
[9] training's auc: 0.862323 training's binary_logloss: 0.129654 valid_1's auc: 0.831169 valid_1's binary_logloss: 0.140158
[10] training's auc: 0.86582 training's binary_logloss: 0.128543 valid_1's auc: 0.832992 valid_1's binary_logloss: 0.13952
[11] training's auc: 0.868777 training's binary_logloss: 0.127518 valid_1's auc: 0.833515 valid_1's binary_logloss: 0.138949
[12] training's auc: 0.871409 training's binary_logloss: 0.126584 valid_1's auc: 0.833555 valid_1's binary_logloss: 0.138494
[13] training's auc: 0.874193 training's binary_logloss: 0.125608 valid_1's auc: 0.834794 valid_1's binary_logloss: 0.138131
[14] training's auc: 0.876946 training's binary_logloss: 0.124803 valid_1's auc: 0.834939 valid_1's binary_logloss: 0.137849
[15] training's auc: 0.87858 training's binary_logloss: 0.124082 valid_1's auc: 0.834439 valid_1's binary_logloss: 0.137667
[16] training's auc: 0.880326 training's binary_logloss: 0.123405 valid_1's auc: 0.834133 valid_1's binary_logloss: 0.137511
[17] training's auc: 0.882023 training's binary_logloss: 0.122656 valid_1's auc: 0.833744 valid_1's binary_logloss: 0.137432
[18] training's auc: 0.883693 training's binary_logloss: 0.122027 valid_1's auc: 0.834274 valid_1's binary_logloss: 0.137226
[19] training's auc: 0.884886 training's binary_logloss: 0.121469 valid_1's auc: 0.834245 valid_1's binary_logloss: 0.13714
[20] training's auc: 0.885855 training's binary_logloss: 0.120947 valid_1's auc: 0.834293 valid_1's binary_logloss: 0.137026
[21] training's auc: 0.887232 training's binary_logloss: 0.120394 valid_1's auc: 0.834016 valid_1's binary_logloss: 0.136967
[22] training's auc: 0.888847 training's binary_logloss: 0.119846 valid_1's auc: 0.834082 valid_1's binary_logloss: 0.136867
[23] training's auc: 0.890118 training's binary_logloss: 0.119334 valid_1's auc: 0.833646 valid_1's binary_logloss: 0.136932
[24] training's auc: 0.891246 training's binary_logloss: 0.118891 valid_1's auc: 0.833644 valid_1's binary_logloss: 0.13694
[25] training's auc: 0.892328 training's binary_logloss: 0.11844 valid_1's auc: 0.833753 valid_1's binary_logloss: 0.136935
[26] training's auc: 0.893997 training's binary_logloss: 0.117835 valid_1's auc: 0.833349 valid_1's binary_logloss: 0.136947
[27] training's auc: 0.895199 training's binary_logloss: 0.117363 valid_1's auc: 0.833358 valid_1's binary_logloss: 0.136949
[28] training's auc: 0.896728 training's binary_logloss: 0.116879 valid_1's auc: 0.833397 valid_1's binary_logloss: 0.136945
[29] training's auc: 0.898125 training's binary_logloss: 0.116373 valid_1's auc: 0.833404 valid_1's binary_logloss: 0.136993
[30] training's auc: 0.899113 training's binary_logloss: 0.115938 valid_1's auc: 0.833078 valid_1's binary_logloss: 0.137079
[31] training's auc: 0.900072 training's binary_logloss: 0.115535 valid_1's auc: 0.833271 valid_1's binary_logloss: 0.137031
[32] training's auc: 0.901184 training's binary_logloss: 0.115115 valid_1's auc: 0.833409 valid_1's binary_logloss: 0.137024
[33] training's auc: 0.90255 training's binary_logloss: 0.114611 valid_1's auc: 0.833015 valid_1's binary_logloss: 0.137086
[34] training's auc: 0.903764 training's binary_logloss: 0.114168 valid_1's auc: 0.833119 valid_1's binary_logloss: 0.137068
[35] training's auc: 0.904917 training's binary_logloss: 0.113739 valid_1's auc: 0.832926 valid_1's binary_logloss: 0.137145
[36] training's auc: 0.905825 training's binary_logloss: 0.113378 valid_1's auc: 0.832718 valid_1's binary_logloss: 0.137178
[37] training's auc: 0.906547 training's binary_logloss: 0.113034 valid_1's auc: 0.832775 valid_1's binary_logloss: 0.137221
[38] training's auc: 0.907349 training's binary_logloss: 0.112657 valid_1's auc: 0.832456 valid_1's binary_logloss: 0.137333
[39] training's auc: 0.908394 training's binary_logloss: 0.11224 valid_1's auc: 0.83254 valid_1's binary_logloss: 0.137299
[40] training's auc: 0.909315 training's binary_logloss: 0.111931 valid_1's auc: 0.832718 valid_1's binary_logloss: 0.137284
[41] training's auc: 0.909842 training's binary_logloss: 0.111655 valid_1's auc: 0.832437 valid_1's binary_logloss: 0.137339
[42] training's auc: 0.910303 training's binary_logloss: 0.111364 valid_1's auc: 0.832278 valid_1's binary_logloss: 0.137418
[43] training's auc: 0.910819 training's binary_logloss: 0.111077 valid_1's auc: 0.832069 valid_1's binary_logloss: 0.137462
[44] training's auc: 0.911395 training's binary_logloss: 0.110762 valid_1's auc: 0.831548 valid_1's binary_logloss: 0.137578
Early stopping, best iteration is:
[14] training's auc: 0.876946 training's binary_logloss: 0.124803 valid_1's auc: 0.834939 valid_1's binary_logloss: 0.137849
[1] training's auc: 0.833002 training's binary_logloss: 0.155934 valid_1's auc: 0.808905 valid_1's binary_logloss: 0.158304
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.839472 training's binary_logloss: 0.150649 valid_1's auc: 0.810532 valid_1's binary_logloss: 0.154002
[3] training's auc: 0.847078 training's binary_logloss: 0.146663 valid_1's auc: 0.816377 valid_1's binary_logloss: 0.150765
[4] training's auc: 0.852057 training's binary_logloss: 0.143527 valid_1's auc: 0.820275 valid_1's binary_logloss: 0.148236
[5] training's auc: 0.853878 training's binary_logloss: 0.140898 valid_1's auc: 0.82282 valid_1's binary_logloss: 0.146292
[6] training's auc: 0.856952 training's binary_logloss: 0.138716 valid_1's auc: 0.824184 valid_1's binary_logloss: 0.144748
[7] training's auc: 0.858673 training's binary_logloss: 0.13683 valid_1's auc: 0.82494 valid_1's binary_logloss: 0.143278
[8] training's auc: 0.861311 training's binary_logloss: 0.135236 valid_1's auc: 0.824891 valid_1's binary_logloss: 0.142171
[9] training's auc: 0.863812 training's binary_logloss: 0.133821 valid_1's auc: 0.825075 valid_1's binary_logloss: 0.141287
... (iterations 10-62 omitted) ...
[63] training's auc: 0.91844 training's binary_logloss: 0.109588 valid_1's auc: 0.826346 valid_1's binary_logloss: 0.137097
Early stopping, best iteration is:
[33] training's auc: 0.895529 training's binary_logloss: 0.118528 valid_1's auc: 0.831459 valid_1's binary_logloss: 0.135744
[1] training's auc: 0.827317 training's binary_logloss: 0.158538 valid_1's auc: 0.81468 valid_1's binary_logloss: 0.153787
Training until validation scores don't improve for 30 rounds.
... (iterations 2-60 omitted) ...
[61] training's auc: 0.915842 training's binary_logloss: 0.111911 valid_1's auc: 0.834357 valid_1's binary_logloss: 0.130898
Early stopping, best iteration is:
[31] training's auc: 0.891033 training's binary_logloss: 0.121149 valid_1's auc: 0.837522 valid_1's binary_logloss: 0.130461
[1] training's auc: 0.828074 training's binary_logloss: 0.155045 valid_1's auc: 0.808517 valid_1's binary_logloss: 0.161167
Training until validation scores don't improve for 30 rounds.
... (iterations 2-52 omitted) ...
[53] training's auc: 0.913025 training's binary_logloss: 0.111007 valid_1's auc: 0.833038 valid_1's binary_logloss: 0.137163
Early stopping, best iteration is:
[23] training's auc: 0.884475 training's binary_logloss: 0.121836 valid_1's auc: 0.837437 valid_1's binary_logloss: 0.1368
[1] training's auc: 0.837692 training's binary_logloss: 0.156994 valid_1's auc: 0.809128 valid_1's binary_logloss: 0.159271
Training until validation scores don't improve for 30 rounds.
... (iterations 2-56 omitted) ...
[57] training's auc: 0.917742 training's binary_logloss: 0.109975 valid_1's auc: 0.82655 valid_1's binary_logloss: 0.137011
Early stopping, best iteration is:
[27] training's auc: 0.891353 training's binary_logloss: 0.120984 valid_1's auc: 0.829793 valid_1's binary_logloss: 0.136453
[1] training's auc: 0.832817 training's binary_logloss: 0.159519 valid_1's auc: 0.814571 valid_1's binary_logloss: 0.154796
Training until validation scores don't improve for 30 rounds.
... (iterations 2-41 omitted) ...
[42] training's auc: 0.906096 training's binary_logloss: 0.11656 valid_1's auc: 0.833518 valid_1's binary_logloss: 0.131029
Early stopping, best iteration is:
[12] training's auc: 0.870657 training's binary_logloss: 0.133533 valid_1's auc: 0.833825 valid_1's binary_logloss: 0.135621
[1] training's auc: 0.834179 training's binary_logloss: 0.156104 valid_1's auc: 0.814018 valid_1's binary_logloss: 0.162197
Training until validation scores don't improve for 30 rounds.
[2]–[50] … (per-iteration training/validation metrics omitted)
Early stopping, best iteration is:
[20] training's auc: 0.883216 training's binary_logloss: 0.124156 valid_1's auc: 0.836013 valid_1's binary_logloss: 0.137893
[1] training's auc: 0.832718 training's binary_logloss: 0.15276 valid_1's auc: 0.809735 valid_1's binary_logloss: 0.155658
Training until validation scores don't improve for 30 rounds.
[2]–[50] … (per-iteration training/validation metrics omitted)
Early stopping, best iteration is:
[20] training's auc: 0.889471 training's binary_logloss: 0.120947 valid_1's auc: 0.830846 valid_1's binary_logloss: 0.135963
[1] training's auc: 0.828213 training's binary_logloss: 0.155353 valid_1's auc: 0.813102 valid_1's binary_logloss: 0.151305
Training until validation scores don't improve for 30 rounds.
[2]–[58] … (per-iteration training/validation metrics omitted)
Early stopping, best iteration is:
[28] training's auc: 0.898864 training's binary_logloss: 0.118722 valid_1's auc: 0.837065 valid_1's binary_logloss: 0.130499
[1] training's auc: 0.829115 training's binary_logloss: 0.152192 valid_1's auc: 0.812545 valid_1's binary_logloss: 0.158352
Training until validation scores don't improve for 30 rounds.
[2]–[48] … (per-iteration training/validation metrics omitted)
Early stopping, best iteration is:
[18] training's auc: 0.885035 training's binary_logloss: 0.121167 valid_1's auc: 0.836319 valid_1's binary_logloss: 0.136731
[1] training's auc: 0.834391 training's binary_logloss: 0.15818 valid_1's auc: 0.809965 valid_1's binary_logloss: 0.16021
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.840626 training's binary_logloss: 0.153728 valid_1's auc: 0.809547 valid_1's binary_logloss: 0.156634
[3] training's auc: 0.843409 training's binary_logloss: 0.150239 valid_1's auc: 0.812254 valid_1's binary_logloss: 0.153802
[4] training's auc: 0.850013 training's binary_logloss: 0.147349 valid_1's auc: 0.816763 valid_1's binary_logloss: 0.151369
[5] training's auc: 0.851714 training's binary_logloss: 0.144915 valid_1's auc: 0.820572 valid_1's binary_logloss: 0.149386
[6] training's auc: 0.854107 training's binary_logloss: 0.142751 valid_1's auc: 0.821844 valid_1's binary_logloss: 0.147761
[7] training's auc: 0.857721 training's binary_logloss: 0.140863 valid_1's auc: 0.824618 valid_1's binary_logloss: 0.146326
[8] training's auc: 0.861173 training's binary_logloss: 0.139155 valid_1's auc: 0.826424 valid_1's binary_logloss: 0.145092
[9] training's auc: 0.862276 training's binary_logloss: 0.13769 valid_1's auc: 0.827239 valid_1's binary_logloss: 0.144024
[10] training's auc: 0.864357 training's binary_logloss: 0.136344 valid_1's auc: 0.827351 valid_1's binary_logloss: 0.143126
[11] training's auc: 0.865466 training's binary_logloss: 0.135151 valid_1's auc: 0.827166 valid_1's binary_logloss: 0.142345
[12] training's auc: 0.86683 training's binary_logloss: 0.13404 valid_1's auc: 0.827634 valid_1's binary_logloss: 0.141615
[13] training's auc: 0.867896 training's binary_logloss: 0.133006 valid_1's auc: 0.829345 valid_1's binary_logloss: 0.140938
[14] training's auc: 0.869981 training's binary_logloss: 0.132038 valid_1's auc: 0.82926 valid_1's binary_logloss: 0.140392
[15] training's auc: 0.871042 training's binary_logloss: 0.131116 valid_1's auc: 0.828732 valid_1's binary_logloss: 0.139922
[16] training's auc: 0.872698 training's binary_logloss: 0.130299 valid_1's auc: 0.829028 valid_1's binary_logloss: 0.139462
[17] training's auc: 0.874526 training's binary_logloss: 0.129473 valid_1's auc: 0.829579 valid_1's binary_logloss: 0.139006
[18] training's auc: 0.875467 training's binary_logloss: 0.128776 valid_1's auc: 0.829783 valid_1's binary_logloss: 0.13867
[19] training's auc: 0.877542 training's binary_logloss: 0.128066 valid_1's auc: 0.830611 valid_1's binary_logloss: 0.138267
[20] training's auc: 0.878382 training's binary_logloss: 0.127406 valid_1's auc: 0.830768 valid_1's binary_logloss: 0.137966
[21] training's auc: 0.879128 training's binary_logloss: 0.12682 valid_1's auc: 0.83111 valid_1's binary_logloss: 0.13768
[22] training's auc: 0.880408 training's binary_logloss: 0.126218 valid_1's auc: 0.831226 valid_1's binary_logloss: 0.137407
[23] training's auc: 0.881551 training's binary_logloss: 0.12563 valid_1's auc: 0.831118 valid_1's binary_logloss: 0.137244
[24] training's auc: 0.882642 training's binary_logloss: 0.125091 valid_1's auc: 0.831284 valid_1's binary_logloss: 0.137035
[25] training's auc: 0.883903 training's binary_logloss: 0.124573 valid_1's auc: 0.831428 valid_1's binary_logloss: 0.136878
[26] training's auc: 0.884946 training's binary_logloss: 0.124049 valid_1's auc: 0.831213 valid_1's binary_logloss: 0.136736
[27] training's auc: 0.886286 training's binary_logloss: 0.123499 valid_1's auc: 0.831221 valid_1's binary_logloss: 0.13663
...
[74] training's auc: 0.921483 training's binary_logloss: 0.109487 valid_1's auc: 0.829334 valid_1's binary_logloss: 0.136303
Early stopping, best iteration is:
[44] training's auc: 0.902195 training's binary_logloss: 0.116987 valid_1's auc: 0.832446 valid_1's binary_logloss: 0.135614
[1] training's auc: 0.830071 training's binary_logloss: 0.160688 valid_1's auc: 0.814842 valid_1's binary_logloss: 0.155694
Training until validation scores don't improve for 30 rounds.
...
[39] training's auc: 0.89419 training's binary_logloss: 0.120793 valid_1's auc: 0.833624 valid_1's binary_logloss: 0.131148
Early stopping, best iteration is:
[9] training's auc: 0.861207 training's binary_logloss: 0.139873 valid_1's auc: 0.834929 valid_1's binary_logloss: 0.139387
[1] training's auc: 0.833666 training's binary_logloss: 0.157203 valid_1's auc: 0.814486 valid_1's binary_logloss: 0.16315
Training until validation scores don't improve for 30 rounds.
...
[53] training's auc: 0.90702 training's binary_logloss: 0.113846 valid_1's auc: 0.833725 valid_1's binary_logloss: 0.136831
Early stopping, best iteration is:
[23] training's auc: 0.879837 training's binary_logloss: 0.125132 valid_1's auc: 0.835706 valid_1's binary_logloss: 0.138354
[1] training's auc: 0.827689 training's binary_logloss: 0.155308 valid_1's auc: 0.804538 valid_1's binary_logloss: 0.157839
Training until validation scores don't improve for 30 rounds.
...
[56] training's auc: 0.918144 training's binary_logloss: 0.109794 valid_1's auc: 0.828153 valid_1's binary_logloss: 0.136858
Early stopping, best iteration is:
[26] training's auc: 0.891637 training's binary_logloss: 0.120145 valid_1's auc: 0.831151 valid_1's binary_logloss: 0.135853
[1] training's auc: 0.826967 training's binary_logloss: 0.158016 valid_1's auc: 0.813416 valid_1's binary_logloss: 0.153386
Training until validation scores don't improve for 30 rounds.
...
[69] training's auc: 0.923676 training's binary_logloss: 0.108506 valid_1's auc: 0.831806 valid_1's binary_logloss: 0.131806
Early stopping, best iteration is:
[39] training's auc: 0.904689 training's binary_logloss: 0.11699 valid_1's auc: 0.834901 valid_1's binary_logloss: 0.130798
[1] training's auc: 0.827453 training's binary_logloss: 0.154484 valid_1's auc: 0.806813 valid_1's binary_logloss: 0.160709
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[50] omitted)
Early stopping, best iteration is:
[20] training's auc: 0.883022 training's binary_logloss: 0.122646 valid_1's auc: 0.836098 valid_1's binary_logloss: 0.13721
[1] training's auc: 0.826915 training's binary_logloss: 0.153721 valid_1's auc: 0.803825 valid_1's binary_logloss: 0.156525
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[62] omitted)
Early stopping, best iteration is:
[32] training's auc: 0.902015 training's binary_logloss: 0.116151 valid_1's auc: 0.831373 valid_1's binary_logloss: 0.135801
[1] training's auc: 0.826784 training's binary_logloss: 0.156389 valid_1's auc: 0.813873 valid_1's binary_logloss: 0.152047
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[52] omitted)
Early stopping, best iteration is:
[22] training's auc: 0.887761 training's binary_logloss: 0.122453 valid_1's auc: 0.835422 valid_1's binary_logloss: 0.131078
[1] training's auc: 0.826867 training's binary_logloss: 0.153018 valid_1's auc: 0.808756 valid_1's binary_logloss: 0.159249
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[45] omitted)
Early stopping, best iteration is:
[15] training's auc: 0.878236 training's binary_logloss: 0.124226 valid_1's auc: 0.837504 valid_1's binary_logloss: 0.137262
[1] training's auc: 0.838179 training's binary_logloss: 0.163473 valid_1's auc: 0.808559 valid_1's binary_logloss: 0.164526
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[58] omitted)
[59] training's auc: 0.864081 training's binary_logloss: 0.13697 valid_1's auc: 0.826132 valid_1's binary_logloss: 0.143586
[60] training's auc: 0.864459 training's binary_logloss: 0.136747 valid_1's auc: 0.826254 valid_1's binary_logloss: 0.143431
[61] training's auc: 0.864562 training's binary_logloss: 0.136526 valid_1's auc: 0.826193 valid_1's binary_logloss: 0.143282
[62] training's auc: 0.864924 training's binary_logloss: 0.13631 valid_1's auc: 0.826072 valid_1's binary_logloss: 0.143122
[63] training's auc: 0.865083 training's binary_logloss: 0.136098 valid_1's auc: 0.826242 valid_1's binary_logloss: 0.142981
[64] training's auc: 0.865266 training's binary_logloss: 0.13589 valid_1's auc: 0.826299 valid_1's binary_logloss: 0.142835
[65] training's auc: 0.865357 training's binary_logloss: 0.135686 valid_1's auc: 0.826296 valid_1's binary_logloss: 0.142694
[66] training's auc: 0.865412 training's binary_logloss: 0.135487 valid_1's auc: 0.826355 valid_1's binary_logloss: 0.142559
[67] training's auc: 0.865563 training's binary_logloss: 0.13529 valid_1's auc: 0.826676 valid_1's binary_logloss: 0.142421
[68] training's auc: 0.865709 training's binary_logloss: 0.1351 valid_1's auc: 0.826822 valid_1's binary_logloss: 0.142293
[69] training's auc: 0.865932 training's binary_logloss: 0.134909 valid_1's auc: 0.826845 valid_1's binary_logloss: 0.142163
[70] training's auc: 0.866427 training's binary_logloss: 0.134722 valid_1's auc: 0.827262 valid_1's binary_logloss: 0.142037
[71] training's auc: 0.866526 training's binary_logloss: 0.13454 valid_1's auc: 0.827358 valid_1's binary_logloss: 0.141914
[72] training's auc: 0.866915 training's binary_logloss: 0.134351 valid_1's auc: 0.82745 valid_1's binary_logloss: 0.14179
[73] training's auc: 0.867014 training's binary_logloss: 0.13418 valid_1's auc: 0.827425 valid_1's binary_logloss: 0.141674
[74] training's auc: 0.867204 training's binary_logloss: 0.134002 valid_1's auc: 0.827355 valid_1's binary_logloss: 0.141555
[75] training's auc: 0.867409 training's binary_logloss: 0.133829 valid_1's auc: 0.827545 valid_1's binary_logloss: 0.141441
[76] training's auc: 0.867534 training's binary_logloss: 0.13366 valid_1's auc: 0.82767 valid_1's binary_logloss: 0.141329
[77] training's auc: 0.867825 training's binary_logloss: 0.133481 valid_1's auc: 0.827763 valid_1's binary_logloss: 0.141218
[78] training's auc: 0.867965 training's binary_logloss: 0.133309 valid_1's auc: 0.82779 valid_1's binary_logloss: 0.141112
[79] training's auc: 0.868423 training's binary_logloss: 0.133147 valid_1's auc: 0.827811 valid_1's binary_logloss: 0.141014
[80] training's auc: 0.868613 training's binary_logloss: 0.13298 valid_1's auc: 0.827989 valid_1's binary_logloss: 0.140914
[81] training's auc: 0.86901 training's binary_logloss: 0.132814 valid_1's auc: 0.82816 valid_1's binary_logloss: 0.140809
[82] training's auc: 0.869265 training's binary_logloss: 0.132651 valid_1's auc: 0.828316 valid_1's binary_logloss: 0.14071
[83] training's auc: 0.869686 training's binary_logloss: 0.132487 valid_1's auc: 0.828093 valid_1's binary_logloss: 0.140614
[84] training's auc: 0.870032 training's binary_logloss: 0.132326 valid_1's auc: 0.827999 valid_1's binary_logloss: 0.140518
[85] training's auc: 0.870283 training's binary_logloss: 0.132169 valid_1's auc: 0.828062 valid_1's binary_logloss: 0.140422
[86] training's auc: 0.870486 training's binary_logloss: 0.132012 valid_1's auc: 0.828309 valid_1's binary_logloss: 0.140333
[87] training's auc: 0.87064 training's binary_logloss: 0.131863 valid_1's auc: 0.828626 valid_1's binary_logloss: 0.14025
[88] training's auc: 0.871055 training's binary_logloss: 0.131718 valid_1's auc: 0.828568 valid_1's binary_logloss: 0.140164
[89] training's auc: 0.871213 training's binary_logloss: 0.131559 valid_1's auc: 0.828646 valid_1's binary_logloss: 0.140084
[90] training's auc: 0.871644 training's binary_logloss: 0.131407 valid_1's auc: 0.828803 valid_1's binary_logloss: 0.139992
[91] training's auc: 0.871992 training's binary_logloss: 0.131259 valid_1's auc: 0.829026 valid_1's binary_logloss: 0.1399
[92] training's auc: 0.872373 training's binary_logloss: 0.131111 valid_1's auc: 0.828842 valid_1's binary_logloss: 0.139823
[93] training's auc: 0.872708 training's binary_logloss: 0.130964 valid_1's auc: 0.828932 valid_1's binary_logloss: 0.139741
[94] training's auc: 0.872869 training's binary_logloss: 0.13082 valid_1's auc: 0.829061 valid_1's binary_logloss: 0.139655
[95] training's auc: 0.873096 training's binary_logloss: 0.130688 valid_1's auc: 0.829028 valid_1's binary_logloss: 0.139578
[96] training's auc: 0.873368 training's binary_logloss: 0.130545 valid_1's auc: 0.829082 valid_1's binary_logloss: 0.139509
[97] training's auc: 0.873531 training's binary_logloss: 0.130416 valid_1's auc: 0.828986 valid_1's binary_logloss: 0.139437
[98] training's auc: 0.873768 training's binary_logloss: 0.130277 valid_1's auc: 0.829117 valid_1's binary_logloss: 0.139364
[99] training's auc: 0.874014 training's binary_logloss: 0.130139 valid_1's auc: 0.829228 valid_1's binary_logloss: 0.139297
[100] training's auc: 0.874151 training's binary_logloss: 0.130012 valid_1's auc: 0.829079 valid_1's binary_logloss: 0.139229
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.874151 training's binary_logloss: 0.130012 valid_1's auc: 0.829079 valid_1's binary_logloss: 0.139229
[1] training's auc: 0.832798 training's binary_logloss: 0.165859 valid_1's auc: 0.814587 valid_1's binary_logloss: 0.159854
Training until validation scores don't improve for 30 rounds.
...(iterations [2]–[99] omitted)
[100] training's auc: 0.872667 training's binary_logloss: 0.132154 valid_1's auc: 0.834548 valid_1's binary_logloss: 0.134406
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.872667 training's binary_logloss: 0.132154 valid_1's auc: 0.834548 valid_1's binary_logloss: 0.134406
[1] training's auc: 0.833984 training's binary_logloss: 0.162058 valid_1's auc: 0.81354 valid_1's binary_logloss: 0.167538
Training until validation scores don't improve for 30 rounds.
...(iterations [2]–[99] omitted)
[100] training's auc: 0.870756 training's binary_logloss: 0.129549 valid_1's auc: 0.835852 valid_1's binary_logloss: 0.140266
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.870756 training's binary_logloss: 0.129549 valid_1's auc: 0.835852 valid_1's binary_logloss: 0.140266
[1] training's auc: 0.833653 training's binary_logloss: 0.156372 valid_1's auc: 0.808154 valid_1's binary_logloss: 0.15874
Training until validation scores don't improve for 30 rounds.
... (per-iteration metrics for rounds [2]–[54] omitted) ...
Early stopping, best iteration is:
[24] training's auc: 0.885064 training's binary_logloss: 0.122604 valid_1's auc: 0.829645 valid_1's binary_logloss: 0.136507
[1] training's auc: 0.827904 training's binary_logloss: 0.158945 valid_1's auc: 0.813676 valid_1's binary_logloss: 0.154269
Training until validation scores don't improve for 30 rounds.
... (per-iteration metrics for rounds [2]–[67] omitted) ...
Early stopping, best iteration is:
[37] training's auc: 0.898209 training's binary_logloss: 0.119176 valid_1's auc: 0.836606 valid_1's binary_logloss: 0.130681
[1] training's auc: 0.827795 training's binary_logloss: 0.155535 valid_1's auc: 0.809293 valid_1's binary_logloss: 0.161617
Training until validation scores don't improve for 30 rounds.
... (per-iteration metrics for rounds [2]–[56] omitted) ...
Early stopping, best iteration is:
[26] training's auc: 0.887513 training's binary_logloss: 0.120956 valid_1's auc: 0.83589 valid_1's binary_logloss: 0.136965
[1] training's auc: 0.838033 training's binary_logloss: 0.158305 valid_1's auc: 0.808659 valid_1's binary_logloss: 0.160243
Training until validation scores don't improve for 30 rounds.
... (per-iteration metrics for rounds [2]–[74] omitted) ...
Early stopping, best iteration is:
[44] training's auc: 0.899196 training's binary_logloss: 0.117867 valid_1's auc: 0.83184 valid_1's binary_logloss: 0.13564
[1] training's auc: 0.832696 training's binary_logloss: 0.160784 valid_1's auc: 0.815072 valid_1's binary_logloss: 0.155724
Training until validation scores don't improve for 30 rounds.
... (중간 반복 로그 생략) ...
Early stopping, best iteration is:
[51] training's auc: 0.903501 training's binary_logloss: 0.117778 valid_1's auc: 0.834698 valid_1's binary_logloss: 0.130764
[1] training's auc: 0.835741 training's binary_logloss: 0.157307 valid_1's auc: 0.81747 valid_1's binary_logloss: 0.163127
Training until validation scores don't improve for 30 rounds.
... (중간 반복 로그 생략) ...
Early stopping, best iteration is:
[24] training's auc: 0.879707 training's binary_logloss: 0.12514 valid_1's auc: 0.836834 valid_1's binary_logloss: 0.137815
[1] training's auc: 0.837734 training's binary_logloss: 0.161276 valid_1's auc: 0.811237 valid_1's binary_logloss: 0.162739
Training until validation scores don't improve for 30 rounds.
... (중간 반복 로그 생략) ...
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.910406 training's binary_logloss: 0.113633 valid_1's auc: 0.832417 valid_1's binary_logloss: 0.135528
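위 로그에서 "Training until validation scores don't improve for 30 rounds" / "Early stopping, best iteration is: [...]"가 동작하는 방식을 개념적으로만 옮긴 최소 스케치입니다. 함수 이름 `early_stopping_best_iter`와 합성 점수 시퀀스는 설명을 위해 임의로 지은 것이며, LightGBM의 실제 내부 구현이 아닙니다.

```python
# 조기 중단(early stopping) 로직의 최소 스케치.
# 가정: stopping_rounds=30, AUC처럼 값이 클수록 좋은 검증 지표.
def early_stopping_best_iter(valid_scores, stopping_rounds=30, higher_is_better=True):
    """검증 점수가 stopping_rounds 동안 개선되지 않으면 중단하고 best iteration을 반환."""
    best_iter, best_score = 0, None
    for i, score in enumerate(valid_scores, start=1):
        improved = best_score is None or (
            score > best_score if higher_is_better else score < best_score
        )
        if improved:
            best_iter, best_score = i, score
        elif i - best_iter >= stopping_rounds:
            # 로그의 "Early stopping, best iteration is: [best_iter]"에 해당
            break
    return best_iter, best_score

# 합성 예시: 44번째 반복까지 점수가 오르다가 이후 개선이 없는 경우
scores = [i / 100 for i in range(1, 45)] + [0.1] * 60
best_iter, best_score = early_stopping_best_iter(scores)
print(best_iter, best_score)  # 44 0.44
```

이렇게 best iteration을 기억해 두기 때문에, 중단 시점이 74번째 반복이라도 최종 모델 평가에는 44번째 반복의 결과가 쓰입니다.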
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.834968 training's binary_logloss: 0.163773 valid_1's auc: 0.821043 valid_1's binary_logloss: 0.158115
... (iterations [2]–[52] omitted) ...
Early stopping, best iteration is:
[22] training's auc: 0.868152 training's binary_logloss: 0.136956 valid_1's auc: 0.836027 valid_1's binary_logloss: 0.137796
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.831554 training's binary_logloss: 0.160023 valid_1's auc: 0.806887 valid_1's binary_logloss: 0.165809
... (iterations [2]–[79] omitted) ...
Early stopping, best iteration is:
[49] training's auc: 0.885909 training's binary_logloss: 0.123268 valid_1's auc: 0.836602 valid_1's binary_logloss: 0.137996
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.826748 training's binary_logloss: 0.154968 valid_1's auc: 0.803425 valid_1's binary_logloss: 0.157463
... (iterations [2]–[57] omitted) ...
Early stopping, best iteration is:
[27] training's auc: 0.890674 training's binary_logloss: 0.120483 valid_1's auc: 0.83127 valid_1's binary_logloss: 0.135957
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.826536 training's binary_logloss: 0.157612 valid_1's auc: 0.813933 valid_1's binary_logloss: 0.152966
... (iterations [2]–[69] omitted) ...
Early stopping, best iteration is:
[39] training's auc: 0.900326 training's binary_logloss: 0.118053 valid_1's auc: 0.836788 valid_1's binary_logloss: 0.13049
[1] training's auc: 0.826981 training's binary_logloss: 0.154236 valid_1's auc: 0.809544 valid_1's binary_logloss: 0.160225
Training until validation scores don't improve for 30 rounds.
...(iterations [2]–[51] omitted)...
Early stopping, best iteration is:
[21] training's auc: 0.88192 training's binary_logloss: 0.122617 valid_1's auc: 0.836472 valid_1's binary_logloss: 0.136871
[1] training's auc: 0.833554 training's binary_logloss: 0.162544 valid_1's auc: 0.809374 valid_1's binary_logloss: 0.163756
Training until validation scores don't improve for 30 rounds.
...(iterations [2]–[100] omitted)...
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.886509 training's binary_logloss: 0.122684 valid_1's auc: 0.832055 valid_1's binary_logloss: 0.136055
[1] training's auc: 0.827262 training's binary_logloss: 0.164984 valid_1's auc: 0.814459 valid_1's binary_logloss: 0.159102
Training until validation scores don't improve for 30 rounds.
...(iterations [2]–[86] omitted)...
Early stopping, best iteration is:
[56] training's auc: 0.869864 training's binary_logloss: 0.13209 valid_1's auc: 0.835405 valid_1's binary_logloss: 0.13413
[1] training's auc: 0.8281 training's binary_logloss: 0.161191 valid_1's auc: 0.808103 valid_1's binary_logloss: 0.166754
Training until validation scores don't improve for 30 rounds.
...(iterations [2]–[37] omitted)...
[38] training's auc: 0.859857 training's binary_logloss: 0.134179 valid_1's auc: 0.831622 valid_1's binary_logloss: 0.143255
[39] training's auc: 0.860142 training's binary_logloss: 0.13385 valid_1's auc: 0.831666 valid_1's binary_logloss: 0.143012
[40] training's auc: 0.860609 training's binary_logloss: 0.133531 valid_1's auc: 0.831583 valid_1's binary_logloss: 0.142804
[41] training's auc: 0.861011 training's binary_logloss: 0.133229 valid_1's auc: 0.831469 valid_1's binary_logloss: 0.142604
[42] training's auc: 0.861459 training's binary_logloss: 0.13293 valid_1's auc: 0.831849 valid_1's binary_logloss: 0.142375
[43] training's auc: 0.862034 training's binary_logloss: 0.132645 valid_1's auc: 0.832024 valid_1's binary_logloss: 0.142171
[44] training's auc: 0.862624 training's binary_logloss: 0.132348 valid_1's auc: 0.832122 valid_1's binary_logloss: 0.141961
[45] training's auc: 0.863075 training's binary_logloss: 0.132057 valid_1's auc: 0.832244 valid_1's binary_logloss: 0.14178
[46] training's auc: 0.863482 training's binary_logloss: 0.131787 valid_1's auc: 0.832383 valid_1's binary_logloss: 0.141588
[47] training's auc: 0.864013 training's binary_logloss: 0.131514 valid_1's auc: 0.832505 valid_1's binary_logloss: 0.141409
[48] training's auc: 0.864378 training's binary_logloss: 0.131246 valid_1's auc: 0.832461 valid_1's binary_logloss: 0.141255
[49] training's auc: 0.864735 training's binary_logloss: 0.130999 valid_1's auc: 0.832593 valid_1's binary_logloss: 0.141083
[50] training's auc: 0.865054 training's binary_logloss: 0.130751 valid_1's auc: 0.832614 valid_1's binary_logloss: 0.140934
[51] training's auc: 0.86566 training's binary_logloss: 0.130501 valid_1's auc: 0.832665 valid_1's binary_logloss: 0.140759
[52] training's auc: 0.866985 training's binary_logloss: 0.130256 valid_1's auc: 0.833439 valid_1's binary_logloss: 0.140614
[53] training's auc: 0.867368 training's binary_logloss: 0.130028 valid_1's auc: 0.833525 valid_1's binary_logloss: 0.140464
[54] training's auc: 0.867789 training's binary_logloss: 0.129798 valid_1's auc: 0.833693 valid_1's binary_logloss: 0.140316
[55] training's auc: 0.868135 training's binary_logloss: 0.129573 valid_1's auc: 0.83365 valid_1's binary_logloss: 0.14019
[56] training's auc: 0.868443 training's binary_logloss: 0.129356 valid_1's auc: 0.833691 valid_1's binary_logloss: 0.140058
[57] training's auc: 0.869228 training's binary_logloss: 0.12914 valid_1's auc: 0.833753 valid_1's binary_logloss: 0.139923
[58] training's auc: 0.870099 training's binary_logloss: 0.128918 valid_1's auc: 0.834884 valid_1's binary_logloss: 0.139779
[59] training's auc: 0.8704 training's binary_logloss: 0.128698 valid_1's auc: 0.834998 valid_1's binary_logloss: 0.139649
[60] training's auc: 0.870787 training's binary_logloss: 0.128498 valid_1's auc: 0.835039 valid_1's binary_logloss: 0.139537
[61] training's auc: 0.871358 training's binary_logloss: 0.128295 valid_1's auc: 0.83528 valid_1's binary_logloss: 0.13941
[62] training's auc: 0.871561 training's binary_logloss: 0.1281 valid_1's auc: 0.835304 valid_1's binary_logloss: 0.139301
[63] training's auc: 0.872065 training's binary_logloss: 0.127897 valid_1's auc: 0.835507 valid_1's binary_logloss: 0.139192
[64] training's auc: 0.87255 training's binary_logloss: 0.127694 valid_1's auc: 0.835633 valid_1's binary_logloss: 0.139084
[65] training's auc: 0.873128 training's binary_logloss: 0.127509 valid_1's auc: 0.835849 valid_1's binary_logloss: 0.138982
[66] training's auc: 0.873811 training's binary_logloss: 0.127322 valid_1's auc: 0.836271 valid_1's binary_logloss: 0.138877
[67] training's auc: 0.874168 training's binary_logloss: 0.127142 valid_1's auc: 0.83649 valid_1's binary_logloss: 0.138782
[68] training's auc: 0.874511 training's binary_logloss: 0.126968 valid_1's auc: 0.836469 valid_1's binary_logloss: 0.138685
[69] training's auc: 0.874794 training's binary_logloss: 0.126802 valid_1's auc: 0.836519 valid_1's binary_logloss: 0.138601
[70] training's auc: 0.875148 training's binary_logloss: 0.126628 valid_1's auc: 0.836402 valid_1's binary_logloss: 0.138535
[71] training's auc: 0.87548 training's binary_logloss: 0.126461 valid_1's auc: 0.836434 valid_1's binary_logloss: 0.138451
[72] training's auc: 0.87565 training's binary_logloss: 0.126305 valid_1's auc: 0.836519 valid_1's binary_logloss: 0.138381
[73] training's auc: 0.875883 training's binary_logloss: 0.126151 valid_1's auc: 0.836375 valid_1's binary_logloss: 0.13831
[74] training's auc: 0.876259 training's binary_logloss: 0.126 valid_1's auc: 0.836531 valid_1's binary_logloss: 0.138232
[75] training's auc: 0.876685 training's binary_logloss: 0.125833 valid_1's auc: 0.8367 valid_1's binary_logloss: 0.138158
[76] training's auc: 0.877089 training's binary_logloss: 0.125656 valid_1's auc: 0.836729 valid_1's binary_logloss: 0.138105
[77] training's auc: 0.877318 training's binary_logloss: 0.125509 valid_1's auc: 0.836693 valid_1's binary_logloss: 0.138029
[78] training's auc: 0.877799 training's binary_logloss: 0.125347 valid_1's auc: 0.836719 valid_1's binary_logloss: 0.137964
[79] training's auc: 0.878132 training's binary_logloss: 0.125194 valid_1's auc: 0.83659 valid_1's binary_logloss: 0.137904
[80] training's auc: 0.87855 training's binary_logloss: 0.125039 valid_1's auc: 0.836555 valid_1's binary_logloss: 0.13785
[81] training's auc: 0.879258 training's binary_logloss: 0.124867 valid_1's auc: 0.836639 valid_1's binary_logloss: 0.137779
[82] training's auc: 0.879547 training's binary_logloss: 0.124709 valid_1's auc: 0.836588 valid_1's binary_logloss: 0.137736
[83] training's auc: 0.87979 training's binary_logloss: 0.124552 valid_1's auc: 0.836763 valid_1's binary_logloss: 0.137653
[84] training's auc: 0.880259 training's binary_logloss: 0.124401 valid_1's auc: 0.836729 valid_1's binary_logloss: 0.137599
[85] training's auc: 0.880808 training's binary_logloss: 0.124241 valid_1's auc: 0.836846 valid_1's binary_logloss: 0.137529
[86] training's auc: 0.881171 training's binary_logloss: 0.124091 valid_1's auc: 0.836909 valid_1's binary_logloss: 0.137486
[87] training's auc: 0.881571 training's binary_logloss: 0.12395 valid_1's auc: 0.836801 valid_1's binary_logloss: 0.137452
[88] training's auc: 0.881823 training's binary_logloss: 0.123804 valid_1's auc: 0.836841 valid_1's binary_logloss: 0.137396
[89] training's auc: 0.881997 training's binary_logloss: 0.123665 valid_1's auc: 0.836892 valid_1's binary_logloss: 0.137341
[90] training's auc: 0.882378 training's binary_logloss: 0.123527 valid_1's auc: 0.837013 valid_1's binary_logloss: 0.137278
[91] training's auc: 0.882729 training's binary_logloss: 0.123387 valid_1's auc: 0.836962 valid_1's binary_logloss: 0.137228
[92] training's auc: 0.883047 training's binary_logloss: 0.123243 valid_1's auc: 0.836959 valid_1's binary_logloss: 0.137194
[93] training's auc: 0.883395 training's binary_logloss: 0.123099 valid_1's auc: 0.836823 valid_1's binary_logloss: 0.137153
[94] training's auc: 0.883703 training's binary_logloss: 0.122965 valid_1's auc: 0.83683 valid_1's binary_logloss: 0.137116
[95] training's auc: 0.883999 training's binary_logloss: 0.122831 valid_1's auc: 0.836845 valid_1's binary_logloss: 0.137074
[96] training's auc: 0.884282 training's binary_logloss: 0.122699 valid_1's auc: 0.836855 valid_1's binary_logloss: 0.137046
[97] training's auc: 0.884585 training's binary_logloss: 0.122579 valid_1's auc: 0.836874 valid_1's binary_logloss: 0.137013
[98] training's auc: 0.885051 training's binary_logloss: 0.122437 valid_1's auc: 0.836829 valid_1's binary_logloss: 0.13699
[99] training's auc: 0.885387 training's binary_logloss: 0.122291 valid_1's auc: 0.836871 valid_1's binary_logloss: 0.136957
[100] training's auc: 0.885694 training's binary_logloss: 0.122158 valid_1's auc: 0.836953 valid_1's binary_logloss: 0.136918
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.885694 training's binary_logloss: 0.122158 valid_1's auc: 0.836953 valid_1's binary_logloss: 0.136918
[1] training's auc: 0.832524 training's binary_logloss: 0.151834 valid_1's auc: 0.810215 valid_1's binary_logloss: 0.154799
Training until validation scores don't improve for 30 rounds.
... (iterations 2–52 omitted: valid_1's auc peaks at round 22 and then declines while training's auc keeps rising, so early stopping fires 30 rounds later) ...
Early stopping, best iteration is:
[22] training's auc: 0.895005 training's binary_logloss: 0.118856 valid_1's auc: 0.830981 valid_1's binary_logloss: 0.135937
[1] training's auc: 0.828048 training's binary_logloss: 0.154412 valid_1's auc: 0.813248 valid_1's binary_logloss: 0.150478
Training until validation scores don't improve for 30 rounds.
... (iterations 2–57 omitted: valid_1's auc peaks at round 27; early stopping fires at round 57) ...
Early stopping, best iteration is:
[27] training's auc: 0.900021 training's binary_logloss: 0.118536 valid_1's auc: 0.835253 valid_1's binary_logloss: 0.130872
[1] training's auc: 0.828913 training's binary_logloss: 0.1513 valid_1's auc: 0.813322 valid_1's binary_logloss: 0.157425
Training until validation scores don't improve for 30 rounds.
... (iterations 2–47 omitted: valid_1's auc peaks at round 17; early stopping fires at round 47) ...
Early stopping, best iteration is:
[17] training's auc: 0.884836 training's binary_logloss: 0.121341 valid_1's auc: 0.837216 valid_1's binary_logloss: 0.13659
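Each fold above trains until the validation score fails to improve for 30 consecutive rounds ("Training until validation scores don't improve for 30 rounds"), then reports the best iteration. The stopping rule itself is simple to state; below is a minimal stdlib-only sketch of that logic (the function name and the toy loss values are illustrative, not part of the notebook):

```python
def early_stop_best(valid_losses, patience=30):
    """LightGBM-style early stopping: stop once `patience` consecutive
    rounds pass without the validation loss improving on its best value.
    Returns (best_iteration, best_loss); iterations are 1-indexed,
    matching the [n] markers in the training log."""
    best_iter, best_loss = 0, float("inf")
    for i, loss in enumerate(valid_losses, start=1):
        if loss < best_loss:
            best_iter, best_loss = i, loss      # new best: reset the clock
        elif i - best_iter >= patience:
            break                               # patience exhausted: stop
    return best_iter, best_loss

# Toy sequence: loss improves up to round 3, then plateaus.
print(early_stop_best([0.5, 0.4, 0.35, 0.36, 0.37, 0.38], patience=3))  # (3, 0.35)
```

This is why the fold ending at round 52 reports round 22 as best: the validation metric last improved there, and 30 fruitless rounds later training halts.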
[1] training's auc: 0.834155 training's binary_logloss: 0.157535 valid_1's auc: 0.809225 valid_1's binary_logloss: 0.159741
Training until validation scores don't improve for 30 rounds.
... (iterations 2–31 omitted) ...
[32] training's auc: 0.892233 training's binary_logloss: 0.120289 valid_1's auc: 0.832309 valid_1's binary_logloss: 0.135843
[33] training's auc: 0.893302 training's binary_logloss: 0.119875 valid_1's auc: 0.832313 valid_1's binary_logloss: 0.135789
[34] training's auc: 0.894184 training's binary_logloss: 0.119535 valid_1's auc: 0.832285 valid_1's binary_logloss: 0.135718
[35] training's auc: 0.895102 training's binary_logloss: 0.119154 valid_1's auc: 0.832395 valid_1's binary_logloss: 0.135699
[36] training's auc: 0.896732 training's binary_logloss: 0.118779 valid_1's auc: 0.83243 valid_1's binary_logloss: 0.135667
[37] training's auc: 0.897638 training's binary_logloss: 0.118406 valid_1's auc: 0.832078 valid_1's binary_logloss: 0.135693
[38] training's auc: 0.898482 training's binary_logloss: 0.118028 valid_1's auc: 0.831806 valid_1's binary_logloss: 0.135687
[39] training's auc: 0.899187 training's binary_logloss: 0.117702 valid_1's auc: 0.831788 valid_1's binary_logloss: 0.135704
[40] training's auc: 0.900175 training's binary_logloss: 0.117351 valid_1's auc: 0.831869 valid_1's binary_logloss: 0.135694
[41] training's auc: 0.901143 training's binary_logloss: 0.117041 valid_1's auc: 0.832279 valid_1's binary_logloss: 0.135649
[42] training's auc: 0.902041 training's binary_logloss: 0.116743 valid_1's auc: 0.832143 valid_1's binary_logloss: 0.135646
[43] training's auc: 0.902758 training's binary_logloss: 0.116441 valid_1's auc: 0.831881 valid_1's binary_logloss: 0.13569
[44] training's auc: 0.903347 training's binary_logloss: 0.116146 valid_1's auc: 0.832097 valid_1's binary_logloss: 0.135668
[45] training's auc: 0.903992 training's binary_logloss: 0.115866 valid_1's auc: 0.831696 valid_1's binary_logloss: 0.135711
[46] training's auc: 0.904754 training's binary_logloss: 0.11561 valid_1's auc: 0.831498 valid_1's binary_logloss: 0.135713
[47] training's auc: 0.905386 training's binary_logloss: 0.115318 valid_1's auc: 0.831491 valid_1's binary_logloss: 0.135709
[48] training's auc: 0.906117 training's binary_logloss: 0.115006 valid_1's auc: 0.831528 valid_1's binary_logloss: 0.135706
[49] training's auc: 0.906935 training's binary_logloss: 0.114682 valid_1's auc: 0.831174 valid_1's binary_logloss: 0.135774
[50] training's auc: 0.907701 training's binary_logloss: 0.114402 valid_1's auc: 0.83091 valid_1's binary_logloss: 0.135843
[51] training's auc: 0.908348 training's binary_logloss: 0.114118 valid_1's auc: 0.830852 valid_1's binary_logloss: 0.135824
[52] training's auc: 0.909119 training's binary_logloss: 0.113838 valid_1's auc: 0.830338 valid_1's binary_logloss: 0.135903
[53] training's auc: 0.909667 training's binary_logloss: 0.11357 valid_1's auc: 0.830304 valid_1's binary_logloss: 0.135966
[54] training's auc: 0.910304 training's binary_logloss: 0.113315 valid_1's auc: 0.83024 valid_1's binary_logloss: 0.135985
[55] training's auc: 0.910861 training's binary_logloss: 0.113066 valid_1's auc: 0.829635 valid_1's binary_logloss: 0.136077
[56] training's auc: 0.91163 training's binary_logloss: 0.112756 valid_1's auc: 0.829613 valid_1's binary_logloss: 0.136078
[57] training's auc: 0.912594 training's binary_logloss: 0.112525 valid_1's auc: 0.829864 valid_1's binary_logloss: 0.136064
[58] training's auc: 0.912997 training's binary_logloss: 0.112299 valid_1's auc: 0.829731 valid_1's binary_logloss: 0.136105
[59] training's auc: 0.91358 training's binary_logloss: 0.112107 valid_1's auc: 0.8296 valid_1's binary_logloss: 0.136134
[60] training's auc: 0.914432 training's binary_logloss: 0.111875 valid_1's auc: 0.829749 valid_1's binary_logloss: 0.136134
[61] training's auc: 0.915141 training's binary_logloss: 0.111621 valid_1's auc: 0.829699 valid_1's binary_logloss: 0.136152
[62] training's auc: 0.915543 training's binary_logloss: 0.111393 valid_1's auc: 0.829652 valid_1's binary_logloss: 0.13617
[63] training's auc: 0.916065 training's binary_logloss: 0.111139 valid_1's auc: 0.829285 valid_1's binary_logloss: 0.136273
[64] training's auc: 0.917277 training's binary_logloss: 0.110859 valid_1's auc: 0.829356 valid_1's binary_logloss: 0.136277
[65] training's auc: 0.917725 training's binary_logloss: 0.110708 valid_1's auc: 0.829319 valid_1's binary_logloss: 0.136294
[66] training's auc: 0.91815 training's binary_logloss: 0.110464 valid_1's auc: 0.829105 valid_1's binary_logloss: 0.136348
Early stopping, best iteration is:
[36] training's auc: 0.896732 training's binary_logloss: 0.118779 valid_1's auc: 0.83243 valid_1's binary_logloss: 0.135667
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.830321 training's binary_logloss: 0.160118 valid_1's auc: 0.816703 valid_1's binary_logloss: 0.155116
[2] training's auc: 0.838112 training's binary_logloss: 0.155185 valid_1's auc: 0.819589 valid_1's binary_logloss: 0.15114
[3] training's auc: 0.844501 training's binary_logloss: 0.151502 valid_1's auc: 0.826594 valid_1's binary_logloss: 0.14801
[4] training's auc: 0.850247 training's binary_logloss: 0.148442 valid_1's auc: 0.831345 valid_1's binary_logloss: 0.145592
[5] training's auc: 0.853379 training's binary_logloss: 0.145903 valid_1's auc: 0.832215 valid_1's binary_logloss: 0.143675
[6] training's auc: 0.855274 training's binary_logloss: 0.143788 valid_1's auc: 0.833619 valid_1's binary_logloss: 0.141977
[7] training's auc: 0.856474 training's binary_logloss: 0.14192 valid_1's auc: 0.833492 valid_1's binary_logloss: 0.140546
[8] training's auc: 0.860043 training's binary_logloss: 0.140211 valid_1's auc: 0.835566 valid_1's binary_logloss: 0.139366
[9] training's auc: 0.862219 training's binary_logloss: 0.138661 valid_1's auc: 0.83513 valid_1's binary_logloss: 0.138348
[10] training's auc: 0.864816 training's binary_logloss: 0.137301 valid_1's auc: 0.835264 valid_1's binary_logloss: 0.137397
[11] training's auc: 0.866544 training's binary_logloss: 0.136061 valid_1's auc: 0.834734 valid_1's binary_logloss: 0.13673
[12] training's auc: 0.867721 training's binary_logloss: 0.134912 valid_1's auc: 0.834759 valid_1's binary_logloss: 0.135997
[13] training's auc: 0.868838 training's binary_logloss: 0.133928 valid_1's auc: 0.834905 valid_1's binary_logloss: 0.13534
[14] training's auc: 0.869539 training's binary_logloss: 0.133055 valid_1's auc: 0.83531 valid_1's binary_logloss: 0.134747
[15] training's auc: 0.870822 training's binary_logloss: 0.132181 valid_1's auc: 0.835584 valid_1's binary_logloss: 0.134234
[16] training's auc: 0.87248 training's binary_logloss: 0.131317 valid_1's auc: 0.836268 valid_1's binary_logloss: 0.133765
[17] training's auc: 0.873858 training's binary_logloss: 0.130581 valid_1's auc: 0.836513 valid_1's binary_logloss: 0.133386
[18] training's auc: 0.87479 training's binary_logloss: 0.129893 valid_1's auc: 0.836428 valid_1's binary_logloss: 0.133048
[19] training's auc: 0.875715 training's binary_logloss: 0.129254 valid_1's auc: 0.836102 valid_1's binary_logloss: 0.132732
[20] training's auc: 0.876736 training's binary_logloss: 0.128603 valid_1's auc: 0.836375 valid_1's binary_logloss: 0.132465
[21] training's auc: 0.878103 training's binary_logloss: 0.127953 valid_1's auc: 0.835829 valid_1's binary_logloss: 0.132306
[22] training's auc: 0.878886 training's binary_logloss: 0.12739 valid_1's auc: 0.835794 valid_1's binary_logloss: 0.132093
[23] training's auc: 0.880407 training's binary_logloss: 0.1268 valid_1's auc: 0.836325 valid_1's binary_logloss: 0.131847
[24] training's auc: 0.881723 training's binary_logloss: 0.126251 valid_1's auc: 0.836025 valid_1's binary_logloss: 0.131715
[25] training's auc: 0.882895 training's binary_logloss: 0.12573 valid_1's auc: 0.83542 valid_1's binary_logloss: 0.131627
[26] training's auc: 0.884086 training's binary_logloss: 0.125226 valid_1's auc: 0.835825 valid_1's binary_logloss: 0.131431
[27] training's auc: 0.8852 training's binary_logloss: 0.124689 valid_1's auc: 0.835708 valid_1's binary_logloss: 0.131362
[28] training's auc: 0.886431 training's binary_logloss: 0.124188 valid_1's auc: 0.835493 valid_1's binary_logloss: 0.131286
[29] training's auc: 0.887554 training's binary_logloss: 0.123669 valid_1's auc: 0.835606 valid_1's binary_logloss: 0.131174
[30] training's auc: 0.888472 training's binary_logloss: 0.123218 valid_1's auc: 0.8353 valid_1's binary_logloss: 0.131128
[31] training's auc: 0.889253 training's binary_logloss: 0.122779 valid_1's auc: 0.835381 valid_1's binary_logloss: 0.131053
[32] training's auc: 0.890172 training's binary_logloss: 0.122376 valid_1's auc: 0.83571 valid_1's binary_logloss: 0.130988
[33] training's auc: 0.891123 training's binary_logloss: 0.121981 valid_1's auc: 0.835725 valid_1's binary_logloss: 0.130933
[34] training's auc: 0.891993 training's binary_logloss: 0.121583 valid_1's auc: 0.835694 valid_1's binary_logloss: 0.130882
[35] training's auc: 0.892804 training's binary_logloss: 0.121225 valid_1's auc: 0.836193 valid_1's binary_logloss: 0.130803
[36] training's auc: 0.893698 training's binary_logloss: 0.120868 valid_1's auc: 0.836218 valid_1's binary_logloss: 0.130756
[37] training's auc: 0.894587 training's binary_logloss: 0.120537 valid_1's auc: 0.836403 valid_1's binary_logloss: 0.130726
[38] training's auc: 0.895894 training's binary_logloss: 0.12017 valid_1's auc: 0.836668 valid_1's binary_logloss: 0.13066
[39] training's auc: 0.89676 training's binary_logloss: 0.119834 valid_1's auc: 0.836884 valid_1's binary_logloss: 0.130624
[40] training's auc: 0.897908 training's binary_logloss: 0.119485 valid_1's auc: 0.836758 valid_1's binary_logloss: 0.130628
[41] training's auc: 0.899031 training's binary_logloss: 0.119134 valid_1's auc: 0.836836 valid_1's binary_logloss: 0.130592
[42] training's auc: 0.899779 training's binary_logloss: 0.118832 valid_1's auc: 0.836842 valid_1's binary_logloss: 0.130555
[43] training's auc: 0.90061 training's binary_logloss: 0.118551 valid_1's auc: 0.837118 valid_1's binary_logloss: 0.130528
[44] training's auc: 0.901692 training's binary_logloss: 0.118231 valid_1's auc: 0.836978 valid_1's binary_logloss: 0.130503
[45] training's auc: 0.902873 training's binary_logloss: 0.117955 valid_1's auc: 0.836697 valid_1's binary_logloss: 0.130536
[46] training's auc: 0.903763 training's binary_logloss: 0.117624 valid_1's auc: 0.836779 valid_1's binary_logloss: 0.130507
[47] training's auc: 0.904385 training's binary_logloss: 0.117314 valid_1's auc: 0.836647 valid_1's binary_logloss: 0.13053
[48] training's auc: 0.905132 training's binary_logloss: 0.117054 valid_1's auc: 0.83667 valid_1's binary_logloss: 0.130519
[49] training's auc: 0.90582 training's binary_logloss: 0.116776 valid_1's auc: 0.836535 valid_1's binary_logloss: 0.130526
[50] training's auc: 0.906685 training's binary_logloss: 0.116515 valid_1's auc: 0.836645 valid_1's binary_logloss: 0.130511
[51] training's auc: 0.907393 training's binary_logloss: 0.116233 valid_1's auc: 0.836419 valid_1's binary_logloss: 0.130549
[52] training's auc: 0.908155 training's binary_logloss: 0.115906 valid_1's auc: 0.83682 valid_1's binary_logloss: 0.130513
[53] training's auc: 0.908978 training's binary_logloss: 0.115632 valid_1's auc: 0.836792 valid_1's binary_logloss: 0.130515
[54] training's auc: 0.910154 training's binary_logloss: 0.11528 valid_1's auc: 0.836589 valid_1's binary_logloss: 0.13052
[55] training's auc: 0.91065 training's binary_logloss: 0.115017 valid_1's auc: 0.83667 valid_1's binary_logloss: 0.130529
[56] training's auc: 0.911401 training's binary_logloss: 0.114787 valid_1's auc: 0.836543 valid_1's binary_logloss: 0.130542
[57] training's auc: 0.912152 training's binary_logloss: 0.114438 valid_1's auc: 0.836247 valid_1's binary_logloss: 0.13062
[58] training's auc: 0.912718 training's binary_logloss: 0.11415 valid_1's auc: 0.836422 valid_1's binary_logloss: 0.130604
[59] training's auc: 0.913288 training's binary_logloss: 0.113881 valid_1's auc: 0.836274 valid_1's binary_logloss: 0.130659
[60] training's auc: 0.913878 training's binary_logloss: 0.11362 valid_1's auc: 0.836298 valid_1's binary_logloss: 0.130661
[61] training's auc: 0.914514 training's binary_logloss: 0.113338 valid_1's auc: 0.836247 valid_1's binary_logloss: 0.130692
[62] training's auc: 0.915189 training's binary_logloss: 0.113082 valid_1's auc: 0.836054 valid_1's binary_logloss: 0.130728
[63] training's auc: 0.915948 training's binary_logloss: 0.112834 valid_1's auc: 0.836045 valid_1's binary_logloss: 0.130742
[64] training's auc: 0.916478 training's binary_logloss: 0.112559 valid_1's auc: 0.83592 valid_1's binary_logloss: 0.130768
[65] training's auc: 0.917249 training's binary_logloss: 0.112279 valid_1's auc: 0.835768 valid_1's binary_logloss: 0.130777
[66] training's auc: 0.91778 training's binary_logloss: 0.112064 valid_1's auc: 0.835708 valid_1's binary_logloss: 0.130778
[67] training's auc: 0.918292 training's binary_logloss: 0.111823 valid_1's auc: 0.835719 valid_1's binary_logloss: 0.130783
[68] training's auc: 0.918597 training's binary_logloss: 0.111627 valid_1's auc: 0.83548 valid_1's binary_logloss: 0.130838
[69] training's auc: 0.918969 training's binary_logloss: 0.111438 valid_1's auc: 0.835289 valid_1's binary_logloss: 0.130863
[70] training's auc: 0.919372 training's binary_logloss: 0.111215 valid_1's auc: 0.835147 valid_1's binary_logloss: 0.130894
[71] training's auc: 0.919686 training's binary_logloss: 0.111018 valid_1's auc: 0.834969 valid_1's binary_logloss: 0.130913
[72] training's auc: 0.92018 training's binary_logloss: 0.110773 valid_1's auc: 0.834828 valid_1's binary_logloss: 0.130957
[73] training's auc: 0.920605 training's binary_logloss: 0.110529 valid_1's auc: 0.834838 valid_1's binary_logloss: 0.130998
Early stopping, best iteration is:
[43] training's auc: 0.90061 training's binary_logloss: 0.118551 valid_1's auc: 0.837118 valid_1's binary_logloss: 0.130528
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.831449 training's binary_logloss: 0.156612 valid_1's auc: 0.812186 valid_1's binary_logloss: 0.162609
[2] training's auc: 0.838544 training's binary_logloss: 0.15198 valid_1's auc: 0.818631 valid_1's binary_logloss: 0.15834
[3] training's auc: 0.845962 training's binary_logloss: 0.148315 valid_1's auc: 0.826533 valid_1's binary_logloss: 0.154811
[4] training's auc: 0.847613 training's binary_logloss: 0.145358 valid_1's auc: 0.826752 valid_1's binary_logloss: 0.152238
[5] training's auc: 0.851244 training's binary_logloss: 0.14278 valid_1's auc: 0.826951 valid_1's binary_logloss: 0.150168
[6] training's auc: 0.853801 training's binary_logloss: 0.1406 valid_1's auc: 0.828619 valid_1's binary_logloss: 0.148366
[7] training's auc: 0.856072 training's binary_logloss: 0.138717 valid_1's auc: 0.829353 valid_1's binary_logloss: 0.146869
[8] training's auc: 0.858102 training's binary_logloss: 0.137004 valid_1's auc: 0.829894 valid_1's binary_logloss: 0.145609
[9] training's auc: 0.859304 training's binary_logloss: 0.135609 valid_1's auc: 0.830241 valid_1's binary_logloss: 0.144535
[10] training's auc: 0.860572 training's binary_logloss: 0.134246 valid_1's auc: 0.830829 valid_1's binary_logloss: 0.1436
[11] training's auc: 0.861543 training's binary_logloss: 0.13308 valid_1's auc: 0.83108 valid_1's binary_logloss: 0.14279
[12] training's auc: 0.863605 training's binary_logloss: 0.131996 valid_1's auc: 0.831866 valid_1's binary_logloss: 0.142051
[13] training's auc: 0.866796 training's binary_logloss: 0.131005 valid_1's auc: 0.833934 valid_1's binary_logloss: 0.141452
[14] training's auc: 0.867831 training's binary_logloss: 0.130151 valid_1's auc: 0.833984 valid_1's binary_logloss: 0.140874
[15] training's auc: 0.868812 training's binary_logloss: 0.129316 valid_1's auc: 0.834186 valid_1's binary_logloss: 0.140349
[16] training's auc: 0.870408 training's binary_logloss: 0.128513 valid_1's auc: 0.833926 valid_1's binary_logloss: 0.139925
[17] training's auc: 0.872207 training's binary_logloss: 0.127774 valid_1's auc: 0.833675 valid_1's binary_logloss: 0.139565
[18] training's auc: 0.874225 training's binary_logloss: 0.127072 valid_1's auc: 0.833749 valid_1's binary_logloss: 0.139243
[19] training's auc: 0.875987 training's binary_logloss: 0.126391 valid_1's auc: 0.834193 valid_1's binary_logloss: 0.138919
[20] training's auc: 0.877256 training's binary_logloss: 0.125776 valid_1's auc: 0.834358 valid_1's binary_logloss: 0.138635
[21] training's auc: 0.877916 training's binary_logloss: 0.125208 valid_1's auc: 0.834422 valid_1's binary_logloss: 0.138379
[22] training's auc: 0.879106 training's binary_logloss: 0.124598 valid_1's auc: 0.835397 valid_1's binary_logloss: 0.138081
[23] training's auc: 0.880253 training's binary_logloss: 0.12403 valid_1's auc: 0.835149 valid_1's binary_logloss: 0.137929
[24] training's auc: 0.881439 training's binary_logloss: 0.123504 valid_1's auc: 0.835204 valid_1's binary_logloss: 0.137706
[25] training's auc: 0.882928 training's binary_logloss: 0.122993 valid_1's auc: 0.834476 valid_1's binary_logloss: 0.137609
[26] training's auc: 0.884344 training's binary_logloss: 0.122475 valid_1's auc: 0.83465 valid_1's binary_logloss: 0.137477
[27] training's auc: 0.88567 training's binary_logloss: 0.122011 valid_1's auc: 0.834629 valid_1's binary_logloss: 0.137347
[28] training's auc: 0.886836 training's binary_logloss: 0.121522 valid_1's auc: 0.834662 valid_1's binary_logloss: 0.137262
[29] training's auc: 0.887996 training's binary_logloss: 0.121067 valid_1's auc: 0.834568 valid_1's binary_logloss: 0.137184
[30] training's auc: 0.88917 training's binary_logloss: 0.120636 valid_1's auc: 0.834537 valid_1's binary_logloss: 0.137136
[31] training's auc: 0.890518 training's binary_logloss: 0.120186 valid_1's auc: 0.834577 valid_1's binary_logloss: 0.137043
[32] training's auc: 0.891685 training's binary_logloss: 0.119767 valid_1's auc: 0.834491 valid_1's binary_logloss: 0.136929
[33] training's auc: 0.89238 training's binary_logloss: 0.119375 valid_1's auc: 0.834389 valid_1's binary_logloss: 0.136899
[34] training's auc: 0.893423 training's binary_logloss: 0.118966 valid_1's auc: 0.834363 valid_1's binary_logloss: 0.136853
[35] training's auc: 0.89453 training's binary_logloss: 0.118578 valid_1's auc: 0.834314 valid_1's binary_logloss: 0.136834
[36] training's auc: 0.895678 training's binary_logloss: 0.118174 valid_1's auc: 0.833647 valid_1's binary_logloss: 0.136873
[37] training's auc: 0.896658 training's binary_logloss: 0.117818 valid_1's auc: 0.833576 valid_1's binary_logloss: 0.136853
[38] training's auc: 0.897511 training's binary_logloss: 0.117456 valid_1's auc: 0.833379 valid_1's binary_logloss: 0.136879
[39] training's auc: 0.898513 training's binary_logloss: 0.117084 valid_1's auc: 0.833757 valid_1's binary_logloss: 0.136831
[40] training's auc: 0.899563 training's binary_logloss: 0.116742 valid_1's auc: 0.833841 valid_1's binary_logloss: 0.136805
[41] training's auc: 0.900523 training's binary_logloss: 0.116382 valid_1's auc: 0.833902 valid_1's binary_logloss: 0.136797
[42] training's auc: 0.901757 training's binary_logloss: 0.116039 valid_1's auc: 0.833718 valid_1's binary_logloss: 0.136807
[43] training's auc: 0.902504 training's binary_logloss: 0.115729 valid_1's auc: 0.833766 valid_1's binary_logloss: 0.136759
[44] training's auc: 0.903397 training's binary_logloss: 0.115411 valid_1's auc: 0.833738 valid_1's binary_logloss: 0.136743
[45] training's auc: 0.90415 training's binary_logloss: 0.115085 valid_1's auc: 0.833555 valid_1's binary_logloss: 0.136762
[46] training's auc: 0.905083 training's binary_logloss: 0.114801 valid_1's auc: 0.833766 valid_1's binary_logloss: 0.136752
[47] training's auc: 0.905951 training's binary_logloss: 0.114525 valid_1's auc: 0.833694 valid_1's binary_logloss: 0.136774
[48] training's auc: 0.906707 training's binary_logloss: 0.11419 valid_1's auc: 0.833433 valid_1's binary_logloss: 0.136836
[49] training's auc: 0.907162 training's binary_logloss: 0.113932 valid_1's auc: 0.833518 valid_1's binary_logloss: 0.136822
[50] training's auc: 0.90782 training's binary_logloss: 0.11367 valid_1's auc: 0.833616 valid_1's binary_logloss: 0.136821
[51] training's auc: 0.908567 training's binary_logloss: 0.113369 valid_1's auc: 0.833593 valid_1's binary_logloss: 0.136836
[52] training's auc: 0.909113 training's binary_logloss: 0.113103 valid_1's auc: 0.833387 valid_1's binary_logloss: 0.136905
Early stopping, best iteration is:
[22] training's auc: 0.879106 training's binary_logloss: 0.124598 valid_1's auc: 0.835397 valid_1's binary_logloss: 0.138081
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.840935 training's binary_logloss: 0.159881 valid_1's auc: 0.804742 valid_1's binary_logloss: 0.161628
[2] training's auc: 0.847957 training's binary_logloss: 0.156222 valid_1's auc: 0.810282 valid_1's binary_logloss: 0.158673
[3] training's auc: 0.853174 training's binary_logloss: 0.153239 valid_1's auc: 0.815485 valid_1's binary_logloss: 0.156283
[4] training's auc: 0.856652 training's binary_logloss: 0.150625 valid_1's auc: 0.817624 valid_1's binary_logloss: 0.154204
[5] training's auc: 0.857339 training's binary_logloss: 0.148434 valid_1's auc: 0.817866 valid_1's binary_logloss: 0.152485
[6] training's auc: 0.859814 training's binary_logloss: 0.146458 valid_1's auc: 0.820682 valid_1's binary_logloss: 0.150961
[7] training's auc: 0.861776 training's binary_logloss: 0.144671 valid_1's auc: 0.82228 valid_1's binary_logloss: 0.149573
[8] training's auc: 0.863423 training's binary_logloss: 0.143031 valid_1's auc: 0.82382 valid_1's binary_logloss: 0.148364
[9] training's auc: 0.864733 training's binary_logloss: 0.141485 valid_1's auc: 0.824847 valid_1's binary_logloss: 0.147309
[10] training's auc: 0.865481 training's binary_logloss: 0.140102 valid_1's auc: 0.824906 valid_1's binary_logloss: 0.146407
[11] training's auc: 0.867742 training's binary_logloss: 0.138858 valid_1's auc: 0.826195 valid_1's binary_logloss: 0.145489
[12] training's auc: 0.86866 training's binary_logloss: 0.137709 valid_1's auc: 0.827161 valid_1's binary_logloss: 0.144616
[13] training's auc: 0.869626 training's binary_logloss: 0.136617 valid_1's auc: 0.827478 valid_1's binary_logloss: 0.14384
[14] training's auc: 0.870417 training's binary_logloss: 0.135593 valid_1's auc: 0.827942 valid_1's binary_logloss: 0.143205
[15] training's auc: 0.872478 training's binary_logloss: 0.13468 valid_1's auc: 0.829017 valid_1's binary_logloss: 0.142546
[16] training's auc: 0.873608 training's binary_logloss: 0.133793 valid_1's auc: 0.829375 valid_1's binary_logloss: 0.141923
[17] training's auc: 0.87458 training's binary_logloss: 0.132938 valid_1's auc: 0.829441 valid_1's binary_logloss: 0.141405
[18] training's auc: 0.875345 training's binary_logloss: 0.132145 valid_1's auc: 0.829657 valid_1's binary_logloss: 0.140927
[19] training's auc: 0.876353 training's binary_logloss: 0.131391 valid_1's auc: 0.829835 valid_1's binary_logloss: 0.140485
[20] training's auc: 0.877432 training's binary_logloss: 0.130679 valid_1's auc: 0.829853 valid_1's binary_logloss: 0.140103
[21] training's auc: 0.879221 training's binary_logloss: 0.129997 valid_1's auc: 0.830168 valid_1's binary_logloss: 0.139711
[22] training's auc: 0.880374 training's binary_logloss: 0.129358 valid_1's auc: 0.830206 valid_1's binary_logloss: 0.139386
[23] training's auc: 0.88133 training's binary_logloss: 0.128719 valid_1's auc: 0.830319 valid_1's binary_logloss: 0.139085
[24] training's auc: 0.882337 training's binary_logloss: 0.128093 valid_1's auc: 0.830658 valid_1's binary_logloss: 0.138796
[25] training's auc: 0.882987 training's binary_logloss: 0.127518 valid_1's auc: 0.830922 valid_1's binary_logloss: 0.138506
[26] training's auc: 0.884003 training's binary_logloss: 0.12697 valid_1's auc: 0.83139 valid_1's binary_logloss: 0.138261
[27] training's auc: 0.885043 training's binary_logloss: 0.126446 valid_1's auc: 0.831439 valid_1's binary_logloss: 0.138021
[28] training's auc: 0.885579 training's binary_logloss: 0.125972 valid_1's auc: 0.831555 valid_1's binary_logloss: 0.13781
[29] training's auc: 0.88654 training's binary_logloss: 0.125457 valid_1's auc: 0.831881 valid_1's binary_logloss: 0.137581
[30] training's auc: 0.887563 training's binary_logloss: 0.124981 valid_1's auc: 0.832029 valid_1's binary_logloss: 0.137446
[31] training's auc: 0.888663 training's binary_logloss: 0.124516 valid_1's auc: 0.832112 valid_1's binary_logloss: 0.137286
[32] training's auc: 0.889311 training's binary_logloss: 0.124099 valid_1's auc: 0.832046 valid_1's binary_logloss: 0.137143
[33] training's auc: 0.890452 training's binary_logloss: 0.123644 valid_1's auc: 0.831988 valid_1's binary_logloss: 0.137022
[34] training's auc: 0.891148 training's binary_logloss: 0.12321 valid_1's auc: 0.832061 valid_1's binary_logloss: 0.136889
[35] training's auc: 0.892061 training's binary_logloss: 0.122778 valid_1's auc: 0.831849 valid_1's binary_logloss: 0.136784
[36] training's auc: 0.892755 training's binary_logloss: 0.122376 valid_1's auc: 0.832087 valid_1's binary_logloss: 0.136654
[37] training's auc: 0.893527 training's binary_logloss: 0.121986 valid_1's auc: 0.832777 valid_1's binary_logloss: 0.136553
[38] training's auc: 0.894283 training's binary_logloss: 0.1216 valid_1's auc: 0.832396 valid_1's binary_logloss: 0.136457
[39] training's auc: 0.894848 training's binary_logloss: 0.121224 valid_1's auc: 0.832374 valid_1's binary_logloss: 0.136388
[40] training's auc: 0.895351 training's binary_logloss: 0.120861 valid_1's auc: 0.832183 valid_1's binary_logloss: 0.136345
[41] training's auc: 0.896179 training's binary_logloss: 0.120497 valid_1's auc: 0.832469 valid_1's binary_logloss: 0.136266
[42] training's auc: 0.89709 training's binary_logloss: 0.120116 valid_1's auc: 0.832358 valid_1's binary_logloss: 0.136239
[43] training's auc: 0.897875 training's binary_logloss: 0.119793 valid_1's auc: 0.832427 valid_1's binary_logloss: 0.136185
[44] training's auc: 0.898488 training's binary_logloss: 0.119453 valid_1's auc: 0.832704 valid_1's binary_logloss: 0.136099
[45] training's auc: 0.899305 training's binary_logloss: 0.11909 valid_1's auc: 0.832725 valid_1's binary_logloss: 0.136019
[46] training's auc: 0.899898 training's binary_logloss: 0.118766 valid_1's auc: 0.832877 valid_1's binary_logloss: 0.135943
[47] training's auc: 0.900641 training's binary_logloss: 0.118432 valid_1's auc: 0.832883 valid_1's binary_logloss: 0.135893
[48] training's auc: 0.901679 training's binary_logloss: 0.1181 valid_1's auc: 0.833059 valid_1's binary_logloss: 0.135842
[49] training's auc: 0.902337 training's binary_logloss: 0.117776 valid_1's auc: 0.832864 valid_1's binary_logloss: 0.135839
[50] training's auc: 0.903069 training's binary_logloss: 0.117434 valid_1's auc: 0.832714 valid_1's binary_logloss: 0.135847
[51] training's auc: 0.903651 training's binary_logloss: 0.117147 valid_1's auc: 0.832853 valid_1's binary_logloss: 0.135809
[52] training's auc: 0.904298 training's binary_logloss: 0.116815 valid_1's auc: 0.8327 valid_1's binary_logloss: 0.135785
[53] training's auc: 0.905031 training's binary_logloss: 0.116514 valid_1's auc: 0.832519 valid_1's binary_logloss: 0.135799
[54] training's auc: 0.905667 training's binary_logloss: 0.116218 valid_1's auc: 0.832352 valid_1's binary_logloss: 0.135785
[55] training's auc: 0.906371 training's binary_logloss: 0.11592 valid_1's auc: 0.832385 valid_1's binary_logloss: 0.135792
[56] training's auc: 0.907118 training's binary_logloss: 0.115619 valid_1's auc: 0.832121 valid_1's binary_logloss: 0.135802
[57] training's auc: 0.907789 training's binary_logloss: 0.115363 valid_1's auc: 0.832222 valid_1's binary_logloss: 0.135785
[58] training's auc: 0.908425 training's binary_logloss: 0.115084 valid_1's auc: 0.832036 valid_1's binary_logloss: 0.135787
[59] training's auc: 0.90904 training's binary_logloss: 0.114803 valid_1's auc: 0.832043 valid_1's binary_logloss: 0.13578
... (iterations [60]-[78] omitted; valid_1's auc plateaus near 0.831 while training's auc keeps rising)
Early stopping, best iteration is:
[48] training's auc: 0.901679 training's binary_logloss: 0.1181 valid_1's auc: 0.833059 valid_1's binary_logloss: 0.135842
[1] training's auc: 0.837949 training's binary_logloss: 0.162283 valid_1's auc: 0.815351 valid_1's binary_logloss: 0.157047
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[53] omitted; valid_1's auc peaks around iteration 23, then stops improving)
Early stopping, best iteration is:
[23] training's auc: 0.879031 training's binary_logloss: 0.130622 valid_1's auc: 0.834257 valid_1's binary_logloss: 0.134682
[1] training's auc: 0.838432 training's binary_logloss: 0.158731 valid_1's auc: 0.815042 valid_1's binary_logloss: 0.164692
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[61] omitted; valid_1's auc peaks around iteration 31, then stops improving)
Early stopping, best iteration is:
[31] training's auc: 0.887161 training's binary_logloss: 0.123919 valid_1's auc: 0.834973 valid_1's binary_logloss: 0.138672
[1] training's auc: 0.836361 training's binary_logloss: 0.156044 valid_1's auc: 0.811205 valid_1's binary_logloss: 0.158509
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[62] omitted; valid_1's auc peaks around iteration 32, then stops improving)
Early stopping, best iteration is:
[32] training's auc: 0.900917 training's binary_logloss: 0.117034 valid_1's auc: 0.831936 valid_1's binary_logloss: 0.135557
[1] training's auc: 0.830903 training's binary_logloss: 0.158593 valid_1's auc: 0.81555 valid_1's binary_logloss: 0.154031
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[75] omitted; valid_1's auc peaks around iteration 45, then stops improving)
Early stopping, best iteration is:
[45] training's auc: 0.912371 training's binary_logloss: 0.114089 valid_1's auc: 0.835251 valid_1's binary_logloss: 0.1307
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.83399 training's binary_logloss: 0.155243 valid_1's auc: 0.814273 valid_1's binary_logloss: 0.161398
...(iterations [2]-[45] omitted)...
Early stopping, best iteration is:
[15] training's auc: 0.876011 training's binary_logloss: 0.126554 valid_1's auc: 0.834184 valid_1's binary_logloss: 0.13926
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.839482 training's binary_logloss: 0.150359 valid_1's auc: 0.806909 valid_1's binary_logloss: 0.153903
...(iterations [2]-[49] omitted)...
Early stopping, best iteration is:
[19] training's auc: 0.905905 training's binary_logloss: 0.115484 valid_1's auc: 0.829731 valid_1's binary_logloss: 0.136181
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.83614 training's binary_logloss: 0.152969 valid_1's auc: 0.815307 valid_1's binary_logloss: 0.149701
...(iterations [2]-[46] omitted)...
Early stopping, best iteration is:
[16] training's auc: 0.896191 training's binary_logloss: 0.119865 valid_1's auc: 0.83086 valid_1's binary_logloss: 0.132031
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.837618 training's binary_logloss: 0.149962 valid_1's auc: 0.816805 valid_1's binary_logloss: 0.156653
...(iterations [2]-[43] omitted)...
Early stopping, best iteration is:
[13] training's auc: 0.890544 training's binary_logloss: 0.120179 valid_1's auc: 0.837075 valid_1's binary_logloss: 0.136885
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.841517 training's binary_logloss: 0.151937 valid_1's auc: 0.807185 valid_1's binary_logloss: 0.155633
...(iterations [2]-[41] omitted)...
Early stopping, best iteration is:
[11] training's auc: 0.888333 training's binary_logloss: 0.123721 valid_1's auc: 0.828478 valid_1's binary_logloss: 0.138041
Training until validation scores don't improve for 30 rounds.
[1] training's auc: 0.838975 training's binary_logloss: 0.1545 valid_1's auc: 0.817259 valid_1's binary_logloss: 0.151124
...(iterations [2]-[38] omitted)...
[39] training's auc: 0.933016 training's binary_logloss: 0.104661 valid_1's auc: 0.829208 valid_1's binary_logloss: 0.13249
[40] training's auc: 0.93386 training's binary_logloss: 0.104254 valid_1's auc: 0.82983 valid_1's binary_logloss: 0.132421
[41] training's auc: 0.934984 training's binary_logloss: 0.103672 valid_1's auc: 0.829733 valid_1's binary_logloss: 0.132409
[42] training's auc: 0.93552 training's binary_logloss: 0.103256 valid_1's auc: 0.829399 valid_1's binary_logloss: 0.13251
[43] training's auc: 0.936148 training's binary_logloss: 0.102822 valid_1's auc: 0.8294 valid_1's binary_logloss: 0.132554
[44] training's auc: 0.936704 training's binary_logloss: 0.102369 valid_1's auc: 0.829215 valid_1's binary_logloss: 0.132571
[45] training's auc: 0.937361 training's binary_logloss: 0.101943 valid_1's auc: 0.828823 valid_1's binary_logloss: 0.132704
[46] training's auc: 0.938701 training's binary_logloss: 0.101464 valid_1's auc: 0.829437 valid_1's binary_logloss: 0.132666
[47] training's auc: 0.939708 training's binary_logloss: 0.100961 valid_1's auc: 0.82908 valid_1's binary_logloss: 0.132726
[48] training's auc: 0.940695 training's binary_logloss: 0.100516 valid_1's auc: 0.828798 valid_1's binary_logloss: 0.132847
[49] training's auc: 0.94168 training's binary_logloss: 0.100148 valid_1's auc: 0.828728 valid_1's binary_logloss: 0.132909
[50] training's auc: 0.94199 training's binary_logloss: 0.0998415 valid_1's auc: 0.82837 valid_1's binary_logloss: 0.132982
[51] training's auc: 0.942309 training's binary_logloss: 0.0995556 valid_1's auc: 0.82803 valid_1's binary_logloss: 0.133091
[52] training's auc: 0.942678 training's binary_logloss: 0.0991938 valid_1's auc: 0.828449 valid_1's binary_logloss: 0.133078
[53] training's auc: 0.942948 training's binary_logloss: 0.0989014 valid_1's auc: 0.827987 valid_1's binary_logloss: 0.133213
[54] training's auc: 0.94366 training's binary_logloss: 0.0984897 valid_1's auc: 0.828089 valid_1's binary_logloss: 0.133259
Early stopping, best iteration is:
[24] training's auc: 0.914679 training's binary_logloss: 0.113261 valid_1's auc: 0.833451 valid_1's binary_logloss: 0.131346
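The log above shows LightGBM's early-stopping rule in action: training halts once the validation metric fails to improve for 30 consecutive rounds, and the best iteration (here round 24, stopped at round 54) is reported. As a minimal sketch of that rule — a hypothetical helper in plain Python, not LightGBM's actual implementation:

```python
def best_iteration(val_losses, patience=30):
    """Return (best_iter, stopped_at) under an early-stopping rule:
    stop once `patience` rounds pass without a new minimum loss.
    Iterations are 1-indexed, matching the log output above."""
    best_iter, best_loss = 1, float("inf")
    for i, loss in enumerate(val_losses, start=1):
        if loss < best_loss:
            best_iter, best_loss = i, loss
        elif i - best_iter >= patience:
            return best_iter, i  # early stop triggered
    return best_iter, len(val_losses)  # ran out of boosting rounds

# Toy validation-loss curve: improves until round 3, then plateaus.
losses = [0.50, 0.40, 0.35] + [0.36] * 40
print(best_iteration(losses, patience=30))  # -> (3, 33)
```

Note how the gap between the best iteration and the stopping iteration in the log is exactly 30 rounds (24 → 54), matching the patience printed in the "Training until validation scores don't improve for 30 rounds" message.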
[1] training's auc: 0.839882 training's binary_logloss: 0.151382 valid_1's auc: 0.81627 valid_1's binary_logloss: 0.158355
Training until validation scores don't improve for 30 rounds.
... (intermediate iterations omitted) ...
Early stopping, best iteration is:
[16] training's auc: 0.898518 training's binary_logloss: 0.117607 valid_1's auc: 0.832676 valid_1's binary_logloss: 0.137812
[1] training's auc: 0.832348 training's binary_logloss: 0.15501 valid_1's auc: 0.810127 valid_1's binary_logloss: 0.157321
Training until validation scores don't improve for 30 rounds.
... (intermediate iterations omitted) ...
Early stopping, best iteration is:
[31] training's auc: 0.893557 training's binary_logloss: 0.119229 valid_1's auc: 0.832593 valid_1's binary_logloss: 0.135581
[1] training's auc: 0.827625 training's binary_logloss: 0.157555 valid_1's auc: 0.813696 valid_1's binary_logloss: 0.152935
Training until validation scores don't improve for 30 rounds.
... (intermediate iterations omitted) ...
Early stopping, best iteration is:
[33] training's auc: 0.894775 training's binary_logloss: 0.120201 valid_1's auc: 0.83704 valid_1's binary_logloss: 0.130526
[1] training's auc: 0.826983 training's binary_logloss: 0.154221 valid_1's auc: 0.810415 valid_1's binary_logloss: 0.160113
Training until validation scores don't improve for 30 rounds.
... (intermediate iterations omitted) ...
Early stopping, best iteration is:
[16] training's auc: 0.872041 training's binary_logloss: 0.126156 valid_1's auc: 0.835208 valid_1's binary_logloss: 0.137768
[1] training's auc: 0.83385 training's binary_logloss: 0.153326 valid_1's auc: 0.809808 valid_1's binary_logloss: 0.156227
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[54] omitted; valid_1 AUC peaked near 0.8336 around round 24, then declined) ...
Early stopping, best iteration is:
[24] training's auc: 0.897766 training's binary_logloss: 0.117736 valid_1's auc: 0.833552 valid_1's binary_logloss: 0.135422
[1] training's auc: 0.827467 training's binary_logloss: 0.156101 valid_1's auc: 0.814208 valid_1's binary_logloss: 0.151837
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[56] omitted; valid_1 AUC peaked near 0.8362 around round 26, then declined) ...
Early stopping, best iteration is:
[26] training's auc: 0.898052 training's binary_logloss: 0.118955 valid_1's auc: 0.836173 valid_1's binary_logloss: 0.130842
[1] training's auc: 0.828442 training's binary_logloss: 0.152631 valid_1's auc: 0.806571 valid_1's binary_logloss: 0.159167
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[52] omitted; valid_1 AUC peaked near 0.8348 around round 22, then declined) ...
Early stopping, best iteration is:
[22] training's auc: 0.895377 training's binary_logloss: 0.118301 valid_1's auc: 0.83481 valid_1's binary_logloss: 0.136894
[1] training's auc: 0.835373 training's binary_logloss: 0.149265 valid_1's auc: 0.810613 valid_1's binary_logloss: 0.153309
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[38] omitted; valid_1 AUC peaked near 0.8293 around round 8, then declined) ...
Early stopping, best iteration is:
[8] training's auc: 0.878219 training's binary_logloss: 0.12618 valid_1's auc: 0.829255 valid_1's binary_logloss: 0.138031
[1] training's auc: 0.830551 training's binary_logloss: 0.152159 valid_1's auc: 0.817535 valid_1's binary_logloss: 0.148877
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[35] omitted; valid_1 AUC peaked near 0.8369 around round 5, then declined) ...
Early stopping, best iteration is:
[5] training's auc: 0.866853 training's binary_logloss: 0.134301 valid_1's auc: 0.836942 valid_1's binary_logloss: 0.135979
[1] training's auc: 0.83168 training's binary_logloss: 0.148845 valid_1's auc: 0.806353 valid_1's binary_logloss: 0.156328
Training until validation scores don't improve for 30 rounds.
... (iterations [2]-[30] omitted; valid_1 AUC peaked near 0.8314 around round 13) ...
[31] training's auc: 0.924571 training's binary_logloss: 0.105493 valid_1's auc: 0.828059 valid_1's binary_logloss: 0.138576
[32] training's auc: 0.925292 training's binary_logloss: 0.104955 valid_1's auc: 0.827748 valid_1's binary_logloss: 0.138647
[33] training's auc: 0.926859 training's binary_logloss: 0.104294 valid_1's auc: 0.827443 valid_1's binary_logloss: 0.138772
[34] training's auc: 0.927957 training's binary_logloss: 0.103827 valid_1's auc: 0.827791 valid_1's binary_logloss: 0.138763
[35] training's auc: 0.928971 training's binary_logloss: 0.103341 valid_1's auc: 0.82768 valid_1's binary_logloss: 0.138884
[36] training's auc: 0.929773 training's binary_logloss: 0.102815 valid_1's auc: 0.827573 valid_1's binary_logloss: 0.138948
[37] training's auc: 0.931022 training's binary_logloss: 0.102405 valid_1's auc: 0.827856 valid_1's binary_logloss: 0.139031
[38] training's auc: 0.931841 training's binary_logloss: 0.101905 valid_1's auc: 0.827528 valid_1's binary_logloss: 0.139196
[39] training's auc: 0.933416 training's binary_logloss: 0.101179 valid_1's auc: 0.826624 valid_1's binary_logloss: 0.13939
[40] training's auc: 0.934501 training's binary_logloss: 0.100795 valid_1's auc: 0.827258 valid_1's binary_logloss: 0.139357
[41] training's auc: 0.935485 training's binary_logloss: 0.100462 valid_1's auc: 0.827649 valid_1's binary_logloss: 0.139385
[42] training's auc: 0.936098 training's binary_logloss: 0.100007 valid_1's auc: 0.827137 valid_1's binary_logloss: 0.139482
[43] training's auc: 0.936603 training's binary_logloss: 0.0997109 valid_1's auc: 0.827312 valid_1's binary_logloss: 0.13949
Early stopping, best iteration is:
[13] training's auc: 0.892384 training's binary_logloss: 0.118993 valid_1's auc: 0.831433 valid_1's binary_logloss: 0.137848
[1] training's auc: 0.832598 training's binary_logloss: 0.159372 valid_1's auc: 0.810237 valid_1's binary_logloss: 0.161043
Training until validation scores don't improve for 30 rounds.
[... iterations 2–83 omitted ...]
Early stopping, best iteration is:
[53] training's auc: 0.893119 training's binary_logloss: 0.119771 valid_1's auc: 0.832482 valid_1's binary_logloss: 0.135531
[1] training's auc: 0.828087 training's binary_logloss: 0.161843 valid_1's auc: 0.813101 valid_1's binary_logloss: 0.156528
Training until validation scores don't improve for 30 rounds.
[... iterations 2–45 omitted ...]
Early stopping, best iteration is:
[15] training's auc: 0.861585 training's binary_logloss: 0.136809 valid_1's auc: 0.835968 valid_1's binary_logloss: 0.136426
[1] training's auc: 0.827289 training's binary_logloss: 0.158285 valid_1's auc: 0.809418 valid_1's binary_logloss: 0.163993
Training until validation scores don't improve for 30 rounds.
[... iterations 2–60 omitted ...]
Early stopping, best iteration is:
[30] training's auc: 0.875044 training's binary_logloss: 0.125893 valid_1's auc: 0.837415 valid_1's binary_logloss: 0.137737
[1] training's auc: 0.833782 training's binary_logloss: 0.160679 valid_1's auc: 0.808274 valid_1's binary_logloss: 0.162279
Training until validation scores don't improve for 30 rounds.
[... iterations 2–49 omitted; log cuts off before early stopping ...]
[50] training's auc: 0.886537 training's binary_logloss: 0.122657 valid_1's auc: 0.830417 valid_1's binary_logloss: 0.136383
[51] training's auc: 0.887095 training's binary_logloss: 0.122406 valid_1's auc: 0.830621 valid_1's binary_logloss: 0.13632
[52] training's auc: 0.887557 training's binary_logloss: 0.122155 valid_1's auc: 0.830853 valid_1's binary_logloss: 0.136246
[53] training's auc: 0.888005 training's binary_logloss: 0.121904 valid_1's auc: 0.830709 valid_1's binary_logloss: 0.136229
[54] training's auc: 0.888392 training's binary_logloss: 0.121675 valid_1's auc: 0.830791 valid_1's binary_logloss: 0.136166
[55] training's auc: 0.888746 training's binary_logloss: 0.121462 valid_1's auc: 0.830954 valid_1's binary_logloss: 0.136113
[56] training's auc: 0.889775 training's binary_logloss: 0.121222 valid_1's auc: 0.831128 valid_1's binary_logloss: 0.136081
[57] training's auc: 0.890281 training's binary_logloss: 0.12099 valid_1's auc: 0.830823 valid_1's binary_logloss: 0.136087
[58] training's auc: 0.891005 training's binary_logloss: 0.120766 valid_1's auc: 0.830802 valid_1's binary_logloss: 0.136053
[59] training's auc: 0.891355 training's binary_logloss: 0.120587 valid_1's auc: 0.830925 valid_1's binary_logloss: 0.136005
[60] training's auc: 0.891881 training's binary_logloss: 0.120378 valid_1's auc: 0.831011 valid_1's binary_logloss: 0.13596
[61] training's auc: 0.89265 training's binary_logloss: 0.120135 valid_1's auc: 0.831346 valid_1's binary_logloss: 0.135905
[62] training's auc: 0.893225 training's binary_logloss: 0.119921 valid_1's auc: 0.831248 valid_1's binary_logloss: 0.13588
[63] training's auc: 0.89401 training's binary_logloss: 0.119701 valid_1's auc: 0.831534 valid_1's binary_logloss: 0.135835
[64] training's auc: 0.894392 training's binary_logloss: 0.119515 valid_1's auc: 0.83149 valid_1's binary_logloss: 0.135798
[65] training's auc: 0.894931 training's binary_logloss: 0.119298 valid_1's auc: 0.831492 valid_1's binary_logloss: 0.135777
[66] training's auc: 0.895438 training's binary_logloss: 0.119134 valid_1's auc: 0.831456 valid_1's binary_logloss: 0.135751
[67] training's auc: 0.896079 training's binary_logloss: 0.118943 valid_1's auc: 0.831618 valid_1's binary_logloss: 0.135699
[68] training's auc: 0.896521 training's binary_logloss: 0.118772 valid_1's auc: 0.831821 valid_1's binary_logloss: 0.135665
[69] training's auc: 0.897112 training's binary_logloss: 0.118587 valid_1's auc: 0.831738 valid_1's binary_logloss: 0.135678
[70] training's auc: 0.897663 training's binary_logloss: 0.118384 valid_1's auc: 0.831831 valid_1's binary_logloss: 0.135653
[71] training's auc: 0.898112 training's binary_logloss: 0.118191 valid_1's auc: 0.831669 valid_1's binary_logloss: 0.135659
[72] training's auc: 0.898652 training's binary_logloss: 0.118007 valid_1's auc: 0.83172 valid_1's binary_logloss: 0.135646
[73] training's auc: 0.899222 training's binary_logloss: 0.1178 valid_1's auc: 0.831726 valid_1's binary_logloss: 0.135644
[74] training's auc: 0.899644 training's binary_logloss: 0.117624 valid_1's auc: 0.83173 valid_1's binary_logloss: 0.135633
[75] training's auc: 0.900121 training's binary_logloss: 0.11743 valid_1's auc: 0.831548 valid_1's binary_logloss: 0.135668
[76] training's auc: 0.900786 training's binary_logloss: 0.117229 valid_1's auc: 0.83195 valid_1's binary_logloss: 0.135593
[77] training's auc: 0.901286 training's binary_logloss: 0.117069 valid_1's auc: 0.831911 valid_1's binary_logloss: 0.135605
[78] training's auc: 0.901739 training's binary_logloss: 0.116873 valid_1's auc: 0.83182 valid_1's binary_logloss: 0.135591
[79] training's auc: 0.902174 training's binary_logloss: 0.116691 valid_1's auc: 0.831841 valid_1's binary_logloss: 0.135583
[80] training's auc: 0.902657 training's binary_logloss: 0.116515 valid_1's auc: 0.831833 valid_1's binary_logloss: 0.135584
[81] training's auc: 0.903039 training's binary_logloss: 0.116341 valid_1's auc: 0.831694 valid_1's binary_logloss: 0.135602
[82] training's auc: 0.903407 training's binary_logloss: 0.116202 valid_1's auc: 0.831723 valid_1's binary_logloss: 0.135603
[83] training's auc: 0.903922 training's binary_logloss: 0.116042 valid_1's auc: 0.831679 valid_1's binary_logloss: 0.135598
[84] training's auc: 0.904321 training's binary_logloss: 0.11587 valid_1's auc: 0.831645 valid_1's binary_logloss: 0.135602
[85] training's auc: 0.904859 training's binary_logloss: 0.115726 valid_1's auc: 0.831597 valid_1's binary_logloss: 0.135608
[86] training's auc: 0.905189 training's binary_logloss: 0.11556 valid_1's auc: 0.831571 valid_1's binary_logloss: 0.135606
[87] training's auc: 0.905663 training's binary_logloss: 0.115383 valid_1's auc: 0.831663 valid_1's binary_logloss: 0.135599
[88] training's auc: 0.906051 training's binary_logloss: 0.115237 valid_1's auc: 0.831406 valid_1's binary_logloss: 0.135629
[89] training's auc: 0.906423 training's binary_logloss: 0.115083 valid_1's auc: 0.831447 valid_1's binary_logloss: 0.135632
[90] training's auc: 0.906777 training's binary_logloss: 0.11493 valid_1's auc: 0.831335 valid_1's binary_logloss: 0.135663
[91] training's auc: 0.907239 training's binary_logloss: 0.114766 valid_1's auc: 0.831206 valid_1's binary_logloss: 0.135685
[92] training's auc: 0.907757 training's binary_logloss: 0.114611 valid_1's auc: 0.831251 valid_1's binary_logloss: 0.135701
[93] training's auc: 0.908092 training's binary_logloss: 0.114455 valid_1's auc: 0.831301 valid_1's binary_logloss: 0.135697
[94] training's auc: 0.908425 training's binary_logloss: 0.114303 valid_1's auc: 0.83128 valid_1's binary_logloss: 0.135711
[95] training's auc: 0.908776 training's binary_logloss: 0.114155 valid_1's auc: 0.831206 valid_1's binary_logloss: 0.135725
[96] training's auc: 0.909459 training's binary_logloss: 0.114022 valid_1's auc: 0.831318 valid_1's binary_logloss: 0.135698
[97] training's auc: 0.909911 training's binary_logloss: 0.113871 valid_1's auc: 0.831313 valid_1's binary_logloss: 0.135708
[98] training's auc: 0.9103 training's binary_logloss: 0.113714 valid_1's auc: 0.831344 valid_1's binary_logloss: 0.13572
[99] training's auc: 0.910614 training's binary_logloss: 0.113574 valid_1's auc: 0.831372 valid_1's binary_logloss: 0.135737
[100] training's auc: 0.910995 training's binary_logloss: 0.113399 valid_1's auc: 0.831136 valid_1's binary_logloss: 0.135783
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.910995 training's binary_logloss: 0.113399 valid_1's auc: 0.831136 valid_1's binary_logloss: 0.135783
[1] training's auc: 0.829931 training's binary_logloss: 0.16317 valid_1's auc: 0.81758 valid_1's binary_logloss: 0.157601
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.83541 training's binary_logloss: 0.160011 valid_1's auc: 0.822359 valid_1's binary_logloss: 0.154955
[3] training's auc: 0.838238 training's binary_logloss: 0.157367 valid_1's auc: 0.822792 valid_1's binary_logloss: 0.15275
[4] training's auc: 0.841697 training's binary_logloss: 0.155018 valid_1's auc: 0.824241 valid_1's binary_logloss: 0.150844
[5] training's auc: 0.844264 training's binary_logloss: 0.152996 valid_1's auc: 0.825683 valid_1's binary_logloss: 0.149178
[6] training's auc: 0.846438 training's binary_logloss: 0.151181 valid_1's auc: 0.827968 valid_1's binary_logloss: 0.147675
[7] training's auc: 0.851203 training's binary_logloss: 0.149503 valid_1's auc: 0.830862 valid_1's binary_logloss: 0.146401
[8] training's auc: 0.852167 training's binary_logloss: 0.148012 valid_1's auc: 0.831091 valid_1's binary_logloss: 0.145218
[9] training's auc: 0.853328 training's binary_logloss: 0.146653 valid_1's auc: 0.831339 valid_1's binary_logloss: 0.144173
[10] training's auc: 0.855027 training's binary_logloss: 0.145399 valid_1's auc: 0.8325 valid_1's binary_logloss: 0.143185
[11] training's auc: 0.855323 training's binary_logloss: 0.144248 valid_1's auc: 0.832453 valid_1's binary_logloss: 0.142283
[12] training's auc: 0.856844 training's binary_logloss: 0.143189 valid_1's auc: 0.833904 valid_1's binary_logloss: 0.14146
[13] training's auc: 0.857867 training's binary_logloss: 0.142147 valid_1's auc: 0.833721 valid_1's binary_logloss: 0.140732
[14] training's auc: 0.859354 training's binary_logloss: 0.141216 valid_1's auc: 0.834839 valid_1's binary_logloss: 0.139998
[15] training's auc: 0.860447 training's binary_logloss: 0.140312 valid_1's auc: 0.835467 valid_1's binary_logloss: 0.139305
[16] training's auc: 0.861465 training's binary_logloss: 0.139461 valid_1's auc: 0.835378 valid_1's binary_logloss: 0.138723
[17] training's auc: 0.861998 training's binary_logloss: 0.138666 valid_1's auc: 0.836113 valid_1's binary_logloss: 0.138148
[18] training's auc: 0.863126 training's binary_logloss: 0.137927 valid_1's auc: 0.835718 valid_1's binary_logloss: 0.137662
[19] training's auc: 0.864427 training's binary_logloss: 0.137228 valid_1's auc: 0.835475 valid_1's binary_logloss: 0.137219
[20] training's auc: 0.865361 training's binary_logloss: 0.136559 valid_1's auc: 0.835643 valid_1's binary_logloss: 0.136784
[21] training's auc: 0.866287 training's binary_logloss: 0.135916 valid_1's auc: 0.83594 valid_1's binary_logloss: 0.136362
[22] training's auc: 0.866846 training's binary_logloss: 0.135297 valid_1's auc: 0.835794 valid_1's binary_logloss: 0.135964
[23] training's auc: 0.867343 training's binary_logloss: 0.134725 valid_1's auc: 0.835718 valid_1's binary_logloss: 0.135618
[24] training's auc: 0.868057 training's binary_logloss: 0.13418 valid_1's auc: 0.835711 valid_1's binary_logloss: 0.135262
[25] training's auc: 0.868733 training's binary_logloss: 0.133652 valid_1's auc: 0.835742 valid_1's binary_logloss: 0.134948
[26] training's auc: 0.869187 training's binary_logloss: 0.133152 valid_1's auc: 0.83593 valid_1's binary_logloss: 0.134634
[27] training's auc: 0.869956 training's binary_logloss: 0.132662 valid_1's auc: 0.835969 valid_1's binary_logloss: 0.134364
[28] training's auc: 0.870444 training's binary_logloss: 0.132202 valid_1's auc: 0.83582 valid_1's binary_logloss: 0.134103
[29] training's auc: 0.871326 training's binary_logloss: 0.131739 valid_1's auc: 0.835937 valid_1's binary_logloss: 0.133904
[30] training's auc: 0.872142 training's binary_logloss: 0.131269 valid_1's auc: 0.83585 valid_1's binary_logloss: 0.133694
[31] training's auc: 0.873087 training's binary_logloss: 0.130844 valid_1's auc: 0.835707 valid_1's binary_logloss: 0.13352
[32] training's auc: 0.873614 training's binary_logloss: 0.130457 valid_1's auc: 0.8356 valid_1's binary_logloss: 0.133315
[33] training's auc: 0.874286 training's binary_logloss: 0.130086 valid_1's auc: 0.83537 valid_1's binary_logloss: 0.133156
[34] training's auc: 0.875061 training's binary_logloss: 0.12971 valid_1's auc: 0.835347 valid_1's binary_logloss: 0.13301
[35] training's auc: 0.875513 training's binary_logloss: 0.12936 valid_1's auc: 0.835045 valid_1's binary_logloss: 0.132889
[36] training's auc: 0.876243 training's binary_logloss: 0.129024 valid_1's auc: 0.834935 valid_1's binary_logloss: 0.132757
[37] training's auc: 0.876853 training's binary_logloss: 0.128678 valid_1's auc: 0.834909 valid_1's binary_logloss: 0.132631
[38] training's auc: 0.877461 training's binary_logloss: 0.12837 valid_1's auc: 0.834721 valid_1's binary_logloss: 0.132507
[39] training's auc: 0.878088 training's binary_logloss: 0.12803 valid_1's auc: 0.834747 valid_1's binary_logloss: 0.13239
[40] training's auc: 0.878764 training's binary_logloss: 0.127734 valid_1's auc: 0.83475 valid_1's binary_logloss: 0.132278
[41] training's auc: 0.879548 training's binary_logloss: 0.127426 valid_1's auc: 0.835015 valid_1's binary_logloss: 0.132192
[42] training's auc: 0.880738 training's binary_logloss: 0.127061 valid_1's auc: 0.83517 valid_1's binary_logloss: 0.132063
[43] training's auc: 0.881222 training's binary_logloss: 0.126753 valid_1's auc: 0.835157 valid_1's binary_logloss: 0.13196
[44] training's auc: 0.881734 training's binary_logloss: 0.126457 valid_1's auc: 0.834912 valid_1's binary_logloss: 0.131922
[45] training's auc: 0.882559 training's binary_logloss: 0.126125 valid_1's auc: 0.834875 valid_1's binary_logloss: 0.131817
[46] training's auc: 0.883297 training's binary_logloss: 0.125822 valid_1's auc: 0.834752 valid_1's binary_logloss: 0.131744
[47] training's auc: 0.883865 training's binary_logloss: 0.125555 valid_1's auc: 0.834791 valid_1's binary_logloss: 0.131661
Early stopping, best iteration is:
[17] training's auc: 0.861998 training's binary_logloss: 0.138666 valid_1's auc: 0.836113 valid_1's binary_logloss: 0.138148
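이 폴드는 17번째 반복의 valid_1 auc(0.836113)가 이후 30라운드 동안 갱신되지 않아 47번째 반복에서 조기 중단되었습니다. 위와 같은 로그에서 early stopping이 선택하는 최적 반복을 직접 확인해 보고 싶다면, 아래와 같은 간단한 파서를 스케치할 수 있습니다(로그 형식은 위 출력 기준이며, `best_valid_auc`는 설명용으로 임의로 지은 함수명입니다).

```python
import re

def best_valid_auc(log_lines):
    """LightGBM 스타일 로그에서 valid_1's auc가 가장 높은
    (반복 번호, auc)를 반환 — early stopping이 고르는 지점과 동일."""
    pat = re.compile(r"\[(\d+)\].*valid_1's auc: ([0-9.]+)")
    best_iter, best_auc = None, -1.0
    for line in log_lines:
        m = pat.search(line)
        if not m:
            continue
        it, auc = int(m.group(1)), float(m.group(2))
        if auc > best_auc:  # 검증 auc가 갱신될 때만 best 교체
            best_iter, best_auc = it, auc
    return best_iter, best_auc

# 위 폴드 로그의 일부를 그대로 흉내 낸 예시
logs = [
    "[16] valid_1's auc: 0.835378",
    "[17] valid_1's auc: 0.836113",
    "[47] valid_1's auc: 0.834791",
]
print(best_valid_auc(logs))  # -> (17, 0.836113)
```

이처럼 마지막 반복(47)과 최적 반복(17)의 차이가 정확히 patience(30라운드)에 해당하므로, 로그만으로도 조기 중단이 의도대로 동작했는지 검증할 수 있습니다.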
[1] training's auc: 0.82993 training's binary_logloss: 0.159466 valid_1's auc: 0.807913 valid_1's binary_logloss: 0.165217
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.836536 training's binary_logloss: 0.15647 valid_1's auc: 0.815223 valid_1's binary_logloss: 0.16241
[3] training's auc: 0.84138 training's binary_logloss: 0.153934 valid_1's auc: 0.822186 valid_1's binary_logloss: 0.159944
[4] training's auc: 0.842337 training's binary_logloss: 0.151691 valid_1's auc: 0.824336 valid_1's binary_logloss: 0.157864
[5] training's auc: 0.845186 training's binary_logloss: 0.149749 valid_1's auc: 0.825391 valid_1's binary_logloss: 0.15611
[6] training's auc: 0.847017 training's binary_logloss: 0.14797 valid_1's auc: 0.826645 valid_1's binary_logloss: 0.154508
[7] training's auc: 0.848092 training's binary_logloss: 0.146387 valid_1's auc: 0.826632 valid_1's binary_logloss: 0.15313
[8] training's auc: 0.850773 training's binary_logloss: 0.144902 valid_1's auc: 0.826817 valid_1's binary_logloss: 0.151864
[9] training's auc: 0.851585 training's binary_logloss: 0.143556 valid_1's auc: 0.827969 valid_1's binary_logloss: 0.150711
[10] training's auc: 0.85266 training's binary_logloss: 0.142304 valid_1's auc: 0.829099 valid_1's binary_logloss: 0.149652
[11] training's auc: 0.853982 training's binary_logloss: 0.141152 valid_1's auc: 0.829039 valid_1's binary_logloss: 0.14871
[12] training's auc: 0.854741 training's binary_logloss: 0.14008 valid_1's auc: 0.829038 valid_1's binary_logloss: 0.147855
[13] training's auc: 0.855623 training's binary_logloss: 0.139112 valid_1's auc: 0.829598 valid_1's binary_logloss: 0.147066
[14] training's auc: 0.856885 training's binary_logloss: 0.138203 valid_1's auc: 0.830028 valid_1's binary_logloss: 0.146334
[15] training's auc: 0.857264 training's binary_logloss: 0.137352 valid_1's auc: 0.830274 valid_1's binary_logloss: 0.1457
[16] training's auc: 0.858487 training's binary_logloss: 0.136488 valid_1's auc: 0.830444 valid_1's binary_logloss: 0.145088
[17] training's auc: 0.859577 training's binary_logloss: 0.135682 valid_1's auc: 0.831077 valid_1's binary_logloss: 0.1445
[18] training's auc: 0.860246 training's binary_logloss: 0.134934 valid_1's auc: 0.831484 valid_1's binary_logloss: 0.143962
[19] training's auc: 0.861293 training's binary_logloss: 0.134261 valid_1's auc: 0.83187 valid_1's binary_logloss: 0.143467
[20] training's auc: 0.862085 training's binary_logloss: 0.133618 valid_1's auc: 0.832358 valid_1's binary_logloss: 0.142992
[21] training's auc: 0.86291 training's binary_logloss: 0.133013 valid_1's auc: 0.832611 valid_1's binary_logloss: 0.142567
[22] training's auc: 0.864788 training's binary_logloss: 0.132452 valid_1's auc: 0.834792 valid_1's binary_logloss: 0.142174
[23] training's auc: 0.865597 training's binary_logloss: 0.131923 valid_1's auc: 0.834588 valid_1's binary_logloss: 0.141806
[24] training's auc: 0.866488 training's binary_logloss: 0.131392 valid_1's auc: 0.834543 valid_1's binary_logloss: 0.141483
[25] training's auc: 0.867193 training's binary_logloss: 0.13088 valid_1's auc: 0.834677 valid_1's binary_logloss: 0.141143
[26] training's auc: 0.867992 training's binary_logloss: 0.130385 valid_1's auc: 0.834752 valid_1's binary_logloss: 0.140815
[27] training's auc: 0.86881 training's binary_logloss: 0.129909 valid_1's auc: 0.834939 valid_1's binary_logloss: 0.140527
[28] training's auc: 0.869705 training's binary_logloss: 0.129443 valid_1's auc: 0.835094 valid_1's binary_logloss: 0.14025
[29] training's auc: 0.870296 training's binary_logloss: 0.129008 valid_1's auc: 0.835074 valid_1's binary_logloss: 0.140007
[30] training's auc: 0.8717 training's binary_logloss: 0.128599 valid_1's auc: 0.83506 valid_1's binary_logloss: 0.139746
[31] training's auc: 0.872679 training's binary_logloss: 0.128192 valid_1's auc: 0.83496 valid_1's binary_logloss: 0.139539
[32] training's auc: 0.873862 training's binary_logloss: 0.127783 valid_1's auc: 0.835493 valid_1's binary_logloss: 0.139328
[33] training's auc: 0.875016 training's binary_logloss: 0.127399 valid_1's auc: 0.835704 valid_1's binary_logloss: 0.13911
[34] training's auc: 0.875686 training's binary_logloss: 0.127027 valid_1's auc: 0.835779 valid_1's binary_logloss: 0.138934
[35] training's auc: 0.876608 training's binary_logloss: 0.126648 valid_1's auc: 0.836382 valid_1's binary_logloss: 0.138765
[36] training's auc: 0.877531 training's binary_logloss: 0.126277 valid_1's auc: 0.836674 valid_1's binary_logloss: 0.138586
[37] training's auc: 0.878119 training's binary_logloss: 0.12595 valid_1's auc: 0.836852 valid_1's binary_logloss: 0.138429
[38] training's auc: 0.878949 training's binary_logloss: 0.12562 valid_1's auc: 0.836723 valid_1's binary_logloss: 0.138286
[39] training's auc: 0.879509 training's binary_logloss: 0.125302 valid_1's auc: 0.836785 valid_1's binary_logloss: 0.13816
[40] training's auc: 0.880027 training's binary_logloss: 0.124999 valid_1's auc: 0.836576 valid_1's binary_logloss: 0.138051
[41] training's auc: 0.880534 training's binary_logloss: 0.124718 valid_1's auc: 0.836679 valid_1's binary_logloss: 0.137924
[42] training's auc: 0.881019 training's binary_logloss: 0.124432 valid_1's auc: 0.836586 valid_1's binary_logloss: 0.137816
[43] training's auc: 0.881976 training's binary_logloss: 0.124119 valid_1's auc: 0.836745 valid_1's binary_logloss: 0.137711
[44] training's auc: 0.882743 training's binary_logloss: 0.123817 valid_1's auc: 0.836931 valid_1's binary_logloss: 0.137615
[45] training's auc: 0.883191 training's binary_logloss: 0.123543 valid_1's auc: 0.836894 valid_1's binary_logloss: 0.137538
[46] training's auc: 0.883998 training's binary_logloss: 0.123263 valid_1's auc: 0.836913 valid_1's binary_logloss: 0.13745
[47] training's auc: 0.884645 training's binary_logloss: 0.122988 valid_1's auc: 0.83672 valid_1's binary_logloss: 0.137372
[48] training's auc: 0.885359 training's binary_logloss: 0.122693 valid_1's auc: 0.83637 valid_1's binary_logloss: 0.137321
[49] training's auc: 0.886009 training's binary_logloss: 0.122424 valid_1's auc: 0.836244 valid_1's binary_logloss: 0.137266
[50] training's auc: 0.886605 training's binary_logloss: 0.122162 valid_1's auc: 0.83613 valid_1's binary_logloss: 0.137188
[51] training's auc: 0.887164 training's binary_logloss: 0.121892 valid_1's auc: 0.836142 valid_1's binary_logloss: 0.13712
[52] training's auc: 0.887721 training's binary_logloss: 0.121656 valid_1's auc: 0.836064 valid_1's binary_logloss: 0.137065
[53] training's auc: 0.888444 training's binary_logloss: 0.121398 valid_1's auc: 0.836295 valid_1's binary_logloss: 0.136997
[54] training's auc: 0.888967 training's binary_logloss: 0.121168 valid_1's auc: 0.836235 valid_1's binary_logloss: 0.136941
[55] training's auc: 0.889449 training's binary_logloss: 0.12093 valid_1's auc: 0.836129 valid_1's binary_logloss: 0.136905
[56] training's auc: 0.889896 training's binary_logloss: 0.120716 valid_1's auc: 0.8363 valid_1's binary_logloss: 0.136852
[57] training's auc: 0.89044 training's binary_logloss: 0.120497 valid_1's auc: 0.836326 valid_1's binary_logloss: 0.136821
[58] training's auc: 0.890976 training's binary_logloss: 0.120276 valid_1's auc: 0.836253 valid_1's binary_logloss: 0.136777
[59] training's auc: 0.891412 training's binary_logloss: 0.120076 valid_1's auc: 0.836075 valid_1's binary_logloss: 0.136754
[60] training's auc: 0.892055 training's binary_logloss: 0.119848 valid_1's auc: 0.836198 valid_1's binary_logloss: 0.136702
[61] training's auc: 0.892585 training's binary_logloss: 0.119629 valid_1's auc: 0.836208 valid_1's binary_logloss: 0.136659
[62] training's auc: 0.89311 training's binary_logloss: 0.119411 valid_1's auc: 0.836166 valid_1's binary_logloss: 0.136636
[63] training's auc: 0.893625 training's binary_logloss: 0.11922 valid_1's auc: 0.836143 valid_1's binary_logloss: 0.136613
[64] training's auc: 0.894107 training's binary_logloss: 0.119007 valid_1's auc: 0.83609 valid_1's binary_logloss: 0.136589
[65] training's auc: 0.894614 training's binary_logloss: 0.118804 valid_1's auc: 0.836161 valid_1's binary_logloss: 0.13656
[66] training's auc: 0.895246 training's binary_logloss: 0.118586 valid_1's auc: 0.836015 valid_1's binary_logloss: 0.136566
[67] training's auc: 0.895843 training's binary_logloss: 0.118367 valid_1's auc: 0.835961 valid_1's binary_logloss: 0.136541
[68] training's auc: 0.896326 training's binary_logloss: 0.118161 valid_1's auc: 0.835748 valid_1's binary_logloss: 0.13653
[69] training's auc: 0.896772 training's binary_logloss: 0.117959 valid_1's auc: 0.835432 valid_1's binary_logloss: 0.136537
[70] training's auc: 0.897311 training's binary_logloss: 0.117773 valid_1's auc: 0.835311 valid_1's binary_logloss: 0.136542
[71] training's auc: 0.89784 training's binary_logloss: 0.117578 valid_1's auc: 0.835255 valid_1's binary_logloss: 0.136541
[72] training's auc: 0.898356 training's binary_logloss: 0.117397 valid_1's auc: 0.835173 valid_1's binary_logloss: 0.13653
[73] training's auc: 0.898648 training's binary_logloss: 0.117235 valid_1's auc: 0.835388 valid_1's binary_logloss: 0.136489
[74] training's auc: 0.899079 training's binary_logloss: 0.117058 valid_1's auc: 0.835311 valid_1's binary_logloss: 0.136478
Early stopping, best iteration is:
[44] training's auc: 0.882743 training's binary_logloss: 0.123817 valid_1's auc: 0.836931 valid_1's binary_logloss: 0.137615
[1] training's auc: 0.831712 training's binary_logloss: 0.162371 valid_1's auc: 0.807701 valid_1's binary_logloss: 0.163615
Training until validation scores don't improve for 30 rounds.
[2] training's auc: 0.834491 training's binary_logloss: 0.160404 valid_1's auc: 0.809752 valid_1's binary_logloss: 0.161984
[3] training's auc: 0.835561 training's binary_logloss: 0.158667 valid_1's auc: 0.8107 valid_1's binary_logloss: 0.160503
[4] training's auc: 0.836681 training's binary_logloss: 0.157072 valid_1's auc: 0.811337 valid_1's binary_logloss: 0.159195
[5] training's auc: 0.8402 training's binary_logloss: 0.155611 valid_1's auc: 0.812772 valid_1's binary_logloss: 0.157944
[6] training's auc: 0.841705 training's binary_logloss: 0.154277 valid_1's auc: 0.813347 valid_1's binary_logloss: 0.156847
[7] training's auc: 0.841887 training's binary_logloss: 0.15303 valid_1's auc: 0.813989 valid_1's binary_logloss: 0.155854
[8] training's auc: 0.843852 training's binary_logloss: 0.151864 valid_1's auc: 0.816045 valid_1's binary_logloss: 0.154871
[9] training's auc: 0.845734 training's binary_logloss: 0.150778 valid_1's auc: 0.817717 valid_1's binary_logloss: 0.154005
[10] training's auc: 0.846747 training's binary_logloss: 0.14974 valid_1's auc: 0.818194 valid_1's binary_logloss: 0.15315
[11] training's auc: 0.847879 training's binary_logloss: 0.148784 valid_1's auc: 0.818914 valid_1's binary_logloss: 0.152381
[12] training's auc: 0.848675 training's binary_logloss: 0.147873 valid_1's auc: 0.818898 valid_1's binary_logloss: 0.151664
[13] training's auc: 0.850542 training's binary_logloss: 0.147022 valid_1's auc: 0.82064 valid_1's binary_logloss: 0.150943
[14] training's auc: 0.851502 training's binary_logloss: 0.146194 valid_1's auc: 0.821971 valid_1's binary_logloss: 0.150296
[15] training's auc: 0.852091 training's binary_logloss: 0.145411 valid_1's auc: 0.82267 valid_1's binary_logloss: 0.149665
[16] training's auc: 0.85221 training's binary_logloss: 0.144667 valid_1's auc: 0.82264 valid_1's binary_logloss: 0.149058
[17] training's auc: 0.853514 training's binary_logloss: 0.14393 valid_1's auc: 0.823128 valid_1's binary_logloss: 0.148506
[18] training's auc: 0.853991 training's binary_logloss: 0.143242 valid_1's auc: 0.823343 valid_1's binary_logloss: 0.147998
[19] training's auc: 0.854544 training's binary_logloss: 0.142582 valid_1's auc: 0.823286 valid_1's binary_logloss: 0.147501
[20] training's auc: 0.854839 training's binary_logloss: 0.141951 valid_1's auc: 0.823273 valid_1's binary_logloss: 0.147026
[21] training's auc: 0.855152 training's binary_logloss: 0.141326 valid_1's auc: 0.823394 valid_1's binary_logloss: 0.146572
[22] training's auc: 0.858246 training's binary_logloss: 0.140738 valid_1's auc: 0.824959 valid_1's binary_logloss: 0.14613
[23] training's auc: 0.85837 training's binary_logloss: 0.140178 valid_1's auc: 0.825067 valid_1's binary_logloss: 0.145724
[24] training's auc: 0.858524 training's binary_logloss: 0.139639 valid_1's auc: 0.825387 valid_1's binary_logloss: 0.145316
[25] training's auc: 0.858862 training's binary_logloss: 0.139124 valid_1's auc: 0.825637 valid_1's binary_logloss: 0.144918
[26] training's auc: 0.859467 training's binary_logloss: 0.13864 valid_1's auc: 0.825754 valid_1's binary_logloss: 0.144557
[27] training's auc: 0.859478 training's binary_logloss: 0.138168 valid_1's auc: 0.825994 valid_1's binary_logloss: 0.144218
[28] training's auc: 0.860299 training's binary_logloss: 0.137693 valid_1's auc: 0.826315 valid_1's binary_logloss: 0.143863
[29] training's auc: 0.86078 training's binary_logloss: 0.137239 valid_1's auc: 0.82632 valid_1's binary_logloss: 0.143525
[30] training's auc: 0.862013 training's binary_logloss: 0.136806 valid_1's auc: 0.826725 valid_1's binary_logloss: 0.143181
[31] training's auc: 0.862224 training's binary_logloss: 0.136395 valid_1's auc: 0.826718 valid_1's binary_logloss: 0.142889
[32] training's auc: 0.862263 training's binary_logloss: 0.135994 valid_1's auc: 0.826745 valid_1's binary_logloss: 0.142608
[33] training's auc: 0.862491 training's binary_logloss: 0.135611 valid_1's auc: 0.82688 valid_1's binary_logloss: 0.142328
[34] training's auc: 0.862841 training's binary_logloss: 0.135244 valid_1's auc: 0.826938 valid_1's binary_logloss: 0.142057
[35] training's auc: 0.863089 training's binary_logloss: 0.134876 valid_1's auc: 0.826995 valid_1's binary_logloss: 0.141799
[36] training's auc: 0.863865 training's binary_logloss: 0.134507 valid_1's auc: 0.827491 valid_1's binary_logloss: 0.141543
[37] training's auc: 0.86405 training's binary_logloss: 0.134165 valid_1's auc: 0.82734 valid_1's binary_logloss: 0.141315
[38] training's auc: 0.864564 training's binary_logloss: 0.133835 valid_1's auc: 0.827338 valid_1's binary_logloss: 0.141097
[39] training's auc: 0.86537 training's binary_logloss: 0.133498 valid_1's auc: 0.827512 valid_1's binary_logloss: 0.140873
[40] training's auc: 0.865738 training's binary_logloss: 0.133182 valid_1's auc: 0.827443 valid_1's binary_logloss: 0.140665
[41] training's auc: 0.86623 training's binary_logloss: 0.132872 valid_1's auc: 0.827517 valid_1's binary_logloss: 0.140478
[42] training's auc: 0.866707 training's binary_logloss: 0.13257 valid_1's auc: 0.828121 valid_1's binary_logloss: 0.140295
[43] training's auc: 0.867179 training's binary_logloss: 0.132276 valid_1's auc: 0.828177 valid_1's binary_logloss: 0.140124
[44] training's auc: 0.867447 training's binary_logloss: 0.131993 valid_1's auc: 0.828295 valid_1's binary_logloss: 0.139963
[45] training's auc: 0.867729 training's binary_logloss: 0.131705 valid_1's auc: 0.828474 valid_1's binary_logloss: 0.139797
[46] training's auc: 0.867888 training's binary_logloss: 0.131424 valid_1's auc: 0.828411 valid_1's binary_logloss: 0.139634
[47] training's auc: 0.868719 training's binary_logloss: 0.131143 valid_1's auc: 0.828761 valid_1's binary_logloss: 0.139472
... (iterations [48]–[100] omitted; AUC and binary_logloss improved steadily to the best-iteration values below)
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.888607 training's binary_logloss: 0.121853 valid_1's auc: 0.832323 valid_1's binary_logloss: 0.13586
[1] training's auc: 0.826967 training's binary_logloss: 0.164817 valid_1's auc: 0.8134 valid_1's binary_logloss: 0.158968
Training until validation scores don't improve for 30 rounds.
... (iterations [2]–[66] omitted)
Early stopping, best iteration is:
[36] training's auc: 0.863155 training's binary_logloss: 0.136705 valid_1's auc: 0.835221 valid_1's binary_logloss: 0.136813
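The "Training until validation scores don't improve for 30 rounds" / "Early stopping, best iteration is: [N]" messages above come from patience-based early stopping: training halts once 30 rounds pass without a new best validation score, and the best round is reported. A minimal pure-Python sketch of that logic (the `best_iteration` helper and the loss values are illustrative, not LightGBM's actual code):

```python
def best_iteration(val_losses, patience=30):
    """Return the 1-based round with the lowest validation loss,
    stopping once `patience` rounds pass without improvement."""
    best_i, best = 0, float('inf')
    for i, loss in enumerate(val_losses, start=1):
        if loss < best:
            best_i, best = i, loss          # new best round
        elif i - best_i >= patience:
            break                            # early stop: patience exhausted
    return best_i

# Loss improves for 36 rounds, then plateaus -> training stops at
# round 66 (= 36 + 30) and round 36 is reported as best, matching
# the fold above that stopped at [66] with best iteration [36].
losses = [1.0 - 0.01 * i for i in range(36)] + [0.7] * 40
print(best_iteration(losses))  # -> 36
```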
[1] training's auc: 0.827453 training's binary_logloss: 0.161028 valid_1's auc: 0.806813 valid_1's binary_logloss: 0.166611
Training until validation scores don't improve for 30 rounds.
... (iterations [2]–[100] omitted; AUC and binary_logloss improved steadily to the best-iteration values below)
Did not meet early stopping. Best iteration is:
[100] training's auc: 0.887604 training's binary_logloss: 0.121327 valid_1's auc: 0.837311 valid_1's binary_logloss: 0.136583
[1] training's auc: 0.839834 training's binary_logloss: 0.154352 valid_1's auc: 0.808638 valid_1's binary_logloss: 0.15719
Training until validation scores don't improve for 30 rounds.
... (iterations [2]–[48] omitted; validation AUC peaked early, then declined while training AUC kept rising)
Early stopping, best iteration is:
[18] training's auc: 0.89289 training's binary_logloss: 0.121294 valid_1's auc: 0.829732 valid_1's binary_logloss: 0.136636
[1] training's auc: 0.835657 training's binary_logloss: 0.156838 valid_1's auc: 0.815021 valid_1's binary_logloss: 0.152806
Training until validation scores don't improve for 30 rounds.
... (iterations [2]–[63] omitted: valid_1's auc peaks at iteration [33]) ...
Early stopping, best iteration is:
[33] training's auc: 0.912797 training's binary_logloss: 0.114343 valid_1's auc: 0.83355 valid_1's binary_logloss: 0.131454
[1] training's auc: 0.837929 training's binary_logloss: 0.153638 valid_1's auc: 0.815704 valid_1's binary_logloss: 0.160115
Training until validation scores don't improve for 30 rounds.
... (iterations [2]–[47] omitted: valid_1's auc peaks at iteration [17]) ...
Early stopping, best iteration is:
[17] training's auc: 0.889667 training's binary_logloss: 0.121356 valid_1's auc: 0.83326 valid_1's binary_logloss: 0.138003
100%|██████████| 50/50 [14:31<00:00, 17.42s/it, best loss: -0.8357603864531148]
best: {'learning_rate': 0.08438316347746441, 'max_depth': 133.0, 'min_child_samples': 90.0, 'num_leaves': 32.0, 'subsample': 0.8045805798440638}
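Note that `fmin` returns every `hp.quniform` value as a float, even for integer hyperparameters such as `max_depth`, so the values must be cast before being passed to `LGBMClassifier` (the next cell does this parameter by parameter). A minimal sketch of the same conversion as a dict comprehension, using the `best` dict printed above; the `int_params` set is an assumption about which parameters were defined with `hp.quniform`:

```python
# Result returned by hyperopt's fmin (copied from the output above).
# hp.quniform samples are floats, so integer params arrive as e.g. 133.0.
best = {'learning_rate': 0.08438316347746441, 'max_depth': 133.0,
        'min_child_samples': 90.0, 'num_leaves': 32.0,
        'subsample': 0.8045805798440638}

# Assumed set of parameters that LightGBM expects as integers.
int_params = {'max_depth', 'min_child_samples', 'num_leaves'}

# Cast integer params, round the continuous ones for readability.
clean = {k: int(v) if k in int_params else round(v, 5)
         for k, v in best.items()}
print(clean)
```

This produces the same values the next cell passes to `LGBMClassifier`, just in one step instead of one expression per keyword argument.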
In [70]:
lgbm_clf = LGBMClassifier(n_estimators=500,
                          num_leaves=int(best['num_leaves']),
                          max_depth=int(best['max_depth']),
                          min_child_samples=int(best['min_child_samples']),
                          subsample=round(best['subsample'], 5),
                          learning_rate=round(best['learning_rate'], 5))
# Train with AUC as the evaluation metric and early stopping at 100 rounds
lgbm_clf.fit(X_tr, y_tr, early_stopping_rounds=100,
             eval_metric='auc', eval_set=[(X_tr, y_tr), (X_val, y_val)])
lgbm_roc_score = roc_auc_score(y_test, lgbm_clf.predict_proba(X_test)[:, 1])
print('roc auc: {0:.4f}'.format(lgbm_roc_score))
[1] training's auc: 0.828952 training's binary_logloss: 0.156939 valid_1's auc: 0.804518 valid_1's binary_logloss: 0.158994
Training until validation scores don't improve for 100 rounds.
... (iterations [2]–[143] omitted: valid_1's auc peaks at iteration [43]) ...
Early stopping, best iteration is:
[43] training's auc: 0.897108 training's binary_logloss: 0.118067 valid_1's auc: 0.83308 valid_1's binary_logloss: 0.134779
roc auc: 0.8411
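One detail in the cell above worth emphasizing: `roc_auc_score` must be given the positive-class probability (`predict_proba(X_test)[:, 1]`), not the hard 0/1 predictions from `predict`, or the score degrades to a single-threshold estimate. A toy check with made-up data (the `y_true`/`proba` values are illustrative, not from the cancer dataset):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Two negatives and two positives with hypothetical class probabilities.
y_true = np.array([0, 0, 1, 1])
proba = np.array([[0.9, 0.1],    # confident negative
                  [0.6, 0.4],    # borderline negative
                  [0.65, 0.35],  # borderline positive
                  [0.2, 0.8]])   # confident positive

# Column 1 holds P(class=1), mirroring predict_proba(X_test)[:, 1] above.
auc_proba = roc_auc_score(y_true, proba[:, 1])
print(auc_proba)  # → 0.75: 3 of the 4 positive/negative pairs are ranked correctly
```

AUC is the probability that a randomly chosen positive is scored above a randomly chosen negative, which is why it needs the continuous scores to rank by.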