Don't spend too much time tuning hyperparameters
Be patient
Average everything — over random seeds, or over small deviations around the best parameter values (a sketch follows below)
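For instance, a minimal sketch of seed averaging with XGBoost; the helper name and the dtrain / dtest DMatrix objects are illustrative assumptions, not part of the lecture:

import numpy as np
import xgboost as xgb

def seed_averaged_predict(params, dtrain, dtest, num_round, seeds=(0, 1, 2, 3, 4)):
    # Train one model per random seed and average the predictions;
    # this smooths out the run-to-run variance of a single model.
    preds = []
    for seed in seeds:
        model = xgb.train(dict(params, seed=seed), dtrain, num_round)
        preds.append(model.predict(dtest))
    return np.mean(preds, axis=0)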
A lot of libraries to try: Hyperopt (used in the example below), Scikit-optimize, Spearmint, GPyOpt, SMAC3. For example:
import xgboost as xgb
from hyperopt import fmin, hp, tpe

def xgb_score(param):
    # Run XGBoost with parameters 'param' and return the validation loss
    # (dtrain / dvalid are assumed to be prepared xgb.DMatrix objects).
    param = dict(param)
    num_round = int(param.pop('num_round'))          # xgb.train() argument, not a booster param
    early_stopping = int(param.pop('early_stopping_rounds'))
    param['max_depth'] = int(param['max_depth'])     # hp.quniform samples floats
    model = xgb.train(param, dtrain, num_round,
                      evals=[(dvalid, 'valid')],
                      early_stopping_rounds=early_stopping,
                      verbose_eval=False)
    return model.best_score

def xgb_hyperopt():
    space = {
        'eta': 0.01,
        'max_depth': hp.quniform('max_depth', 10, 30, 1),
        'min_child_weight': hp.quniform('min_child_weight', 0, 100, 1),
        'subsample': hp.quniform('subsample', 0.1, 1.0, 0.1),
        'gamma': hp.quniform('gamma', 0.0, 30, 0.5),
        'colsample_bytree': hp.quniform('colsample_bytree', 0.1, 1.0, 0.1),
        'objective': 'reg:linear',   # renamed 'reg:squarederror' in recent XGBoost
        'nthread': 28,
        'silent': 1,                 # replaced by 'verbosity' in recent XGBoost
        'num_round': 2500,
        'seed': 2441,
        'early_stopping_rounds': 100,
    }
    best = fmin(xgb_score, space, algo=tpe.suggest, max_evals=1000)
    return best
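Note that fmin returns only the tuned entries of the search space (not the constants), and hp.quniform yields floats, so integer parameters must be cast back before retraining. A hypothetical run:

best = xgb_hyperopt()
print(best)  # e.g. {'colsample_bytree': 0.6, 'gamma': 2.5, 'max_depth': 17.0, ...}
best['max_depth'] = int(best['max_depth'])  # cast quniform floats back to ints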
[Slide legend: hyperparameters above are color-coded — "A parameter in red" vs. "A parameter in green".]