Running a Bayesian optimizer: "one of the lower bounds is greater than an upper bound" error when calling maximize
We are running a Bayesian optimizer for hyperparameter tuning, and I get the error below. The same error occurs even if I change all of the parameter ranges. What should I do?
import xgboost as xgb
from sklearn.model_selection import cross_val_score
from bayes_opt import BayesianOptimization

def XGB_cv(max_depth, learning_rate, n_estimators, gamma,
           min_child_weight, max_delta_step, subsample,
           colsample_bytree, silent=True, nthread=-1):
    model = xgb.XGBClassifier(max_depth=int(max_depth),
                              learning_rate=learning_rate,
                              n_estimators=int(n_estimators),
                              silent=silent,
                              nthread=nthread,
                              gamma=gamma,
                              min_child_weight=min_child_weight,
                              max_delta_step=max_delta_step,
                              subsample=subsample,
                              colsample_bytree=colsample_bytree)
    # 5-fold cross-validated accuracy (despite the variable name)
    RMSE = cross_val_score(model, train2, y, scoring='accuracy', cv=5).mean()
    return RMSE
pbounds = {'max_depth': (5, 10),
           'learning_rate': (0, 0.5),
           'n_estimators': (50, 1000),
           'gamma': (1, 0.01),
           'min_child_weight': (0, 10),
           'max_delta_step': (0, 0.1),
           'subsample': (0, 0.8),
           'colsample_bytree': (0, 0.99),
           }
xgboostBO = BayesianOptimization(f=XGB_cv, pbounds=pbounds, verbose=2, random_state=1)
xgboostBO.maximize(init_points=2, n_iter=10, acq='ei', xi=0.01)
~\Anaconda3\lib\site-packages\scipy\optimize\lbfgsb.py in _minimize_lbfgsb(fun, x0, args, jac, bounds, disp, maxcor, ftol, gtol, eps, maxfun, maxiter, iprint, callback, maxls, finite_diff_rel_step, **unknown_options)
292 # check bounds
293 if (new_bounds[0] > new_bounds[1]).any():
--> 294 raise ValueError("LBFGSB - one of the lower bounds is greater than an upper bound.")
295
296 # initial vector must lie within the bounds. Otherwise ScalarFunction and
ValueError: LBFGSB - one of the lower bounds is greater than an upper bound.
I don't know anything about the Bayesian side of this, but in box-bounded optimization it is a no-no to supply a lower bound that is greater than the upper bound:

'gamma': (1, 0.01),

Not sure whether this is your whole problem, but it took me all of 7 seconds to spot it.
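If it helps, here is a minimal sketch of the same setup with only the gamma range flipped so that the lower bound is no longer greater than the upper bound; the XGB_cv function, the other ranges, and the maximize call are kept exactly as in your question:

pbounds = {'max_depth': (5, 10),
           'learning_rate': (0, 0.5),
           'n_estimators': (50, 1000),
           'gamma': (0.01, 1),          # flipped: was (1, 0.01), i.e. lower > upper
           'min_child_weight': (0, 10),
           'max_delta_step': (0, 0.1),
           'subsample': (0, 0.8),
           'colsample_bytree': (0, 0.99),
           }

xgboostBO = BayesianOptimization(f=XGB_cv, pbounds=pbounds, verbose=2, random_state=1)
xgboostBO.maximize(init_points=2, n_iter=10, acq='ei', xi=0.01)

With every (lower, upper) pair ordered correctly, the internal L-BFGS-B step that raises the ValueError should no longer trip over its bounds check.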