XGBoost gives objective function error when used with sklearn RandomizedSearchCV
I am trying to run a random search with sklearn's RandomizedSearchCV to select hyperparameters for XGBoost on a regression task. Here is my code:
#search space
params_xgboost = {
    "learning_rate": [0.05, 0.10, 0.15, 0.20, 0.25, 0.30],
    "max_depth": [3, 4, 5, 6, 8, 10, 12, 15],
    "min_child_weight": [1, 3, 5, 7],
    "gamma": [0.0, 0.1, 0.2, 0.3, 0.4],
    "colsample_bytree": [0.3, 0.4, 0.5, 0.7],
    'n_estimators': [5, 10, 15, 20, 25, 30, 35],
    'objective': 'reg:squarederror'
}
model = XGBRegressor()
random_search = RandomizedSearchCV(estimator = model,
                                   param_distributions = params_xgboost,
                                   n_iter = 100,
                                   cv = 5,
                                   verbose = 1,
                                   random_state = 42,
                                   scoring = 'neg_mean_squared_error',
                                   n_jobs = -1)
#params glare proba
random_search.fit(X_transform, Y['dgp'])
I am really struggling to understand why I get the following error:
Unknown objective function: `u`
XGBoostError: [16:46:53] /Users/runner/miniforge3/conda-bld/xgboost_1607604592557/work/src/objective/objective.cc:26: Unknown objective function: `u`
Objective candidate: survival:aft
Objective candidate: binary:hinge
Objective candidate: multi:softmax
Objective candidate: multi:softprob
Objective candidate: rank:pairwise
Objective candidate: rank:ndcg
Objective candidate: rank:map
Objective candidate: reg:squarederror
Objective candidate: reg:squaredlogerror
Objective candidate: reg:logistic
Objective candidate: reg:pseudohubererror
Objective candidate: binary:logistic
Objective candidate: binary:logitraw
Objective candidate: reg:linear
Objective candidate: count:poisson
Objective candidate: survival:cox
Objective candidate: reg:gamma
Objective candidate: reg:tweedie
Stack trace:
[bt] (0) 1 libxgboost.dylib 0x00000001210ad23e dmlc::LogMessageFatal::~LogMessageFatal() + 110
[bt] (1) 2 libxgboost.dylib 0x00000001211a4bd7 xgboost::ObjFunction::Create(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, xgboost::GenericParameter const*) + 759
[bt] (2) 3 libxgboost.dylib 0x0000000121168d06 xgboost::LearnerConfiguration::ConfigureObjective(xgboost::LearnerTrainParam const&, std::__1::vector<std::__1::pair<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > >, std::__1::allocator<std::__1::pair<std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > > > >*) + 1926
[bt] (3) 4 libxgboost.dylib 0x000000012115de1f xgboost::LearnerConfiguration::Configure() + 1247
[bt] (4) 5 libxgboost.dylib 0x000000012115e2e7 xgboost::LearnerImpl::UpdateOneIter(int, std::__1::shared_ptr<xgboost::DMatrix>) + 119
[bt] (5) 6 libxgboost.dylib 0x00000001210b1e5c XGBoosterUpdateOneIter + 156
[bt] (6) 7 libffi.7.dylib 0x0000000107c40ead ffi_call_unix64 + 85
[bt] (7) 8 ??? 0x00007ffee8691a00 0x0 + 140732797622784
I ran the same code on a separate classification task using the multi:softmax objective and it worked fine, so I am not sure why I am getting this error in the case above.
objective='reg:squarederror' is the default, so you can safely omit it:
XGBRegressor?
Init signature: XGBRegressor(objective='reg:squarederror', **kwargs)
Docstring:
Implementation of the scikit-learn API for XGBoost regression.
If you want to specify it explicitly, you can always do:
XGBRegressor(objective='reg:squarederror'...)
Also note the remark about **kwargs for the sklearn API (docs):
**kwargs is unsupported by scikit-learn. We do not guarantee that parameters passed via this argument will interact properly with scikit-learn.
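For completeness, here is a minimal sketch of the whole setup with objective left to the estimator rather than the search space; it reuses the grid and the X_transform / Y['dgp'] names from the question:

from xgboost import XGBRegressor
from sklearn.model_selection import RandomizedSearchCV

# search space without 'objective' -- the regressor already defaults to reg:squarederror
params_xgboost = {
    "learning_rate": [0.05, 0.10, 0.15, 0.20, 0.25, 0.30],
    "max_depth": [3, 4, 5, 6, 8, 10, 12, 15],
    "min_child_weight": [1, 3, 5, 7],
    "gamma": [0.0, 0.1, 0.2, 0.3, 0.4],
    "colsample_bytree": [0.3, 0.4, 0.5, 0.7],
    "n_estimators": [5, 10, 15, 20, 25, 30, 35],
}

# set explicitly here only for readability; a bare XGBRegressor() behaves the same
model = XGBRegressor(objective="reg:squarederror")

random_search = RandomizedSearchCV(estimator=model,
                                   param_distributions=params_xgboost,
                                   n_iter=100,
                                   cv=5,
                                   verbose=1,
                                   random_state=42,
                                   scoring="neg_mean_squared_error",
                                   n_jobs=-1)

random_search.fit(X_transform, Y["dgp"])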
I agree with @SergeyBushmanov, but to clarify the error here: you did not provide a list of objectives, just the string 'reg:squarederror'. Since strings are iterable in Python, the random search treats it as 16 candidate objectives to try: 'r', 'e', 'g', etc. (the first one it tried was 'u', as your error complains). Adding list brackets around it should work; but again, for parameters you are not actually interested in searching over, just specify them on the estimator, as in @Sergey's answer.