scikit-learn refit/partial fit option in Classifiers
I would like to know whether sklearn classifiers offer any option to fit a model with some hyperparameters and then, after changing a few of them, refit the model while saving computation (fitting) cost.

Say logistic regression is fit with C=1e5 (logreg = linear_model.LogisticRegression(C=1e5)), and we change only C to C=1e3. I would like to save some computation, since only one parameter changed.
Yes, there is a technique called warm_start which, quoting from the documentation, means:
warm_start : bool, default: False
When set to True, reuse the solution of the previous call to fit as initialization, otherwise,
just erase the previous solution. Useless for liblinear solver.
As stated in the documentation here, it is available in LogisticRegression:
sklearn.linear_model.LogisticRegression(..., warm_start=False, n_jobs=1)
Specifically, for your case, you can do the following:
from sklearn.linear_model import LogisticRegression
# create an instance of LogisticRegression with warm_start=True
logreg = LogisticRegression(C=1e5, warm_start=True)
# you can access the C parameter's value as follows
logreg.C
# it's set to 100000.0
# ....
# train your model here by calling logreg.fit(..)
# ....
# reset the value of the C parameter as follows
logreg.C = 1e3
logreg.C
# now it's set to 1000.0
# ....
# re-train your model here by calling logreg.fit(..)
# ....
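Putting the steps above together, here is a minimal runnable sketch. The synthetic dataset from make_classification is only an assumption for illustration; substitute your own X and y:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# toy data, assumed only for demonstration purposes
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# with warm_start=True, each later call to fit() starts from the
# previous solution instead of re-initializing the coefficients
logreg = LogisticRegression(C=1e5, warm_start=True, max_iter=1000)
logreg.fit(X, y)

# change only the regularization strength and refit; the solver is
# initialized with the coefficients found in the first fit
logreg.C = 1e3
logreg.fit(X, y)
print(logreg.C)
```

Note that, as the quoted docs say, warm_start has no effect with the liblinear solver; the default lbfgs solver does support it.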
From a quick check, it is also available in: