Scikit-learn linear regression with 2 features

I am new to scikit-learn and am trying to fit a simple linear regression model. I have a matrix X with 2 columns, c1 and c1^2, and the corresponding y values. I tried to fit a plain OLS model with scikit-learn, but the resulting plot looks strange. Any idea what I'm doing wrong?
import numpy as np

X = np.array([[-0.016746535778021, 0.280446460564527],
[-0.014577470749242, 0.212502653445002],
[0.034515758657299, 1.191337595688933],
[-0.047010075743201, 2.209947221381472],
[0.036975119046363, 1.367159428492700],
[-0.040686110015367, 1.655359548182586],
[-0.004472010975766, 0.019998882167376],
[0.026533634894789 , 0.704033780729957],
[-0.042797683100180, 1.831641678743394],
[0.025374099383528, 0.643844919525139],
[-0.031109553977308, 0.967804348667025],
[0.027311768635213, 0.745932705983427],
[-0.003263862013657, 0.010652795244191],
[-0.001818276487116, 0.003306129383598],
[-0.040719662402516, 1.658090906174888],
[-0.050013243645495, 2.501324539943689],
[-0.017411771548016, 0.303169788440313],
[0.003588193696644, 0.012875134004637],
[0.007085480261971, 0.050204030542776],
[0.046282369018539, 2.142057681968212],
[0.014612289091657, 0.213518992498145]])*1e3
y = np.array([4.1702,
4.0673,
31.8731,
10.6237,
31.8360,
4.9594,
4.4516,
22.2763,
-0.0000,
20.5038,
3.8583,
19.3651,
4.8838,
11.0972,
7.4617,
1.4769,
2.7192,
10.9269,
8.3487,
52.7819,
13.3573])
from sklearn.linear_model import LinearRegression as LR
model1 = LR().fit(X,y)
import matplotlib.pyplot as plt
plt.plot(X[:,0],model1.predict(X))
plt.scatter(X[:,0],y,color = 'red')
plt.show()
The plt.plot() function draws the line through the points in the order you pass them. To draw the regression curve, you need to feed in the X values and the predictions sorted from the smallest to the largest X. The simplest way is:
predictions = model1.predict(X)
order = X[:,0].argsort()
predictions = predictions[order]
x = X[:,0][order]
plt.plot(x,predictions)
plt.scatter(X[:,0],y,color = 'red')
plt.show()
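An alternative to sorting the training points is to evaluate the fitted model on a dense, already-sorted grid of c1 values, which gives a smooth curve. This is a sketch on synthetic data with the same structure as the question (column 2 is column 1 squared); the coefficients and grid size are made up for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data shaped like the question's: second column is the square of the first
rng = np.random.default_rng(0)
c1 = rng.uniform(-50, 50, size=21)
X = np.column_stack([c1, c1**2])
y = 3.0 + 0.5 * c1 + 0.02 * c1**2 + rng.normal(scale=1.0, size=c1.size)

model = LinearRegression().fit(X, y)

# A dense, sorted grid of c1 values; the model needs both features,
# so the square is recomputed for the grid as well
grid = np.linspace(c1.min(), c1.max(), 200)
grid_X = np.column_stack([grid, grid**2])
curve = model.predict(grid_X)
# plt.plot(grid, curve) now draws a smooth, correctly ordered parabola
```

Because the grid is generated with `np.linspace`, it is sorted by construction, so no `argsort` step is needed.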
Is this what you wanted?
import numpy as np
from sklearn.linear_model import LinearRegression as LR
import matplotlib.pyplot as plt
model1 = LR().fit(X,y)
preds = model1.predict(X)
plt.scatter(preds, y)
xpoints = ypoints = plt.xlim()
plt.plot(xpoints, ypoints, linestyle='--', color='k')
plt.show()
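To quantify how closely the points in that predicted-vs-actual scatter hug the dashed y = x diagonal, you can report the R² score alongside the plot. This is a sketch on synthetic data (the coefficients are made up); `r2_score` is scikit-learn's standard goodness-of-fit metric:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Synthetic data in the same two-column (c1, c1^2) layout as the question
rng = np.random.default_rng(1)
c1 = rng.uniform(-50, 50, size=30)
X = np.column_stack([c1, c1**2])
y = 1.0 - 0.3 * c1 + 0.01 * c1**2 + rng.normal(scale=0.5, size=c1.size)

model = LinearRegression().fit(X, y)
preds = model.predict(X)

# R^2 = 1 means the predictions match y exactly; values near 1 mean
# the predicted-vs-actual scatter lies close to the diagonal
r2 = r2_score(y, preds)
```

An R² well below 1 on the training data itself would suggest the chosen features cannot explain y, rather than a plotting problem.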