Why is Logistic Regression (multi-class) accuracy so low?

I am trying to solve a problem with 3 features and 6 classes (labels). The training set is 700 rows × 3 columns, and the feature values are continuous, ranging from 0 to 100. I am using the one-vs-all approach, but I don't know why the prediction accuracy is so low, only 24%. Could anyone help me figure it out? Thanks! This is how I make predictions:

function p = predictOneVsAll(all_theta, X)
m = size(X, 1);
num_labels = size(all_theta, 1);
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
% Add ones to the X data matrix (bias column)
X = [ones(m, 1) X];
% For each example, pick the class whose classifier outputs the
% highest probability (note: m is reused here for the max values)
[m, p] = max(sigmoid(X * all_theta'), [], 2);
end
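For completeness, sigmoid is not shown in the question; it is assumed to be the standard logistic function from the course exercises. A minimal sketch:

function g = sigmoid(z)
% Logistic function, applied element-wise; maps scores to (0, 1)
g = 1 ./ (1 + exp(-z));
end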

One-vs-all training:

% You need to return the following variables correctly
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix (bias column)
X = [ones(m, 1) X];

initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 20);

% Train one binary classifier per label: class c vs. everything else
for c = 1:num_labels
 [theta] = ...
     fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
             initial_theta, options);
 all_theta(c,:) = theta';
end
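Here fmincg and lrCostFunction come from the course materials and are not shown in the question. As a rough sketch of what lrCostFunction is expected to return (regularized logistic cost and gradient, assuming the usual convention that theta(1) is the unpenalized bias term):

function [J, grad] = lrCostFunction(theta, X, y, lambda)
m = length(y);                       % number of training examples
h = 1 ./ (1 + exp(-(X * theta)));    % sigmoid hypothesis
% Regularized cross-entropy cost; the bias theta(1) is not penalized
J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
    + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);
% Gradient, with regularization on all but the bias term
grad = (1 / m) * (X' * (h - y));
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end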

In predictOneVsAll, you don't need the sigmoid function; it is only needed when computing the cost. Since sigmoid is monotonically increasing, applying it does not change which class attains the maximum score, so you can take the argmax of the raw scores directly:

[~, p] = max(X * all_theta', [], 2);  % class with the highest raw score
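A quick way to convince yourself of this, on made-up scores (hypothetical data, with sigmoid written inline): both argmaxes pick the same column because sigmoid preserves ordering.

scores = [0.5 -1.2 2.0; 1.1 0.3 -0.7];          % two examples, three classes
[~, p1] = max(scores, [], 2);                   % argmax of raw scores
[~, p2] = max(1 ./ (1 + exp(-scores)), [], 2);  % argmax after sigmoid
assert(isequal(p1, p2));                        % identical predictions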

In oneVsAll, the loop should look like this:

for c = 1:num_labels
  all_theta(c,:) = fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                           initial_theta, options);
endfor
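As a minimal usage sketch, assuming X and y are the original 700×3 feature matrix and label vector from the question (predictOneVsAll prepends the bias column itself), training accuracy is then measured the usual way:

% all_theta computed by the corrected loop above
pred = predictOneVsAll(all_theta, X);
fprintf('Training accuracy: %.2f%%\n', mean(double(pred == y)) * 100);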

For questions like these, it's best to ask in the discussion forum for Andrew's ML course; people there will be more familiar with the code and the problem.