PyTorch Softmax Dimensions error

I'm trying to write a simple NN module with two layers: the first with a ReLU activation, and the output a softmax over 3 classes (one-hot encoded). There seems to be something wrong with the way I'm using the softmax function, but I'm not sure what's going on.

X is 178x13, and Y is 178x3.

The dataset I'm using is fairly simple and can be found here.

I keep getting the error:

RuntimeError: dimension out of range (expected to be in range of [-2, 1], but got 3)

import pandas as pd
import numpy as np
import torch
from torch.autograd import Variable
from sklearn.preprocessing import LabelBinarizer

# Read in dataset, specifying that this set didn't come with column headers
x = pd.read_csv('Datasets/wine.data', header=None)

# Rename columns
x.columns = ['Class', 'A1', 'A2', 'A3', 'A4', 'A5', 'A6', 'A7', 'A8', 'A9', 'A10', 'A11', 'A12', 'A13']

y = x[['Class']].values

#turn class labels into one-hot encoding
one_hot = LabelBinarizer()
y = Variable(torch.from_numpy(one_hot.fit_transform(y)))

x = Variable(torch.from_numpy(x.iloc[:, 1:14].values).float())


N, D_in, H, D_out = y.shape[0], x.shape[1], 20, 3

# Implement neural net with nn module

model = torch.nn.Sequential(
    torch.nn.Linear(D_in, H),
    torch.nn.ReLU(),
    torch.nn.Linear(H, D_out),
    torch.nn.LogSoftmax(dim=3)
)

loss_fn = torch.nn.NLLLoss()

learning_rate = 1e-4
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

for t in range(500):
    y_pred = model(x)

    loss = loss_fn(y_pred, y)
    print("Iteration: %d | Loss: %.3f" % (t, loss))

    optimizer.zero_grad()

    loss.backward()

    optimizer.step()

It looks to me like you have misunderstood the dim argument of LogSoftmax. From the documentation,

dim (int) – A dimension along which Softmax will be computed (so every slice along dim will sum to 1).

Now, after you pass your input through the two linear layers, the tensor you obtain and apply LogSoftmax to has dimensions 178 x 3. Clearly, dim = 3 is not available, since your tensor only has two dimensions. Instead, try dim=1, which normalizes across the columns so that each row of probabilities sums to 1.
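For example, a quick sketch of the difference (the shapes mirror your 178 x 3 output):

import torch

scores = torch.randn(178, 3)                     # same shape as the output of your last Linear layer

log_probs = torch.nn.LogSoftmax(dim=1)(scores)   # normalize over the 3 classes
print(log_probs.exp().sum(dim=1))                # each of the 178 rows sums to 1

# torch.nn.LogSoftmax(dim=3)(scores) raises the "dimension out of range"
# error above, because a 178 x 3 tensor only has dims 0 and 1 (or -2 and -1).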

This turned out to be a problem, because for NLLLoss:

The target that this loss expects is a class index (0 to N-1, where N = number of classes)

whereas I had been trying to feed it one-hot encoded vectors. I solved my problem by doing:

loss = loss_fn(y_pred, torch.max(y, 1)[1])

where torch.max finds the maximum values and their respective indices.
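For reference, a minimal sketch of the conversion (the tensor values below are made up):

import torch

# One-hot targets, as produced by LabelBinarizer
y = torch.tensor([[1, 0, 0],
                  [0, 0, 1],
                  [0, 1, 0]])

# torch.max along dim=1 returns (values, indices); [1] keeps the indices
targets = torch.max(y, 1)[1]
print(targets)                                   # tensor([0, 2, 1])

# NLLLoss now accepts these class indices as targets
log_probs = torch.nn.LogSoftmax(dim=1)(torch.randn(3, 3))
loss = torch.nn.NLLLoss()(log_probs, targets)

torch.argmax(y, dim=1) would produce the same indices.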