InfogainLoss layer blobs failure
I am trying to get caffe's InfogainLoss layer to work. I have seen posts with solutions, but it still does not work for me.
My data lmdb has dimensions Nx1xHxW (grayscale images) and my target-image lmdb has dimensions Nx3xH/8xW/8 (RGB images). My last convolutional layer has dimensions 1x3x20x80. The output_size is 3,
so I have 3 classes, since the label values in my target lmdb image dataset are (0, 1, 2).
I want to try the InfogainLoss layer because I think I have a class-imbalance problem: most of my images contain too much background.
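Before picking infogain weights, it can help to measure how imbalanced the labels actually are. A minimal sketch in plain Python, with a made-up flattened label map standing in for one target lmdb image:

```python
from collections import Counter

# Toy stand-in for one flattened H/8 x W/8 label map:
# most pixels are background (class 0).
label_image = [0] * 90 + [1] * 7 + [2] * 3

counts = Counter(label_image)
total = len(label_image)
freq = {c: counts[c] / total for c in sorted(counts)}
print(freq)  # class 0 dominates -> imbalance
```

In practice one would accumulate these counts over the whole training set, not a single image.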
After my last convolutional layer (conv3) I have this:
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "conv3"
  top: "loss"
}
layer {
  bottom: "loss"
  bottom: "label"
  top: "infoGainLoss"
  name: "infoGainLoss"
  type: "InfogainLoss"
  infogain_loss_param {
    source: "infogainH.binaryproto"
  }
}
My infogain matrix was generated as described in the InfogainLoss layer post (as Shai suggested), so my H matrix is 1x1x3x3 (an identity matrix), and my L
is 3, because I have 3 classes.
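To actually counter class imbalance, H need not stay the identity: the diagonal can hold per-class weights, e.g. inverse class frequency. A sketch below, with assumed frequencies; serializing the result to infogainH.binaryproto would go through caffe's Python bindings (shown only as a comment, since it needs a caffe install):

```python
# Assumed per-class pixel frequencies (background-heavy), 3 classes.
freq = [0.90, 0.07, 0.03]

L = len(freq)
# Diagonal = inverse frequency, rescaled so the diagonal sums to L;
# off-diagonal entries stay zero, as in the identity-matrix case.
inv = [1.0 / f for f in freq]
scale = L / sum(inv)
H = [[scale * inv[i] if i == j else 0.0 for j in range(L)] for i in range(L)]

# With caffe installed, H would be saved as a 1x1xLxL blob, e.g.:
#   blob = caffe.io.array_to_blobproto(np.array(H).reshape(1, 1, L, L))
#   with open('infogainH.binaryproto', 'wb') as f:
#       f.write(blob.SerializeToString())
```

Rare classes end up with large diagonal weights, so misclassifying them costs more.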
When I run the prototxt file everything is fine (the dimensions are OK), but after my last convolutional layer (conv3) I get the following error:
I0320 14:42:16.722874 5591 net.cpp:157] Top shape: 1 3 20 80 (4800)
I0320 14:42:16.722882 5591 net.cpp:165] Memory required for data: 2892800
I0320 14:42:16.722892 5591 layer_factory.hpp:77] Creating layer loss
I0320 14:42:16.722900 5591 net.cpp:106] Creating Layer loss
I0320 14:42:16.722906 5591 net.cpp:454] loss <- conv3
I0320 14:42:16.722913 5591 net.cpp:411] loss -> loss
F0320 14:42:16.722928 5591 layer.hpp:374] Check failed: ExactNumBottomBlobs() == bottom.size() (2 vs. 1) SoftmaxWithLoss Layer takes 2 bottom blob(s) as input.
I have double-checked that every lmdb dataset filename is set correctly. I have no idea what the problem could be. Any ideas?
Dear @Shai,
Thank you for your answer. I did the following, as you suggested:
layer {
  name: "prob"
  type: "Softmax"
  bottom: "conv3"
  top: "prob"
  softmax_param { axis: 1 }
}
layer {
  bottom: "prob"
  bottom: "label"
  top: "infoGainLoss"
  name: "infoGainLoss"
  type: "InfogainLoss"
  infogain_loss_param {
    source: "infogainH.binaryproto"
  }
}
But I still get an error:
Top shape: 1 3 20 80 (4800)
I0320 16:30:25.110862 6689 net.cpp:165] Memory required for data: 2912000
I0320 16:30:25.110867 6689 layer_factory.hpp:77] Creating layer infoGainLoss
I0320 16:30:25.110877 6689 net.cpp:106] Creating Layer infoGainLoss
I0320 16:30:25.110884 6689 net.cpp:454] infoGainLoss <- prob
I0320 16:30:25.110889 6689 net.cpp:454] infoGainLoss <- label
I0320 16:30:25.110896 6689 net.cpp:411] infoGainLoss -> infoGainLoss
F0320 16:30:25.110965 6689 infogain_loss_layer.cpp:35] Check failed: bottom[1]->height() == 1 (20 vs. 1)
What is going wrong?
Your error comes from the "loss" layer, not the "InfogainLoss" layer: you confused the "Softmax" layer, which outputs class probabilities, with the "SoftmaxWithLoss" layer, which outputs a (scalar) loss value.
What should you do? Replace the "loss" layer with a "Softmax" layer named "prob":
layer {
  name: "prob"
  type: "Softmax"  # NOT SoftmaxWithLoss
  bottom: "conv3"
  top: "prob"
  softmax_param { axis: 1 }  # compute prob along 2nd axis
}
You need the loss computed along the second dimension, and currently the "InfogainLoss" layer does not seem to support this. You may need to tweak the "InfogainLoss" layer to have functionality like that of "SoftmaxWithLoss", allowing the loss to be computed along an arbitrary axis.
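For reference, the per-pixel loss this layer computes is L = -Σ_k H[l, k]·log(p_k), where l is the true label; with H = identity it reduces to plain cross-entropy. A small numeric sketch with illustrative values:

```python
import math

def infogain_loss(probs, label, H):
    # -sum_k H[label][k] * log(p_k)
    return -sum(H[label][k] * math.log(p) for k, p in enumerate(probs))

probs = [0.7, 0.2, 0.1]  # softmax output for one pixel
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# With identity H this is ordinary cross-entropy: -log(p_label).
loss = infogain_loss(probs, label=1, H=identity)
# loss == -log(0.2)
```

With a non-identity H, the same formula weights the log-probabilities per class, which is what makes the layer useful for imbalanced labels.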
Update: I created a pull request on BVLC/caffe that "upgrades" the infogain loss layer. This upgraded version supports "loss along axis", as you are after. Moreover, it makes the "Softmax" layer redundant, as it computes the probabilities internally (see this thread).
The upgraded layer can be used like this:
layer {
  bottom: "conv3"  # prob is computed internally
  bottom: "label"
  top: "infoGainLoss"
  name: "infoGainLoss"
  type: "InfogainLoss"
  infogain_loss_param {
    source: "infogainH.binaryproto"
    axis: 1  # compute loss and probability along axis
  }
}