Shape Error while Fine Tuning MobileNet On A Custom Data Set

I am following deeplizard's tutorial on fine-tuning MobileNet. What I am trying to do is take the output of the model's fifth-from-last layer and store it in the variable x. That layer is global_average_pooling2d_3, whose output shape is (None, 1, 1, 1024). I then add an output Dense layer with 10 units. However, when fitting the model I get the error below. Could anyone please give me some guidance? Many thanks. My code is shown below:

import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

mobile = tf.keras.applications.mobilenet.MobileNet()
mobile.summary()
x = mobile.layers[-5].output
output = layers.Dense(units=10, activation='softmax')(x)
model = Model(inputs=mobile.input, outputs=output)

for layer in model.layers[:-23]:
    layer.trainable = False

model.compile(optimizer=Adam(learning_rate=0.0001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x=train_batches,
          steps_per_epoch=len(train_batches),
          validation_data=valid_batches,
          validation_steps=len(valid_batches),
          epochs=30,
          verbose=2)

ValueError: Shapes (None, None) and (None, 1, 1, 10) are incompatible

When you call the base model as follows, it is instantiated with its default arguments, among which include_top is set to True:

mobile = tf.keras.applications.mobilenet.MobileNet()

And with that, the top of the model includes (source) a GlobalAveragePooling2D layer with keepdims=True:

  if include_top:
    x = layers.GlobalAveragePooling2D(keepdims=True)(x)

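To see what keepdims=True does to the shape, here is a small NumPy sketch of the same computation (global average pooling is just a mean over the spatial axes; the 7x7x1024 feature-map size is an assumption matching MobileNet's last convolutional block):

```python
import numpy as np

# Hypothetical feature map: (batch, height, width, channels) as in MobileNet
feats = np.random.rand(2, 7, 7, 1024).astype("float32")

# keepdims=True retains the spatial axes as size-1 dims -> (2, 1, 1, 1024)
pooled_keep = feats.mean(axis=(1, 2), keepdims=True)
print(pooled_keep.shape)  # (2, 1, 1, 1024)

# keepdims=False (the layer's default) squeezes them -> (2, 1024)
pooled_flat = feats.mean(axis=(1, 2))
print(pooled_flat.shape)  # (2, 1024)
```

Because the top of MobileNet uses keepdims=True, the tensor you tap at layers[-5] still carries those two size-1 spatial dimensions, which is why the Dense layer produces (None, 1, 1, 10) instead of (None, 10).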
Now, judging from your error, I assume your labels have the expected flat shape (e.g. (None, 10)), in which case you can simply do the following:

mobile = keras.applications.mobilenet.MobileNet()
x = mobile.layers[-5].output  # shape=(None, 1, 1, 1024)
x = layers.Flatten()(x)       # <--- here shape=(None, 1024)

output = layers.Dense(units=10, activation='softmax')(x)
model = Model(inputs=mobile.input, outputs=output)
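An alternative sketch that avoids the issue entirely: instantiate the base model with include_top=False and pooling='avg', which already yields a flat (None, 1024) feature vector, so no Flatten is needed (weights=None here is only to keep the example self-contained; for actual fine-tuning you would use weights='imagenet'):

```python
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Model

# include_top=False drops the classification head; pooling='avg' appends a
# GlobalAveragePooling2D with the default keepdims=False -> output (None, 1024)
base = keras.applications.mobilenet.MobileNet(include_top=False,
                                              pooling='avg',
                                              weights=None)

output = layers.Dense(units=10, activation='softmax')(base.output)
model = Model(inputs=base.input, outputs=output)
print(model.output_shape)  # (None, 10)
```

Either way, the key point is the same: the Dense head must see a 2-D tensor (batch, features), not the 4-D (batch, 1, 1, features) tensor that keepdims=True leaves behind.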