Why is the TensorFlow Maxout layer not computing any gradients, and where is the mistake?

I am trying to use the TensorFlow Maxout implementation (https://www.tensorflow.org/addons/api_docs/python/tfa/layers/Maxout), but I am struggling with it.

Let me illustrate my problem: if I have the following

import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

d = 3

x_in = Input(shape=d)
x_out = Dense(d, activation='relu')(x_in)
model = Model(inputs=x_in, outputs=x_out)

model.compile(optimizer='adam', loss='MeanAbsoluteError')

X = tf.random.normal([200, 3])
Y = tf.random.normal([200, 3])

model.fit(X, Y, epochs=5, batch_size=32)

then everything works as expected, i.e. the loss keeps decreasing and I can retrieve the estimated weights:

model.layers[1].get_weights()
Out[141]: 
[array([[-0.15133516, -0.14892222, -0.64674205],
        [ 0.34437487,  0.7822309 , -0.08931279],
        [-0.8330534 , -0.13827904, -0.23096593]], dtype=float32),
 array([-0.03069788, -0.03311999, -0.02603031], dtype=float32)]

However, when I want to switch to the Maxout activation instead, things do not work out:

import tensorflow_addons as tfa

d = 3

x_in = Input(shape=d)
x_out = tfa.layers.Maxout(3)(x_in)
model = Model(inputs=x_in, outputs=x_out)

model.compile(optimizer='adam', loss='MeanAbsoluteError')

X = tf.random.normal([200, 3])
Y = tf.random.normal([200, 3])

model.fit(X, Y, epochs=5, batch_size=32)

The loss stays the same over all epochs,

model.layers[1].get_weights()
Out[141]: []

Where is my mistake?

The Maxout layer can only be used in combination with another layer, for example a Dense layer. Moreover, the Maxout layer itself has no trainable weights, as you can see in the model summary, but it does take the hyperparameter num_units:

import tensorflow as tf
import tensorflow_addons as tfa

d = 3
x_in = tf.keras.layers.Input(shape=d)
x = tf.keras.layers.Dense(3)(x_in)       # the Dense layer provides the trainable weights
x_out = tfa.layers.Maxout(3)(x)          # Maxout itself adds no parameters
model = tf.keras.Model(inputs=x_in, outputs=x_out)

model.compile(optimizer='adam', loss='MeanAbsoluteError')

X = tf.random.normal([200, 3])
Y = tf.random.normal([200, 3])

model.fit(X, Y, epochs=5, batch_size=32)
print(model.summary())
Epoch 1/5
7/7 [==============================] - 0s 2ms/step - loss: 1.0404
Epoch 2/5
7/7 [==============================] - 0s 3ms/step - loss: 1.0361
Epoch 3/5
7/7 [==============================] - 0s 2ms/step - loss: 1.0322
Epoch 4/5
7/7 [==============================] - 0s 2ms/step - loss: 1.0283
Epoch 5/5
7/7 [==============================] - 0s 3ms/step - loss: 1.0244
Model: "model_5"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_6 (InputLayer)        [(None, 3)]               0         
                                                                 
 dense_5 (Dense)             (None, 3)                 12        
                                                                 
 maxout_4 (Maxout)           (None, 3)                 0         
                                                                 
=================================================================
Total params: 12
Trainable params: 12
Non-trainable params: 0
_________________________________________________________________
None
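
As a quick sanity check (a small sketch building on the model above), you can also query the layers' weights directly; only the Dense layer carries parameters, while the Maxout layer's weight list is empty:

print(model.layers[1].get_weights())   # Dense: 3x3 kernel and bias of length 3
print(model.layers[2].get_weights())   # Maxout: [] - no trainable weights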

Maybe also have a look at the Maxout paper:

The maxout model is simply a feed-forward architecture, such as a multilayer perceptron or deep convolutional neural network, that uses a new type of activation function: the maxout unit.
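
To make that definition concrete, here is a minimal sketch (an illustration of the idea, not the tfa implementation itself): a maxout unit takes the maximum over a group of affine outputs, so a Dense layer with num_units * k outputs followed by a group-wise max yields num_units maxout units with k linear pieces each. The grouping convention below (split the channel axis into num_units groups and take the max inside each group) is assumed to match tfa.layers.Maxout.

import tensorflow as tf

def maxout(z, num_units):
    # z: (batch, channels); channels must be divisible by num_units
    k = z.shape[-1] // num_units           # number of linear pieces per maxout unit
    z = tf.reshape(z, [-1, num_units, k])
    return tf.reduce_max(z, axis=-1)       # shape (batch, num_units)

# Hypothetical usage: 3 maxout units, each taking the max over k=4 affine outputs
x = tf.random.normal([8, 5])
dense = tf.keras.layers.Dense(3 * 4)       # the trainable affine part
h = maxout(dense(x), num_units=3)
print(h.shape)                             # (8, 3)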