
Using TimeDistributed with recurrent layer in Keras

I want to run an LSTM over several different sequences in each batch and then join the final outputs. Here is what I have been trying:

from keras.layers import Dense, Input, LSTM, Embedding, TimeDistributed

num_sentences = 4
num_features = 3
num_time_steps = 5

inputs = Input([num_sentences, num_time_steps])
emb_layer = Embedding(10, num_features)
embedded = emb_layer(inputs)
lstm_layer = LSTM(4)

shape = [num_sentences, num_time_steps, num_features]
lstm_outputs = TimeDistributed(lstm_layer, input_shape=shape)(embedded)

This gives me the following error:

Traceback (most recent call last):
  File "test.py", line 12, in <module>
    lstm_outputs = TimeDistributed(lstm_layer, input_shape=shape)(embedded)
  File "/Users/erick/anaconda2/lib/python2.7/site-packages/keras/engine/topology.py", line 546, in __call__
    self.build(input_shapes[0])
  File "/Users/erick/anaconda2/lib/python2.7/site-packages/keras/layers/wrappers.py", line 94, in build
    self.layer.build(child_input_shape)
  File "/Users/erick/anaconda2/lib/python2.7/site-packages/keras/layers/recurrent.py", line 702, in build
    self.input_dim = input_shape[2]
IndexError: tuple index out of range

I tried omitting the input_shape argument in TimeDistributed, but nothing changed.

input_shape needs to be a parameter of the LSTM layer, not of TimeDistributed (which is a wrapper). Omitting it entirely, everything works fine for me:

from keras.layers import Dense, Input, LSTM, Embedding, TimeDistributed

num_sentences = 4
num_features = 3
num_time_steps = 5

inputs = Input([num_sentences, num_time_steps])
emb_layer = Embedding(10, num_features)
embedded = emb_layer(inputs)
lstm_layer = LSTM(4)

shape = [num_sentences, num_time_steps, num_features]
lstm_outputs = TimeDistributed(lstm_layer)(embedded)


#OUTPUT:
Using TensorFlow backend.
[Finished in 1.5s]
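To round this out, here is a minimal sketch of the complete model, assuming Keras >= 2 via the tf.keras API. Since the question also wants to "join the final outputs" of the per-sentence LSTMs, this sketch uses GlobalAveragePooling1D as one possible merge — the question doesn't specify which join, so that choice is an assumption:

```python
# Sketch: TimeDistributed(LSTM) over per-sentence sequences, then a pooled
# merge of the final LSTM outputs. Assumes tf.keras (Keras >= 2).
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import (Input, Embedding, LSTM,
                                     TimeDistributed, GlobalAveragePooling1D)
from tensorflow.keras.models import Model

num_sentences = 4
num_features = 3
num_time_steps = 5

inputs = Input([num_sentences, num_time_steps])          # (batch, 4, 5) token ids
embedded = Embedding(10, num_features)(inputs)           # (batch, 4, 5, 3)
lstm_outputs = TimeDistributed(LSTM(4))(embedded)        # (batch, 4, 4): one vector per sentence
joined = GlobalAveragePooling1D()(lstm_outputs)          # (batch, 4): average over sentences

model = Model(inputs, joined)
x = np.random.randint(0, 10, size=(2, num_sentences, num_time_steps))
out = model.predict(x, verbose=0)
print(out.shape)  # (2, 4)
```

TimeDistributed applies the same LSTM (shared weights) to each of the 4 sentences independently, so each sentence contributes its final hidden state before the merge.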

After trying michetonu's answer and hitting the same error, I realized my Keras version might be outdated. Indeed, I was running Keras 1.2; the code runs fine on 2.0.