Skip connection in a neural network for one feature

I have 1000 objects, each with 100 timestamps and 5 features. One of the features is very important, so I don't want to pass it through the LSTM; instead I want to feed it straight into the last layer. How can I do this? Do I need multiple input layers in the network?

I think either of these will work for you:

import tensorflow as tf

# dummy data
inp1 = tf.random.uniform(shape=(1000, 100, 5))

# ALTERNATIVE 1 (use lambda layer to split input)
inputs = tf.keras.layers.Input((100, 5), name='inputs')

# assuming that the important feature is at index -1
input_lstm = tf.keras.layers.Lambda(lambda x: x[:, :, :4])(inputs)
input_dense = tf.keras.layers.Lambda(lambda x: x[:, :, -1])(inputs)
x = tf.keras.layers.LSTM(
    units=64, 
    recurrent_initializer='ones', 
    kernel_initializer='ones')(input_lstm)
x = tf.keras.layers.Concatenate()([x, input_dense])
out = tf.keras.layers.Dense(units=1, kernel_initializer='ones')(x)

model = tf.keras.Model(inputs=inputs, outputs=out)

# print(model.summary())
out = model(inp1)
print(out[:5])

# ALTERNATIVE 2 (split data before neural net)
# assuming that the important feature is at index -1
inp2 = inp1[:, :, -1]
inp1 = inp1[:, :, :4]

input_lstm = tf.keras.layers.Input((100, 4), name='lstm_input')
input_dense = tf.keras.layers.Input((100,), name='dense_input')
x = tf.keras.layers.LSTM(
    units=64, 
    recurrent_initializer='ones', 
    kernel_initializer='ones')(input_lstm)
x = tf.keras.layers.Concatenate()([x, input_dense])
out = tf.keras.layers.Dense(units=1, kernel_initializer='ones')(x)

model = tf.keras.Model(inputs=[input_lstm, input_dense], outputs=out)

# print(model.summary())
out = model([inp1, inp2])
print(out[:5])

# output:
# tf.Tensor(
# [[118.021736]
#  [117.11683 ]
#  [115.341644]
#  [120.00911 ]
#  [114.4716  ]], shape=(5, 1), dtype=float32)
# tf.Tensor(
# [[118.021736]
#  [117.11683 ]
#  [115.341644]
#  [120.00911 ]
#  [114.4716  ]], shape=(5, 1), dtype=float32)

The layer weights are initialized to ones only to demonstrate that both approaches produce the same output.
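As a side note, in TF 2.x you can also slice a Keras symbolic tensor directly, which avoids the explicit `Lambda` layers from alternative 1. A minimal sketch (assuming, as above, that the important feature is at index -1):

```python
import tensorflow as tf

inputs = tf.keras.layers.Input((100, 5))

# Direct slicing on the symbolic tensor; Keras wraps this in a
# slicing layer automatically, so no Lambda is needed.
input_lstm = inputs[:, :, :4]    # the 4 ordinary features go through the LSTM
input_dense = inputs[:, :, -1]   # the important feature skips straight to the end

x = tf.keras.layers.LSTM(units=64)(input_lstm)        # -> (batch, 64)
x = tf.keras.layers.Concatenate()([x, input_dense])   # -> (batch, 64 + 100)
out = tf.keras.layers.Dense(units=1)(x)               # -> (batch, 1)

model = tf.keras.Model(inputs=inputs, outputs=out)
print(model(tf.random.uniform((8, 100, 5))).shape)    # (8, 1)
```

This is functionally the same skip connection; which variant to use is mostly a matter of taste and TF version.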