Expected min_ndim=2, found ndim=1. Full shape received: (None,)
In my model, I have a normalization layer for a single-column feature array. I assumed this would give a 1-ndim output:
single_feature_model = keras.models.Sequential([
    single_feature_normalizer,
    layers.Dense(1)
])
The normalization steps:
single_feature_normalizer = preprocessing.Normalization(axis=None)
single_feature_normalizer.adapt(single_feature)
The error I get is:
ValueError Traceback (most recent call last)
<ipython-input-98-22191285d676> in <module>()
2 single_feature_model = keras.models.Sequential([
3 single_feature_normalizer,
----> 4 layers.Dense(1) # Linear Model
5 ])
/usr/local/lib/python3.7/dist-packages/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
225 ndim = x.shape.rank
226 if ndim is not None and ndim < spec.min_ndim:
--> 227 raise ValueError(f'Input {input_index} of layer "{layer_name}" '
228 'is incompatible with the layer: '
229 f'expected min_ndim={spec.min_ndim}, '
ValueError: Input 0 of layer "dense_27" is incompatible with the layer: expected min_ndim=2, found ndim=1. Full shape received: (None,)
It seems the Dense layer expects a 2-ndim array, while the normalization layer outputs a 1-ndim array.
Is there a way to fix this and get the model working?
I think you need to define the input layer explicitly with an input shape, because your output layer cannot infer the shape of the tensor coming from the normalization layer:
import tensorflow as tf

# axis=None: compute a single mean/variance over all values
single_feature_normalizer = tf.keras.layers.Normalization(axis=None)
feature = tf.random.normal((314, 1))
single_feature_normalizer.adapt(feature)

single_feature_model = tf.keras.models.Sequential([
    tf.keras.layers.Input(shape=(1,)),  # explicit input shape: one feature per sample
    single_feature_normalizer,
    tf.keras.layers.Dense(1)            # linear output layer
])
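With the explicit Input layer in place, the model builds with a (None, 1) input. As a quick sanity check (just a sketch reusing the objects defined above), you can call the model on a small random batch and print a summary:

# Quick check: a batch of 5 single-feature rows maps to 5 outputs
print(single_feature_model(tf.random.normal((5, 1))).shape)  # (5, 1)
single_feature_model.summary()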
Alternatively, skip the Input layer and define the input shape directly on the normalization layer:
single_feature_normalizer = tf.keras.layers.Normalization(input_shape=[1,], axis=None)
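Here is a minimal sketch of that second option, assuming the same random feature tensor from the snippet above as stand-in data:

import tensorflow as tf

# input_shape on the first layer replaces the separate Input layer
single_feature_normalizer = tf.keras.layers.Normalization(input_shape=[1,], axis=None)
feature = tf.random.normal((314, 1))
single_feature_normalizer.adapt(feature)

single_feature_model = tf.keras.models.Sequential([
    single_feature_normalizer,
    tf.keras.layers.Dense(1)
])
single_feature_model.compile(optimizer='adam', loss='mse')  # ready to train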