How to pass a Pandas column as a batch of sentences in a 1-D tensor of strings?

I am trying to pass a Pandas column (or numpy array) of shape (2946, 1) to the text-embedding input layer of a Keras model using TensorFlow 2. The Pandas DataFrame has just one text column, with 2946 distinct observations.

According to the TensorFlow Hub module documentation for this pre-trained word embedding, the module:

The module takes a batch of sentences in a 1-D tensor of strings as input.

The network and its input layer are defined as follows:

import tensorflow_hub as hub
import tensorflow as tf
from tensorflow import keras


hub_layer = hub.KerasLayer("https://tfhub.dev/google/Wiki-words-500-with-normalization/2",
                           input_shape=[], dtype=tf.string)

model = keras.Sequential()
model.add(hub_layer)
model.add(keras.layers.Dense(16, activation='relu'))
model.add(keras.layers.Dense(1, activation='sigmoid'))


model.compile(optimizer='Adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])

model.fit(X_train.values, y_train, epochs=10, validation_split=0.20)

I get this error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-36-cf8b37d02f89> in <module>()
      1 model.fit(X_train.values, y_train,
      2              epochs=10,
----> 3              validation_split=0.20)

8 frames
/tensorflow-2.1.0/python3.6/tensorflow_core/python/keras/engine/training_utils.py in standardize_input_data(data, names, shapes, check_batch_axis, exception_prefix)
    571                            ': expected ' + names[i] + ' to have ' +
    572                            str(len(shape)) + ' dimensions, but got array '
--> 573                            'with shape ' + str(data_shape))
    574         if not check_batch_axis:
    575           data_shape = data_shape[1:]

ValueError: Error when checking input: expected keras_layer_input to have 1 dimensions, but got array with shape (2946, 1)

How can I pass the pandas column or numpy array as a batch of sentences in a 1-D tensor of strings, as the input layer expects?

Try flattening your X_train.values from shape (2946, 1) to (2946,). If X_train.values is an np.array, you can use X_train.values.ravel() or several other choices. If not, just convert it to numpy first.
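A minimal sketch of the fix, using a small hypothetical array in place of the real X_train.values:

```python
import numpy as np

# Hypothetical stand-in for X_train.values: a (3, 1) column of strings,
# analogous to the (2946, 1) array in the question.
X = np.array([["first sentence"], ["second sentence"], ["third sentence"]],
             dtype=object)
print(X.shape)        # (3, 1) — two dimensions, which the hub layer rejects

# ravel() collapses the array to one dimension, matching the
# "batch of sentences in a 1-D tensor of strings" the layer expects.
X_flat = X.ravel()
print(X_flat.shape)   # (3,)
```

You would then call `model.fit(X_train.values.ravel(), y_train, ...)` instead of passing the 2-D column directly.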