Problem trying to export my TensorFlow model to TFLite

After training my model on Google Colab following this tutorial, when I try to run model.export('image_classifier.tflite', 'image_labels.txt') it shows:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-37-4d5419f8b12d> in <module>()
----> 1 model.export('image_classifier.tflite', 'image_labels.txt')

1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_examples/lite/model_customization/core/task/image_classifier.py in export(self, tflite_filename, label_filename, **kwargs)
    185       else:
    186         quantized = False
--> 187       self._export_tflite(tflite_filename, label_filename, quantized)
    188     else:
    189       raise ValueError('Model Export Format %s is not supported currently.' %

/usr/local/lib/python3.6/dist-packages/tensorflow_examples/lite/model_customization/core/task/classification_model.py in _export_tflite(self, tflite_filename, label_filename, quantized)
    130       quantized: boolean, if True, save quantized model.
    131     """
--> 132     converter = tf.lite.TFLiteConverter.from_keras_model(self.model)
    133     if quantized:
    134       converter.optimizations = [tf.lite.Optimize.OPTIMIZE_FOR_SIZE]

AttributeError: type object 'TFLiteConverter' has no attribute 'from_keras_model'

The problem was solved after updating the TensorFlow version. I was using a 1.x release, and TensorFlow changed a lot in its newer versions. If you are using Google Colab, the default runtime is still 1.x (it will switch to 2.x soon), so try building your model in a different way.
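
For reference, here is a minimal sketch of what the conversion looks like once the runtime is on TensorFlow 2.x, which is where tf.lite.TFLiteConverter.from_keras_model was introduced. The %tensorflow_version magic is a Colab feature; the small Keras model and the file name are just placeholders standing in for the one built in the tutorial.

# In Colab, switch the runtime to TensorFlow 2.x before importing tensorflow
# (uncomment the magic and restart the runtime if 1.x was already loaded):
# %tensorflow_version 2.x

import tensorflow as tf
print(tf.__version__)  # should print 2.x; from_keras_model does not exist in 1.x

# Placeholder Keras model standing in for the tutorial's image classifier
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(224, 224, 3)),
    tf.keras.layers.Dense(5, activation='softmax'),
])

# TF 2.x API: convert an in-memory Keras model directly
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional quantization/size optimization
tflite_model = converter.convert()

with open('image_classifier.tflite', 'wb') as f:
    f.write(tflite_model)

In TF 1.x the converter only offered methods such as from_keras_model_file, which is why the traceback above ends in an AttributeError.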