Hugging Face H5 load model error : No model found in config file
I am trying to load a model from Hugging Face. I downloaded the h5 model from here: https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/tree/main
from flask import Flask, jsonify, request  # import objects from the Flask module
from keras.models import load_model
from transformers import AutoTokenizer, AutoModelForSequenceClassification, TextClassificationPipeline
model = load_model('./tf_model.h5') # trying to load model here
I get the following error:
File "C:\D\Learning\Flask\flask-pp-rest\main.py", line 11, in <module>
model = load_model('./tf_model.h5') File "C:\Users\ndrez\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\saving\save.py",
line 200, in load_model
return hdf5_format.load_model_from_hdf5(filepath, custom_objects, File
"C:\Users\ndrez\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\saving\hdf5_format.py",
line 176, in load_model_from_hdf5
raise ValueError('No model found in config file.') ValueError: **No model found in config file.**
Does anyone know how to fix this? I will be following this question and will try to implement any suggested solution.
The tf_model.h5 file on the Hub stores only the model weights, not a full Keras model definition, so keras.models.load_model cannot reconstruct the model from it; load it through transformers instead. To load the model you specified, here is the code:
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
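Once they are loaded, a quick sanity check could look like the following (an illustrative snippet, assuming PyTorch is installed; the example sentence is arbitrary):
import torch

# Tokenize an example sentence and run it through the model
inputs = tokenizer("I really enjoyed this movie!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit to its label (NEGATIVE or POSITIVE)
predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])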
To load the TensorFlow version of distilbert-base-uncased-finetuned-sst-2-english, you can use the TFAutoModelForSequenceClassification class:
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
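Since you already downloaded the files locally, you can also point from_pretrained at the folder that contains config.json, tf_model.h5 and the tokenizer files, and then wrap everything in the TextClassificationPipeline you imported. This is only a sketch; replace "./" with the directory where you actually saved the files:
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification, TextClassificationPipeline

# Load from a local directory containing config.json, tf_model.h5, vocab.txt, ...
tokenizer = AutoTokenizer.from_pretrained("./")
model = TFAutoModelForSequenceClassification.from_pretrained("./")

# Build a TensorFlow pipeline and run a quick prediction
pipe = TextClassificationPipeline(model=model, tokenizer=tokenizer, framework="tf")
print(pipe("I really enjoyed this movie!"))  # e.g. [{'label': 'POSITIVE', 'score': ...}]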