How to load BertForSequenceClassification model weights into a BertForTokenClassification model?

Initially, I had a BERT base model fine-tuned on a text classification dataset, for which I used the BertForSequenceClassification class.

from transformers import BertForSequenceClassification, AdamW, BertConfig

# Load BertForSequenceClassification, the pretrained BERT model with a single 
# linear classification layer on top. 
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", # Use the 12-layer BERT model, with an uncased vocab.
    num_labels = 2, # The number of output labels--2 for binary classification.
                    # You can increase this for multi-class tasks.   
    output_attentions = False, # Whether the model returns attention weights.
    output_hidden_states = False, # Whether the model returns all hidden-states.
)

Now I want to use the weights of this fine-tuned BERT model for named entity recognition, for which I have to use the BertForTokenClassification class. I can't figure out how to load the fine-tuned BERT model's weights into a new model created with BertForTokenClassification.

Thanks in advance.

You can take the weights of the bert submodule from the first model and load them into the bert submodule of the second model:

# config must match the fine-tuned model's architecture, e.g.
# config = model.config, or BertConfig.from_pretrained("bert-base-uncased")
new_model = BertForTokenClassification(config=config)
new_model.bert.load_state_dict(model.bert.state_dict())

This worked for me:

new_model = BertForTokenClassification.from_pretrained('/config path')
new_model.bert.load_state_dict(model.bert.state_dict())
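The second approach works because from_pretrained can load a saved checkpoint whose head does not match the target class: the shared encoder weights are loaded, the mismatched sequence-classification head is discarded (with a warning), and the token-classification head is freshly initialized. A hedged round-trip sketch, again with a tiny made-up config so nothing is downloaded:

```python
import tempfile

import torch
from transformers import (
    BertConfig,
    BertForSequenceClassification,
    BertForTokenClassification,
)

# Tiny hypothetical config for illustration only.
config = BertConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=2,
)
seq_model = BertForSequenceClassification(config)  # stands in for the fine-tuned model

with tempfile.TemporaryDirectory() as tmp:
    # Save the sequence-classification checkpoint, then load it
    # directly into the token-classification class.
    seq_model.save_pretrained(tmp)
    tok_model = BertForTokenClassification.from_pretrained(tmp)

    # The shared encoder weights were carried over.
    same = torch.equal(
        seq_model.bert.embeddings.word_embeddings.weight,
        tok_model.bert.embeddings.word_embeddings.weight,
    )
print(same)  # True
```

The warning transformers prints about unused and newly initialized weights is expected here: it refers to the two classification heads, not the encoder.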