1. Preparation
Install transformers and download the following files:
vocab file: https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json
merges file: https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt
bart-large-cnn config file: https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/config.json
bart-large-cnn model weights: http://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/pytorch_model.bin
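The download step above can be sketched as a small shell script. This is an assumption-laden sketch: it assumes curl is installed and that the legacy S3 URLs listed above are still served (newer transformers versions can fetch the checkpoint automatically from the Hub instead). The vocab and merges files are renamed to vocab.json and merges.txt, the names from_pretrained looks for inside a local directory.

```shell
# Gather all four files into one local directory (used as BART_PATH below).
mkdir -p bart-large-cnn

# -f fails on HTTP errors, -L follows redirects; the || echo keeps the
# script going with a warning if a legacy URL has gone stale.
curl -fL -o bart-large-cnn/vocab.json \
  https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json \
  || echo "warning: vocab download failed"
curl -fL -o bart-large-cnn/merges.txt \
  https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt \
  || echo "warning: merges download failed"
curl -fL -o bart-large-cnn/config.json \
  https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/config.json \
  || echo "warning: config download failed"
curl -fL -o bart-large-cnn/pytorch_model.bin \
  http://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/pytorch_model.bin \
  || echo "warning: model download failed"
```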
2. Loading the model
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

BART_PATH = 'F:/Corpus/bart-large-cnn'  # local directory holding the files downloaded above
bart_model = BartForConditionalGeneration.from_pretrained(BART_PATH)
bart_tokenizer = BartTokenizer.from_pretrained(BART_PATH)
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
bart_model.to(device)