Building a Summarization Service with BART

1. Preparation

Install transformers and download the following files (a download script sketch follows the list):

vocab file: https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json
merges file: https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt
bart-large-cnn config file: https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/config.json
bart-large-cnn model weights: http://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/pytorch_model.bin
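
If you prefer to script the download, here is a minimal sketch using only the Python standard library. The local directory path and the target file names (vocab.json, merges.txt, config.json, pytorch_model.bin) are assumptions based on what from_pretrained expects in a local checkpoint directory, not something stated in the original post.

import os
import urllib.request

# Hypothetical local checkpoint directory; adjust to your environment.
BART_PATH = 'F:/Corpus/bart-large-cnn'
os.makedirs(BART_PATH, exist_ok=True)

# Map each download URL to the local file name expected by from_pretrained.
FILES = {
    'https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json': 'vocab.json',
    'https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt': 'merges.txt',
    'https://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/config.json': 'config.json',
    'http://s3.amazonaws.com/models.huggingface.co/bert/facebook/bart-large-cnn/pytorch_model.bin': 'pytorch_model.bin',
}

for url, filename in FILES.items():
    urllib.request.urlretrieve(url, os.path.join(BART_PATH, filename))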

2. Load the model

import torch
from transformers import BartTokenizer, BartForConditionalGeneration

# Local directory containing the downloaded vocab, merges, config, and weight files
BART_PATH = 'F:/Corpus/bart-large-cnn'

# Load the fine-tuned bart-large-cnn model and its tokenizer
bart_model = BartForConditionalGeneration.from_pretrained(BART_PATH, output_past=True)
bart_tokenizer = BartTokenizer.from_pretrained(BART_PATH)

# Run on GPU if available, otherwise fall back to CPU
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

bart_model.to(device)
bart_model.eval()
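
With the model and tokenizer loaded, a summary can be generated roughly as below. This is a minimal sketch: the summarize helper and its generation parameters (num_beams, max_length, min_length) are illustrative assumptions, not values from the original post.

def summarize(text, max_length=142, min_length=56, num_beams=4):
    # Encode the article, truncating to BART's 1024-token input limit
    input_ids = bart_tokenizer.encode(
        text, return_tensors='pt', max_length=1024, truncation=True
    ).to(device)
    # Beam-search decoding with the bart-large-cnn summarization model
    with torch.no_grad():
        summary_ids = bart_model.generate(
            input_ids,
            num_beams=num_beams,
            max_length=max_length,
            min_length=min_length,
            early_stopping=True,
        )
    return bart_tokenizer.decode(summary_ids[0], skip_special_tokens=True)

print(summarize("Your long article text goes here ..."))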