- Problem: unexpected key "module.encoder.embedding.weight" in state_dict
- Cause: the model was saved while wrapped in nn.DataParallel, which prefixes every parameter name with "module.", but it is being loaded into a model that is not wrapped in nn.DataParallel.
- Solutions:
方法1: wrap the model in nn.DataParallel before calling load_state_dict, so the key names match.
方法2: build a new OrderedDict whose keys have the "module." prefix stripped, then load that into the plain model.
Example:
from collections import OrderedDict
import torch

# original file was saved from a model wrapped in DataParallel
state_dict = torch.load('myfile.pth')
# create a new OrderedDict that does not contain the 'module.' prefix
new_state_dict = OrderedDict()
for k, v in state_dict.items():
    name = k[7:]  # remove 'module.' (len('module.') == 7)
    new_state_dict[name] = v
# load params into the unwrapped model
model.load_state_dict(new_state_dict)
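方法1 can be sketched end to end as below. The tiny Net class is a hypothetical stand-in for the real model; the point is that wrapping the fresh model in nn.DataParallel reproduces the "module." prefix, so the saved keys match without any renaming.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model standing in for the real one.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

# Simulate a checkpoint saved from a DataParallel-wrapped model:
# every key carries the 'module.' prefix added by the wrapper.
saved = nn.DataParallel(Net()).state_dict()

# Method 1: wrap the fresh model in nn.DataParallel before loading,
# so its state_dict keys also start with 'module.' and match directly.
model = nn.DataParallel(Net())
model.load_state_dict(saved)
```

Note that nn.DataParallel can be constructed even on a CPU-only machine, so this also works when loading a multi-GPU checkpoint for CPU inspection.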