def adjust_learning_rate(optimizer, epoch):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs."""
    # args.lr is the initial learning rate, assumed to come from argparse
    lr = args.lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
At the same time, PyTorch itself provides built-in learning-rate decay schedulers: https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
https://github.com/ncullen93/torchsample offers ready-made wrappers and other callback features, such as early stopping.
Using the official PyTorch API is probably the best choice, since it is actively maintained.
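As a sketch of the built-in alternative: torch.optim.lr_scheduler.StepLR with step_size=30 and gamma=0.1 reproduces the manual function above (decay by 10x every 30 epochs). The tiny Linear model and SGD optimizer here are just placeholders for illustration.

```python
import torch

# Placeholder model/optimizer purely to demonstrate the scheduler
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# StepLR multiplies the LR by gamma every step_size epochs,
# matching the manual "decay by 10 every 30 epochs" function above
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

lrs = []
for epoch in range(90):
    lrs.append(optimizer.param_groups[0]['lr'])  # LR used during this epoch
    # train(...) / validate(...) would go here
    optimizer.step()    # step the optimizer first (required since PyTorch 1.1)
    scheduler.step()    # then advance the schedule
```

After this loop, the recorded learning rates drop from 0.1 to 0.01 at epoch 30 and to 0.001 at epoch 60, with no hand-written decay logic to maintain.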