PyTorch Finetune and MXNet Finetune
- MXNet Finetune:
Every layer has an lr_mult attribute, a learning-rate multiplier: setting it to a value other than 1 scales that layer's effective learning rate up or down (a minimal sketch follows the reference code below).
Reference code:
_weight = mx.symbol.Variable("fc7_weight", shape=(args.num_classes, args.emb_size), lr_mult=1.0)
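Building on that line, here is a minimal sketch of how lr_mult typically enters a symbol-API finetune. num_classes and emb_size are placeholder values, and the FullyConnected head is illustrative rather than taken from the original article:

import mxnet as mx

num_classes, emb_size = 10, 512  # placeholder sizes for illustration

data = mx.symbol.Variable("data")
# lr_mult=10.0 trains the new head 10x faster than the pretrained backbone;
# lr_mult=0.0 would freeze this weight entirely.
fc7_weight = mx.symbol.Variable("fc7_weight",
                                shape=(num_classes, emb_size),
                                lr_mult=10.0)
fc7 = mx.symbol.FullyConnected(data=data, weight=fc7_weight,
                               num_hidden=num_classes,
                               no_bias=True, name="fc7")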
- PyTorch Finetune:
Specify which layers stay frozen:
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler
from torchvision import models

model_ft = models.resnet50(pretrained=True)  # auto-downloads the official pretrained weights
# Freeze all parameter layers of the pretrained network
for param in model_ft.parameters():
    param.requires_grad = False
# Print the fully connected layer's info
print(model_ft.fc)
num_fc_ftr = model_ft.fc.in_features  # input feature count of the fc layer
# Replace fc (num_classes = your dataset's class count); the new layer's params require grad
model_ft.fc = nn.Linear(num_fc_ftr, num_classes)

criterion = nn.CrossEntropyLoss()
# Optimize only the parameters that still require gradients (the new fc layer)
optimizer_ft = optim.SGD(filter(lambda p: p.requires_grad, model_ft.parameters()), lr=0.001, momentum=0.9)
# Decay LR by a factor of 0.1 every 7 epochs
exp_lr_scheduler = lr_scheduler.StepLR(optimizer_ft, step_size=7, gamma=0.1)
# train_model: the usual training loop (not defined in the original article)
model_ft = train_model(model_ft, criterion, optimizer_ft, exp_lr_scheduler, num_epochs=25)
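As an aside, the closest PyTorch analogue of MXNet's lr_mult is per-parameter-group learning rates: instead of freezing layers outright, each group gets its own lr. A minimal sketch, assuming the model_ft above with layer4 and fc left trainable (layer names are torchvision's ResNet-50):

import torch.optim as optim

optimizer_ft = optim.SGD(
    [
        {"params": model_ft.layer4.parameters(), "lr": 1e-4},  # pretrained block: small lr
        {"params": model_ft.fc.parameters(), "lr": 1e-2},      # new head: larger lr
    ],
    lr=1e-3,       # default lr for any group that doesn't set its own
    momentum=0.9,
)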
Reference: https://blog.csdn.net/qq_34914551/article/details/87699317