Submitted by tsgiannis t3_10f5lnc in deeplearning
XecutionStyle t1_j4uu8r4 wrote
Are you fixing the weights of the earlier layers?
tsgiannis OP t1_j4uukd9 wrote
What exactly do you mean by "fixing weights"?
The pretrained model carries the weights from ImageNet, and that's all; if I unfreeze some layers it gains some more accuracy.
But the "from scratch" model starts empty.
XecutionStyle t1_j4v0dui wrote
When you replace the top layer and train the model, are the previous layers allowed to change?
tsgiannis OP t1_j4v447x wrote
No changes to the pretrained model besides removing the top layer.
I am aware that unfreezing can lead to either better or worse results.
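Continuing the hypothetical Keras sketch above, partial unfreezing (fine-tuning) is typically done by thawing only the last few layers of the base and recompiling with a small learning rate; the "-20" cutoff and the 1e-5 learning rate below are arbitrary illustrative choices, not values from the thread:

```python
# Sketch of partial unfreezing; builds on `base` and `pretrained_model` above.
base.trainable = True
for layer in base.layers[:-20]:   # keep all but the last ~20 layers frozen (arbitrary cutoff)
    layer.trainable = False

# A low learning rate reduces the risk of wrecking the pretrained features,
# which is one reason unfreezing can help or hurt.
pretrained_model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```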