Submitted by XecutionStyle t3_10fslf2 in deeplearning
Comments
XecutionStyle OP t1_j4yyvig wrote
Yes. A "better" method makes less sense in this context, it seems.
andsmi97 t1_j4zaiyp wrote
Since you haven't said anything about the data or the problem, the answer is no.
onkus t1_j4zmty9 wrote
This is not a valid comparison.
onkus t1_j4zmvdh wrote
What do you mean by a better method?
BrotherAmazing t1_j52hucj wrote
If OP asked this question in a court of law, the attorney would immediately yell “OBJECTION!”; the judge would sustain it and scold OP, but give them a chance to ask a question that doesn’t automatically presuppose that pre-training cannot be “correct”, or that there is always a “better” way than pre-training.
FWIW, I often avoid transfer learning or pre-training when it’s not needed, but I’m sure I could construct a problem that is not pathological, and is of practical importance, where pre-training is “optimal” in some sense of that word.
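To make the point above concrete, here is a minimal toy sketch (not from the thread; all names and numbers are illustrative assumptions) of a setting where pre-training helps: a linear model "pre-trained" with gradient descent on a data-rich source task, then briefly fine-tuned on a small, closely related target task, versus training from scratch with the same small budget.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear(X, y, w_init=None, lr=0.1, steps=500):
    """Plain gradient descent on mean-squared error."""
    w = np.zeros(X.shape[1]) if w_init is None else w_init.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Source task: plentiful data, true weights w_true (hypothetical values).
w_true = np.array([1.0, -2.0, 0.5])
X_src = rng.normal(size=(1000, 3))
y_src = X_src @ w_true + 0.1 * rng.normal(size=1000)

# Target task: only 10 samples, true weights close to the source's.
X_tgt = rng.normal(size=(10, 3))
y_tgt = X_tgt @ (w_true + 0.1) + 0.1 * rng.normal(size=10)

w_pre = fit_linear(X_src, y_src)                          # "pre-train" on source
w_ft = fit_linear(X_tgt, y_tgt, w_init=w_pre, steps=5)    # fine-tune briefly
w_scratch = fit_linear(X_tgt, y_tgt, steps=5)             # same budget, from scratch

# Evaluate both on fresh target-task data.
X_test = rng.normal(size=(500, 3))
y_test = X_test @ (w_true + 0.1)
err_ft = np.mean((X_test @ w_ft - y_test) ** 2)
err_scratch = np.mean((X_test @ w_scratch - y_test) ** 2)
```

With so few target samples and so few optimization steps, starting from the pre-trained weights (already near the target solution) gives a much lower test error than starting from zero, which is the sense in which pre-training can be "optimal" here; change the task relatedness or the data budget and the comparison can easily flip the other way.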
FastestLearner t1_j4yw7kj wrote
What do you mean by “correct method”?