
IntelArtiGen t1_ivg4d74 wrote

I think it's not just "transfer learning" or "image classification"; it's also learning without explicitly using "labels", like contrastive learning / self-supervised learning / reinforcement learning, etc.
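For the label-free part, here's a minimal sketch of an InfoNCE-style contrastive loss (simplified; real methods like SimCLR add projection heads, a symmetrized loss, large batches, etc.):

```python
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Contrastive (InfoNCE-style) loss. z1[i] and z2[i] are embeddings of two
    augmented views of the same input; every other pairing in the batch is a
    negative. No human-provided labels are used anywhere."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.T / temperature                      # pairwise cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)  # matching views on the diagonal
    return F.cross_entropy(logits, targets)
```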

12

billjames1685 OP t1_ivg5bij wrote

Yeah, I agree. Not sure if I'm misunderstanding you, but by "transfer learning" I basically mean that all of our pre-training (which occurred through a variety of methods, as you point out) has allowed us to richly understand images as a whole, so we can apply that understanding and generalize well in semi-new tasks/domains.

−5

IntelArtiGen t1_ivg765c wrote

Ok, that's one way to say it; I also agree. I tend not to use the concept of "transfer learning" for how we learn, because I think it's more appropriate for well-defined tasks, and we are rarely confronted with tasks as well-defined as the ones we give our models.

And transfer learning implies that you have to re-train part of the model on a new task, which isn't exactly how I would define what we do. When I worked on reproducing how we learn words, I instead implemented the solution as a way to put a new label on a representation we were already able to produce from our unsupervised pretraining. I don't know which way is the correct one; I just know that this works, and that you can teach new words/labels to a model without retraining it. A rough sketch of the idea is below.
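For what it's worth, here is a minimal sketch of that "new label on an existing representation" idea. The `encode` function is a stand-in (here just a fixed random projection) for whatever pretrained, frozen encoder you have; the point is that teaching a new word never updates the encoder:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((128, 784))  # placeholder for a pretrained, frozen encoder

def encode(x: np.ndarray) -> np.ndarray:
    """Stand-in for any frozen pretrained encoder; it is never retrained."""
    z = W @ x.reshape(-1)
    return z / np.linalg.norm(z)

# "Teaching a new word" = storing a labeled embedding, no gradient updates.
prototypes: dict[str, np.ndarray] = {}

def teach(label: str, example: np.ndarray) -> None:
    prototypes[label] = encode(example)

def classify(x: np.ndarray) -> str:
    emb = encode(x)
    # Nearest stored prototype by cosine similarity.
    return max(prototypes, key=lambda lbl: float(prototypes[lbl] @ emb))

# Usage: teach one example per new word, then classify without retraining.
cat, dog = rng.standard_normal(784), rng.standard_normal(784)
teach("cat", cat)
teach("dog", dog)
print(classify(cat + 0.1 * rng.standard_normal(784)))  # -> "cat" (usually)
```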

3

billjames1685 OP t1_ivh33jr wrote

That’s a fair point; I was kind of just using it as a general term.

2