Submitted by eddytony96 t3_zx6jm5 in technology
currentscurrents t1_j21fh9o wrote
Reply to comment by Thatweasel in How AI innovation is powered by underpaid workers in foreign countries. by eddytony96
The big thing these days is "self-supervised" learning.
You do the bulk of the training on a simpler task, like predicting missing parts of images or sentences. You don't need labels for this, and it allows the model to learn a lot about the structure of the data. Then you fine-tune the model with a small amount of labeled data for the specific task you want it to do.
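As a toy illustration of the "predict missing parts" idea, here's a minimal, pure-Python sketch of how unlabeled text can be turned into training pairs by masking tokens (the `MASK` token and `make_masked_examples` helper are made-up names for illustration, not any real library's API):

```python
import random

MASK = "[MASK]"

def make_masked_examples(tokens, mask_prob=0.15, rng=None):
    """Turn an unlabeled token sequence into (masked_input, targets).

    Each selected position is replaced with MASK; the original token
    becomes the prediction target. No human labels are needed -- the
    text itself supplies the supervision signal.
    """
    rng = rng or random.Random(0)
    masked = list(tokens)
    targets = {}  # position -> token the model must recover
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked[i] = MASK
            targets[i] = tok
    return masked, targets

sentence = "the cat sat on the mat".split()
masked, targets = make_masked_examples(sentence, mask_prob=0.5)
print(masked)   # some tokens replaced by [MASK]
print(targets)  # positions mapped to the original tokens
```

A real model would see millions of such pairs and learn to fill in the blanks, which forces it to pick up grammar, word meaning, and world knowledge along the way.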
Not only does this require far less labeled data, but it also lets you reuse the model - you don't have to repeat the first, expensive phase of training, just the fine-tuning. You can download pretrained models from huggingface and adapt them to your specific task.
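The reuse argument can be sketched in a few lines of plain Python. This is a deliberately toy stand-in, not a real model: `pretrain_encoder` plays the role of the expensive self-supervised phase (run once), and `fine_tune_head` plays the role of the cheap per-task phase that only trains a small head on frozen features (both function names are hypothetical; with real models you'd instead load a pretrained checkpoint, e.g. via the `transformers` library, and fine-tune it):

```python
def pretrain_encoder(unlabeled_texts):
    """Expensive phase, run ONCE: learn statistics from raw, unlabeled text."""
    vocab = {tok for text in unlabeled_texts for tok in text.split()}
    def encode(text):
        toks = text.split()
        known = sum(1 for t in toks if t in vocab)
        return (len(toks), known)  # frozen feature vector
    return encode

def fine_tune_head(encode, labeled_examples):
    """Cheap phase, run PER TASK: fit a tiny head on the frozen features.

    Here the 'head' is just a length threshold placed midway between the
    two classes' average lengths -- a stand-in for training a classifier.
    """
    by_label = {0: [], 1: []}
    for text, label in labeled_examples:
        by_label[label].append(encode(text)[0])
    cut = (sum(by_label[0]) / len(by_label[0]) +
           sum(by_label[1]) / len(by_label[1])) / 2
    return lambda text: int(encode(text)[0] > cut)

corpus = ["the cat sat", "on the mat", "a very long sentence about cats"]
encode = pretrain_encoder(corpus)  # done once, reused for every task
classify = fine_tune_head(encode, [("hi", 0), ("one two three four five", 1)])
print(classify("short text"))                           # -> 0
print(classify("this is a much longer piece of text"))  # -> 1
```

The point is the asymmetry: `pretrain_encoder` runs once over lots of unlabeled data, while each new task only needs a handful of labeled examples to fit its own head on top.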