[D] Do Transformers need a huge pretraining process? Submitted by minhrongcon2000 on November 30, 2022 at 7:06 AM in MachineLearning · 8 comments
OutrageousSundae8270 wrote on November 30, 2022 at 9:13 AM: Transformers generally do need to be pre-trained on a large corpus to do well on downstream tasks.
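As a rough illustration of the pre-train-then-fine-tune workflow the comment describes, here is a minimal sketch assuming the HuggingFace `transformers` and `datasets` libraries; the checkpoint, dataset, and hyperparameters are illustrative choices, not anything specified in the thread.

```python
# Sketch: reuse a pre-trained Transformer checkpoint and fine-tune it on a
# small downstream task, instead of training the architecture from scratch.
# Model name, dataset, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"  # weights already pre-trained on a large corpus
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# A small downstream dataset (IMDb sentiment); fine-tuning needs far less
# data than the original pre-training corpus.
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=256)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="out",
    num_train_epochs=1,
    per_device_train_batch_size=16,
)

trainer = Trainer(
    model=model,
    args=args,
    # Subsample to keep the fine-tuning run short.
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].shuffle(seed=42).select(range(500)),
)

trainer.train()
print(trainer.evaluate())
```

The point of the sketch is that only the lightweight classification head and a brief fine-tuning run are task-specific; the bulk of the model's capability comes from the pre-trained weights loaded with `from_pretrained`.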