[D] Do you think there is a competitive future for smaller, locally trained/served models? Submitted by naequs (t3_yon48p) on November 7, 2022 at 1:36 PM in MachineLearning · 17 comments · 70 points
pm_me_your_ensembles (t1_ivf2wkb) wrote on November 7, 2022 at 2:42 PM: Network distillation and transfer learning are both reasonable approaches to constructing high-quality "compressed" models. (40 points)
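To make the distillation suggestion concrete, here is a minimal sketch of knowledge distillation in PyTorch: a small "student" is trained to match a larger "teacher"'s temperature-softened outputs alongside the ordinary label loss. The toy model sizes, temperature, and loss weighting are illustrative assumptions, not details from the comment.

```python
# Minimal knowledge-distillation sketch (toy MLPs, T=4.0, alpha=0.5 are
# illustrative choices, not values from the thread).
import torch
import torch.nn as nn
import torch.nn.functional as F

# A larger "teacher" and a much smaller "student"; sizes are arbitrary here.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence to the teacher's temperature-smoothed
    # distribution, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
teacher.eval()

# One illustrative training step on random data standing in for a real batch.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))
with torch.no_grad():
    t_logits = teacher(x)
s_logits = student(x)
loss = distillation_loss(s_logits, t_logits, y)
loss.backward()
optimizer.step()
```

The point of the soft-target term is that the teacher's full output distribution carries more information per example than the one-hot label, which is what lets a much smaller, locally servable model recover most of the teacher's quality.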