fredlafrite OP t1_j3l6b40 wrote on January 9, 2023 at 9:01 AM Reply to comment by madmax_br5 in [D] Have you ever used Knowledge Distillation in practice? by fredlafrite Nice! There are 20MB versions of BERT; super interesting, thank you!
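[Editor's note: since the thread asks about knowledge distillation in practice, a minimal sketch of the standard distillation objective (temperature-softened teacher targets mixed with hard labels, as in Hinton et al.) may help; it assumes PyTorch, and the function name, temperature T, and weight alpha are illustrative defaults, not the recipe behind the 20MB BERT variants mentioned above.]

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Knowledge-distillation loss: KL term on softened logits + hard-label CE.

    T and alpha are illustrative hyperparameters, not values from any
    particular distilled-BERT recipe.
    """
    # Soften both distributions with temperature T; the T^2 factor keeps
    # soft-target gradients on the same scale as the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In training, teacher_logits would come from a frozen, larger model run on the same batch, with gradients flowing only through the student.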
fredlafrite OP t1_j3l65ju wrote on January 9, 2023 at 8:58 AM Reply to comment by nmfisher in [D] Have you ever used Knowledge Distillation in practice? by fredlafrite Interesting! Echoing this, do you know at which kinds of companies one could work on this in an applied setting?
fredlafrite t1_j3hit01 wrote on January 8, 2023 at 4:57 PM Reply to comment by josep-panadero in [D] What is the most complete reference on the history of neural networks? by gbfar Here it's accurate, but for more technical questions ChatGPT very often invents references!
[D] Have you ever used Knowledge Distillation in practice? Submitted by fredlafrite t3_106no9h on January 8, 2023 at 4:43 PM in MachineLearning 13 comments