Submitted by Meddhouib10 t3_ybx4nt in MachineLearning
ethereumturk t1_itj6hx2 wrote
SBert biencoder
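For context, a bi-encoder encodes each text independently into a fixed-size vector and then compares the vectors, typically by cosine similarity. A minimal sketch of that comparison step in pure Python (the toy 4-dimensional vectors below stand in for real SBert embeddings, which a bi-encoder would produce with something like `model.encode(sentence)`):

```python
import math

def cosine_similarity(a, b):
    # Dot product of the two embedding vectors divided by the
    # product of their magnitudes; result ranges from -1 to 1.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for SBert output.
emb_query = [0.1, 0.9, 0.2, 0.4]
emb_doc_a = [0.1, 0.8, 0.3, 0.5]   # semantically close to the query
emb_doc_b = [0.9, 0.1, 0.7, 0.0]   # unrelated

print(cosine_similarity(emb_query, emb_doc_a))  # higher score
print(cosine_similarity(emb_query, emb_doc_b))  # lower score
```

The key point is that each side is encoded once, independently, so document embeddings can be precomputed and compared cheaply at query time.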
fastglow t1_itlyz8a wrote
SBert is designed for comparing sentences/phrases, so there is no reason to use it over a regular transformer encoder-decoder for language modelling. And getting such a model to process 2000 tokens in less than 4 seconds would be challenging without efficiency techniques like quantization, pruning, distillation, etc.
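To illustrate one of those techniques, here is a minimal sketch of symmetric int8 weight quantization in pure Python. The toy weight list and scale formula are illustrative only, not any specific library's implementation; real frameworks (e.g. PyTorch's dynamic quantization) handle this per-layer with calibrated scales:

```python
def quantize_int8(weights):
    # Symmetric quantization: map floats in [-max_abs, max_abs]
    # onto int8 values in [-127, 127] via a single scale factor.
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights from the int8 values.
    return [v * scale for v in q]

# Toy weights standing in for one layer of a transformer.
weights = [0.42, -1.3, 0.07, 0.99, -0.55]
q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)

# The rounding error per weight is bounded by half the scale step.
print(max(abs(w - r) for w, r in zip(weights, recovered)))
```

Storing and computing with 8-bit integers instead of 32-bit floats cuts memory and can substantially speed up matrix multiplies on supported hardware, which is what makes long inputs under tight latency budgets more feasible.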
ethereumturk t1_itq2d8e wrote
True