Under_Over_Thinker t1_jde5e2f wrote

Perplexity going from 20.8 to 20.4: is that a significant improvement? Also, I am not sure whether perplexity is representative enough to evaluate LLMs.
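For scale, perplexity is the exponential of the mean per-token cross-entropy loss, so one way to judge the gap is to convert it back into loss space (this is a generic sketch, not tied to any particular model or benchmark):

```python
import math

# Perplexity = exp(mean cross-entropy loss in nats per token),
# so the loss difference between two perplexity values is the
# difference of their natural logs.
def loss_from_ppl(ppl: float) -> float:
    return math.log(ppl)

delta = loss_from_ppl(20.8) - loss_from_ppl(20.4)
print(f"loss gap: {delta:.4f} nats/token")
```

The gap works out to roughly 0.02 nats per token, which is one way to see why a 20.8-to-20.4 drop looks small in absolute terms.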