Submitted by floppy_llama t3_1266d02 in MachineLearning
Appropriate-Crab-379 t1_jefz9og wrote
Reply to comment by dreaming_geometry in [R] LLaMA-Adapter: Efficient Fine-tuning of Language Models with Zero-init Attention by floppy_llama
There’s a ton of noise; not every technique is worth knowing, because in a few years many of these concepts will be superseded by something new.