Submitted by floppy_llama t3_1266d02 in MachineLearning
Koda_20 t1_jecddbs wrote
Reply to comment by currentscurrents in [R] LLaMA-Adapter: Efficient Fine-tuning of Language Models with Zero-init Attention by floppy_llama
I feel like they're starting with whales because it generates more publicity, because of Nemo, lol.
They probably aren't, but I thought it was funny.