WigglyHypersurface t1_j4f1r8b wrote
Did you forget to change the tokenizer?
m98789 t1_j4f27pu wrote
Tokenizer is also my guess
GasZealousideal8691 OP t1_j4g8djf wrote
No, both use the GPT2 tokenizer. GPT-Neo uses GPT2Tokenizer.from_pretrained('EleutherAI/gpt-neo-1.3B'), and GPT2 uses GPT2Tokenizer.from_pretrained('gpt2-xl').
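Roughly what the loading looks like (a minimal sketch, not my exact script):

    from transformers import GPT2Tokenizer

    # Both checkpoints ship the same GPT-2 BPE vocabulary (50257 tokens)
    neo_tokenizer = GPT2Tokenizer.from_pretrained('EleutherAI/gpt-neo-1.3B')
    gpt2_tokenizer = GPT2Tokenizer.from_pretrained('gpt2-xl')

    # Sanity check that the vocabularies line up
    print(len(neo_tokenizer) == len(gpt2_tokenizer))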
WigglyHypersurface t1_j4gpm5i wrote
What kind of head is on the models for the task?
GasZealousideal8691 OP t1_j4gpu8j wrote
GPT Neo is GPTNeoForCausalLM, and GPT2 is GPT2LMHeadModel. Like I said, I am not 100% familiar with these, but the huggingface docs listed both as “GPT-neo/GPT2 with an LM head”, so I figured they were analogous.
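For reference, roughly how the two models are set up (a minimal sketch, not the full training code):

    from transformers import GPTNeoForCausalLM, GPT2LMHeadModel

    # The two model classes being compared; both add an LM head on top of the transformer
    neo_model = GPTNeoForCausalLM.from_pretrained('EleutherAI/gpt-neo-1.3B')
    gpt2_model = GPT2LMHeadModel.from_pretrained('gpt2-xl')

    # Both expose the same causal-LM interface: pass labels, get loss + logits
    # outputs = model(input_ids=batch['input_ids'], labels=batch['input_ids'])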
WigglyHypersurface t1_j4grftr wrote
I think those are the same, but try loading both as the causal version and see.
GasZealousideal8691 OP t1_j4gst0f wrote
Don't think there is a causal version for GPT2.
WigglyHypersurface t1_j4gzweu wrote
The GPT2 LM is causal. If you do AutoModelForCausalLM with gpt2 it works fine.
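Quick sketch of what I mean (standard transformers API):

    from transformers import AutoModelForCausalLM

    # AutoModelForCausalLM resolves to the LM-head class for each checkpoint,
    # so both end up with the same kind of causal LM head
    gpt2_model = AutoModelForCausalLM.from_pretrained('gpt2-xl')                 # -> GPT2LMHeadModel
    neo_model = AutoModelForCausalLM.from_pretrained('EleutherAI/gpt-neo-1.3B')  # -> GPTNeoForCausalLM
    print(type(gpt2_model).__name__, type(neo_model).__name__)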