GitGudOrGetGot t1_j3wmiz8 wrote
Can anyone explain to me the mechanism by which investing $$$ allows Microsoft to gain some exclusive access to GPT which other firms don't get?
GitGudOrGetGot t1_j1o9wft wrote
Reply to comment by curiousshortguy in [D] Are reviewer blacklists actually implemented at ML conferences? by XalosXandrez
Plenty of colors are associated with negative things, including white, red, yellow, green, blue
GitGudOrGetGot t1_j1o51xf wrote
Reply to comment by JocialSusticeWarrior in [D] Are reviewer blacklists actually implemented at ML conferences? by XalosXandrez
Why
GitGudOrGetGot t1_iyc8yf7 wrote
Reply to comment by knockatize in Stephen Curry of sanitation by jonnycash11
NBA '94
Never forget
GitGudOrGetGot t1_ix3s761 wrote
Reply to comment by skelly0311 in [D] BERT related questions by Devinco001
>First the Bert model generates word embeddings by tokenizing strings into a pre trained word vector, then you run those embeddings through a transformer for some type of inference
Could you describe this a bit further in terms of inputs and outputs?
I think I get that you go from a string to a list of individual tokens, but when you say you then feed that into a pre-trained word vector, does that mean you output a list of floating point values representing the document as a single point in high-dimensional space?
I thought that was specifically what the transformer does, so I'm not sure what other role it performs here...
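To make my mental model concrete, here's a toy sketch of the tokenize → embed → transform pipeline as I currently understand it (all names, vocab, and dimensions are made up for illustration; real BERT uses WordPiece subword tokenization and a learned 768-dim embedding table):

```python
import numpy as np

# Toy vocabulary and a stand-in for a pretrained embedding table.
vocab = {"[CLS]": 0, "the": 1, "cat": 2, "sat": 3, "[SEP]": 4}
embed_dim = 4
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), embed_dim))

def tokenize(text):
    # string -> list of token ids (real BERT splits into subwords)
    return [vocab["[CLS]"]] + [vocab[w] for w in text.split()] + [vocab["[SEP]"]]

def embed(token_ids):
    # token ids -> (seq_len, embed_dim) matrix: one vector PER TOKEN,
    # not a single point for the whole document
    return embedding_table[token_ids]

ids = tokenize("the cat sat")
X = embed(ids)
print(X.shape)  # one row per token; the transformer layers would then mix
                # these rows via self-attention into contextual vectors
```

So is the distinction that the embedding lookup gives one static vector per token, and the transformer's job is to turn those into context-dependent vectors?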
GitGudOrGetGot t1_j7phuiz wrote
Reply to comment by PK_thundr in [N] Microsoft announces new "next-generation" LLM, will be integrated with Bing and Edge by currentscurrents
Boobies