Submitted by BrownSimpKid t3_1112zxw in singularity
Frumpagumpus t1_j8dhip1 wrote
Reply to comment by helpskinissues in Bing Chat sending love messages and acting weird out of nowhere by BrownSimpKid
Usually wrong and mostly right, lol. Better than a human.

I literally just explained to you that you COULD give it short-term memory by prepending context to your messages. IT IS TRIVIAL. If I were talking to GPT-3, it would not be this dense.
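The prepend-context trick is easy to sketch. Here's a minimal toy version, where `StatelessModel` and `ChatSession` are invented names standing in for a real LLM API:

```python
class StatelessModel:
    """Stand-in for an LLM completion endpoint (hypothetical)."""
    def complete(self, prompt: str) -> str:
        # A real model would generate text; this stub just echoes context size.
        return f"[reply to {len(prompt)} chars of context]"

class ChatSession:
    """Fakes short-term memory by replaying prior turns in front of each prompt."""
    def __init__(self, model, max_turns: int = 10):
        self.model = model
        self.history = []           # (speaker, text) pairs
        self.max_turns = max_turns  # crude context-window budget

    def send(self, user_message: str) -> str:
        self.history.append(("User", user_message))
        # Replay only the most recent turns so the prompt stays bounded.
        recent = self.history[-self.max_turns:]
        prompt = "\n".join(f"{who}: {text}" for who, text in recent)
        prompt += "\nAssistant:"
        reply = self.model.complete(prompt)
        self.history.append(("Assistant", reply))
        return reply
```

The model itself stays stateless; all the "memory" lives client-side in `history`.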
Humans take time to pause and compose their responses. GPT-3 is afforded no such grace, but it still does a great job anyway, because it is just that smart.

Yesterday I gave it two lines of SQL DDL and asked it to create a view denormalizing all columns except the primary key into a nested JSON object. It did it in 0.5 seconds; I had to change one word in a 200-line SQL query to get it to work right.

Yeah, that saved me some time. It does not matter that it was slightly wrong. If that is a stochastic parrot, then humans must be mostly stochastic sloths, barely even capable of parroting responses.
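The original DDL isn't shown in the thread, so here's a toy reconstruction of that kind of view, using an invented schema and SQLite's built-in `json_object` to fold the non-key columns into a JSON payload:

```python
import sqlite3

# Invented example table; the actual DDL from the anecdote isn't available.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer TEXT,
        total    REAL,
        placed   TEXT
    );
    INSERT INTO orders VALUES (1, 'alice', 9.99, '2023-02-14');

    -- Keep the primary key as a column; denormalize everything else
    -- into a single nested JSON object.
    CREATE VIEW orders_json AS
    SELECT order_id,
           json_object(
               'customer', customer,
               'total',    total,
               'placed',   placed
           ) AS payload
    FROM orders;
""")
row = conn.execute("SELECT order_id, payload FROM orders_json").fetchone()
print(row)
```

This requires an SQLite build with the JSON1 functions, which Python's bundled `sqlite3` normally includes.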
helpskinissues t1_j8dhsz4 wrote
Nonsense, sorry. Ants do not need prepending context.
"mostly right" no, it's actually mostly wrong. The heck are you saying? Try to play chess with ChatGPT, most of the times it'll make things up.
Anyway, I suggest you to read some experts rather than acting like gpt3, being mostly wrong. Cheers.
Frumpagumpus t1_j8diap9 wrote
Lol, ants can't speak, and I would be curious to read any literature on whether they possess short-term memory at all XD
challengethegods t1_j8dn5cy wrote
These "it's dumber than an ant" type of people aren't worth the effort in my experience, because in order to think that you have to be dumber than an ant, of course. Also yea, it's trivial to give memory to LLMs, there's like 100 ways to do it.
helpskinissues t1_j8dwubv wrote
Waiting for your customized ChatGPT model that maintains consistency after five messages. Make sure to ping me; I'd gladly invest in your genius startup.
challengethegods t1_j8dylol wrote
That alone sounds like a pretty weak startup idea: at least 50 of the 100 methods for adding memory to an LLM are so painfully obvious that anyone could figure them out and compete, so it would probably be completely ephemeral to try forming a business around it. Anyway, I've already made a memory catalyst that can attach to any LLM, and it only took about 100 lines of spaghetti code. It made my bot 100x smarter in a way, but I don't think it would scale unless the bot had an isolated memory unique to each person, since most people will inevitably teach it nonsense.
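The "isolated memory unique to each person" idea is easy to sketch as a thin wrapper around any text-in/text-out model. This is an illustrative toy under invented names, not the actual catalyst described above:

```python
from collections import defaultdict

class MemoryCatalyst:
    """Per-user memory wrapper for any prompt -> reply callable (hypothetical)."""
    def __init__(self, llm_fn, window: int = 6):
        self.llm_fn = llm_fn               # any callable: prompt -> reply
        self.memories = defaultdict(list)  # user_id -> list of turn strings
        self.window = window               # how many turns of memory to replay

    def chat(self, user_id: str, message: str) -> str:
        memory = self.memories[user_id]
        # Each user's context is built only from their own memory,
        # so one user's input can't pollute another's bot.
        context = "\n".join(memory[-self.window:])
        prompt = (context + "\n" if context else "") + f"User: {message}\nBot:"
        reply = self.llm_fn(prompt)
        memory.append(f"User: {message}")
        memory.append(f"Bot: {reply}")
        return reply

# Dummy "model" so the sketch runs without any real LLM behind it.
def echo_model(prompt: str) -> str:
    return f"(saw {prompt.count('User:')} user turns)"
```

Keying the memory store by `user_id` is the whole isolation trick: the model never sees any turns but that user's own.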
helpskinissues t1_j8dzdfi wrote
Enjoy your secret, private, highly demanded chatbot version, then.
This subreddit... Lol.