Submitted by EducationalCicada t3_10vgrff in MachineLearning
astrange t1_j7juabz wrote
Reply to comment by drooobie in [N] Google: An Important Next Step On Our AI Journey by EducationalCicada
No they're not. ChatGPT doesn't do anything, it just responds to you. It's not even clear that the same technology could let it reliably do things (or even reliably return true responses).
drooobie t1_j7l7mo0 wrote
If you replaced the assistant in my google home with ChatGPT I would use it a lot more. Maybe I'm an exception, but I don't think so.
MysteryInc152 t1_j7lg6rm wrote
I think he's basically saying AIs like ChatGPT just output text at the base level. But that's a moot point anyway. You can plug in an LLM as a sort of middle-man interface.
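A minimal sketch of that "middle-man" idea: prompt the LLM to emit a structured JSON intent instead of free text, then let ordinary code perform the action. All names here are illustrative, and `call_llm` is a canned stub standing in for a real model call, not any actual assistant API.

```python
import json

# Hypothetical "middle-man" pattern: the LLM translates a natural-language
# request into a machine-readable intent; a plain dispatcher does the doing.

INTENT_PROMPT = (
    'Translate the user\'s request into JSON with keys "action" and "device". '
    "Respond with JSON only.\n"
    "User: {request}"
)

def call_llm(prompt: str) -> str:
    # Stub standing in for a real LLM call; returns a canned response so
    # the sketch runs offline.
    return '{"action": "turn_off", "device": "living_room_lights"}'

def dispatch(intent: dict) -> str:
    # The dispatcher, not the LLM, touches real devices, so the actual
    # behavior stays testable even when the model underneath changes.
    handlers = {
        ("turn_off", "living_room_lights"): "lights off",
        ("turn_on", "living_room_lights"): "lights on",
    }
    return handlers.get((intent["action"], intent["device"]), "unknown request")

raw = call_llm(INTENT_PROMPT.format(request="kill the lights in the living room"))
print(dispatch(json.loads(raw)))  # lights off
```

The point of the split is that only the JSON boundary needs to be reliable; everything downstream is conventional, testable code.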
MysteryInc152 t1_j7lghig wrote
>No they're not. ChatGPT doesn't do anything, it just responds to you
Yes, they are, and you can get it to "do things" easily.
astrange t1_j7oduw3 wrote
This is wishful thinking. ChatGPT, being a computer program, doesn't have features it's not designed to have, and it's not designed to have this one.
(By designed, I mean has engineering and regression testing so you can trust it'll work tomorrow when they redo the model.)
I agree a fine tuned LLM can be a large part of it, but virtual assistants already have LMs and obviously don't always work that well.
danielbln t1_j7ovvql wrote
What we all want is for Alexa/Siri/Home to have modern LLM conversational features, in addition to reliably turning our lights on/off or giving us the weather. Ever since ChatGPT came out, interacting with a home assistant feels even more like pulling teeth than it used to.