Submitted by matt2001 t3_1204f8t in Futurology
MrRandomNumber t1_jdj3s3z wrote
Thought: if consciousness is dreaming limited by perception, perhaps AI hallucinations are an essential property of their emergent systems. Why shouldn't these systems be confused, overconfident, and superstitious? Worked for us... These things are less Einstein, more "Cliff" from Cheers.
Mercurionio t1_jdlmewp wrote
Messing up facts can get you killed.
Like an Alpaca assistant that gives you completely wrong information about your power circuit. And the list goes on.
These models must be grounded in the world WE are living in, not one they invent from their parameters.