Submitted by FrogsEverywhere t3_zgfou6 in Futurology
PeartsGarden t1_izk8bru wrote
Reply to comment by MrZwink in The technological singularity is happening (oc/opinion) by FrogsEverywhere
Yeah, but what if you never told a child about crushed glass? What if that child never dropped a glass, and never cut a finger while cleaning up the mess? What would that child say?
Would you say that child lacks common sense? Does that child lack experience (a training set)?
MrZwink t1_izl1el1 wrote
I'm not getting into a whole philosophical debate. These AIs aren't meant to be a child that gives its opinion on a subject. They're expected to be oracles. And they're just not good enough yet.
PeartsGarden t1_izl89y7 wrote
> they're just not good enough yet.
My point is that that specific AI's training set may have been insufficient, the same as if a child's experiences are insufficient. I think we can both agree that a child has common sense, at least a budding version of it.
MrZwink t1_izlbrmv wrote
It's not the training set that is the problem. It is the way statistics approaches the problem. Correlation is not causation. AIs are a tool to automate cognitive processes. Nothing more. We shouldn't expect them to be oracles.
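A toy sketch of the "correlation is not causation" point (my own illustration, not from this thread; the variable names and numbers are made up): two observed quantities share a hidden common cause, a regression predicts one from the other quite well, yet intervening on the predictor would not change the outcome.

```python
# Minimal sketch: a statistical model picks up a correlation, not a cause.
import numpy as np

rng = np.random.default_rng(0)

# Hidden common cause: daily temperature (hypothetical data).
temperature = rng.normal(25, 5, size=1000)

# Both observed variables are driven by temperature, not by each other.
ice_cream_sales = 10 * temperature + rng.normal(0, 5, size=1000)
drownings = 0.3 * temperature + rng.normal(0, 1, size=1000)

# Fit drownings from ice cream sales alone (ordinary least squares).
X = np.column_stack([ice_cream_sales, np.ones_like(ice_cream_sales)])
coef, *_ = np.linalg.lstsq(X, drownings, rcond=None)

pred = X @ coef
corr = np.corrcoef(pred, drownings)[0, 1]
print(f"correlation of prediction with target: {corr:.2f}")

# The fit looks good, but banning ice cream would not reduce drownings:
# the model captured a correlation, not the causal mechanism.
```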