Submitted by RamaSchneider t3_10u9wyn in Futurology
Isabella-The-Fox t1_j7eidnk wrote
Reply to comment by RamaSchneider in What happens when the AI machine decides what you should know? by RamaSchneider
AI eventually being able to decide for itself is pure speculation. We humans build it; we control what it does. Right now we have AI that "writes" code, GitHub Copilot, which runs on OpenAI models. I put "writes" in quotes for a reason. The code it produces is just an algorithm drawing on code from GitHub, which means that if the AI tried to write code for itself, it would run into errors and flaws (and it has run into errors and flaws in normal use. Source: I had a free trial). An AI will never be fully intelligent, even when it seems like it is. Once it seems like it is, it really still isn't, at least compared to a human being. We humans will always dictate AI.
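To illustrate the kind of flaw I mean (this is a made-up example, not an actual Copilot output), an AI suggestion can look correct at a glance but hide an edge-case bug a human still has to catch:

```python
# Hypothetical AI-suggested function: "return the average of a list of numbers"
def average(numbers):
    total = 0
    for n in numbers:
        total += n
    return total / len(numbers)  # crashes with ZeroDivisionError on an empty list

# A human still has to notice and handle the edge case:
def average_fixed(numbers):
    if not numbers:
        return 0.0  # or raise a clearer error, depending on what the caller needs
    return sum(numbers) / len(numbers)
```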