tnetennba9 t1_irc9vuq wrote

And what do you mean by "own the AI"? The company/researchers who built it? The individual ML/software engineers? The company that assembled the training data?

Either way, the worry is that as AIs get more powerful, they often become more difficult to interpret.

3

cy13erpunk t1_ircpur0 wrote

in short, yes; let's not be silly

until AI is self-aware, it should be treated like any other technology

once it is self-aware, it will be responsible for its own self-governance

we can avoid pretending that laws or rights have any real-world meaning when they are written by corrupt politicians and selectively enforced to oppress whomever they please; these things are naive, hollow words at best and intentionally manipulative lies at worst

i expect that AGI/ASI will be far more capable at self-governance than humans have been, so i would not honestly trust humans to craft legitimate/authentic rules around AI [some humans certainly are or would be capable of this task, but obviously none who are in positions of power today]

1