TheLastSamurai t1_iunh200 wrote
Reply to comment by OsakaWilson in what's the probability of governments passing laws to block AI research or other transhumanist tech by [deleted]
Why would it not make sense to refrain from creating AGI? I would love to see an actual risk/benefit analysis done.
OsakaWilson t1_iuohcv8 wrote
If one other group makes it, they rule the world.
TheLastSamurai t1_iuoifrc wrote
That's some bad game theory. So everyone has to try because another group might, even though whoever succeeds could literally end humanity?
OsakaWilson t1_iuoxwq3 wrote
Yes. You're also describing nuclear weapons, which are verifiable, and yet nearly every party that could build them did. I'm not saying it's good; I'm saying that in an environment of distrust, that will be the result. It's not even a national decision. Multiple companies worldwide could pursue it. All it takes is one group believing they can contain it while they get rich, and it's over.