Dyedoe
Dyedoe t1_ja1z072 wrote
Reply to comment by innovate_rye in do you know what the "singularity" is? by innovate_rye
That’s not a definition of the singularity, but it is almost certainly what will cause the singularity… I guess you’re right about some people on this sub not knowing what the singularity is.
Dyedoe t1_jea0jg4 wrote
Reply to The Only Way to Deal With the Threat From AI? Shut It Down by GorgeousMoron
Two thoughts. First, and this is touched on at the end of the article but only in the same idealistic, unrealistic way we discuss nukes: a country that prioritizes human rights needs to be the first to obtain AGI. If this article had been written in the 1940s and everyone knew about nuke development, it would be making the same argument. It’s a valid point, but what would the world be like if Germany had beaten the USA to the punch and developed the first nuke? Second, the article is a little more dramatic than my worst case. Computers cannot exist perpetually in the physical world without human maintenance. It makes a lot of sense to achieve AGI before robotics is advanced and connected enough that humans are no longer needed.
There is no question that AGI presents a substantial risk to humanity. But there are other possible outcomes: solving climate change, solving hunger, minimizing war, meeting energy demand, curing diseases, etc. In my opinion, AGI is essential to human progress, and if countries like the USA put a pause on its development, god help us if a country like Russia gets there first.