Submitted by Darustc4 t3_126lncd in singularity
This is a link-post to the TIME article written by Eliezer Yudkowsky addressing the recent open letter on slowing down AGI research: https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
I personally think he makes a fair point: a 6-month moratorium will not work, much less so if it only slows OpenAI down, allowing all other companies to catch up and creating very dangerous and complex race dynamics. Shutting it all down is more sensible than it sounds at first.
acutelychronicpanic t1_je9mn7y wrote
Imagine thinking something could cause the extinction of all humans and writing an article about it.
Then putting it behind a paywall.