HINDBRAIN t1_j9sthbq wrote
Reply to comment by [deleted] in [D] To the ML researchers and practitioners here, do you worry about AI safety/alignment of the type Eliezer Yudkowsky describes? by SchmidhuberDidIt
"Your discarded toenail could turn into Keratinator, Devourer of Worlds, and end all life in the galaxy. We need agencies and funding to regulate toenails."
"That's stupid, and very unlikely."
"You are dismissing the scale of the threat!"
HINDBRAIN t1_j91o4fh wrote
Reply to [D] Please stop by [deleted]
>I wonder what the mods are doing
I'm seeing some of them disappear after an hour or so, so they're probably deleting the posts?
HINDBRAIN t1_j9tnkfa wrote
Reply to comment by soricellia in [D] To the ML researchers and practitioners here, do you worry about AI safety/alignment of the type Eliezer Yudkowsky describes? by SchmidhuberDidIt
You're basically a doomsday cultist, just hiding it behind sci-fi language. "The scale of the threat" is irrelevant if the probability of it happening is infinitesimal.