
soricellia t1_j9tn2xi wrote

I don't even think this is a strawman, mate; you've mischaracterized me so badly it's basically ad hominem.

5

HINDBRAIN t1_j9tnkfa wrote

You're basically a doomsday cultist, just hiding it behind sci-fi language. "The scale of the threat" is irrelevant if the probability of it happening is infinitesimal.

−4

soricellia t1_j9tomaw wrote

Well, I think that entirely depends on what the threat is, mate. The probability of AGI rising up Terminator-style, I agree, seems pretty small. The probability of disaster because humans' inability to distinguish true from false and fact from fiction gets exacerbated by AI? That seems much higher. Also, neither of us has a formula for this risk, so saying the probability of an event is infinitesimal is intellectual fraud.

6