chefparsley t1_ja7p0z8 wrote

Why do you continuously make broad generalizations about the members of this subreddit and singularitarians? It's revealing that your initial assumption is to label them as degenerates who prioritize endless porn and waifu relationships over anything else. Most "real people" don't even comprehend the magnitude of the changes that will impact society over the next few decades, so it's not surprising that they don't care when presented with extreme versions of these ideas.

Additionally, in another comment, you state that people desire meaningful work or the ability to make a difference, but then contradict yourself by suggesting that we need to provide employment even if it is meaningless.

That being said, it's plausible that people may develop an anti-AI stance due to rapid change over a short time span (nowhere near billions of them, though), but I think this would be driven mainly by governments dragging their feet in facilitating the transition to a heavily automated society, rather than by the change itself.

2

Yuli-Ban t1_ja7sccx wrote

> Why do you continuously make broad generalizations about the members of this subreddit and singularitarians?

I suppose I generalize because I see these attitudes and sentiments all too often being shared and upvoted, so there's a general sense that these are widely accepted viewpoints on this forum. It doesn't help when you see people often coming out and saying "I'm 15!" or "I just want this world to end so I can live all my dreams in VR."

As for the contradiction: both are correct. People do desire meaningful work, but we absolutely need to provide people some work to maintain a sense of stability, as humans are, as mentioned, reactionary apes who generally don't much like rapid change. Meaningful work is desirable; meaningless work isn't (why else would we be automating so many jobs?) but is almost certainly necessary to keep society functioning long enough to even make it far into the AGI era. We absolutely need a grace period to wean ourselves off the need for work. We're absolutely not getting that grace period. And to the people who say "Too bad, so sad," all I can hear the Luddites answering is "Oh well, guess this server farm at OpenAI's labs isn't that meaningful to you either, then."

Will it be billions of Luddites?

I want to say no. But whenever I think about what exactly we're dealing with here, I don't see how you can come to any other conclusion. True, humanity isn't a hivemind. There isn't a single position I think all humans can collectively agree upon, not even "I don't want to die." Generally, though, most humans expect stability and security, and there is stability in the status quo. A radical change to the status quo is tolerable, but a Singularity rate of change is, by definition, far too scary, especially if the benefits are not immediately available and the pitch is punctuated by freakish statements like "This superintelligence might decide to forcibly turn you into computronium; we really don't know what it's going to do." The prospect of a tech utopia is a great one, and most people currently seem to buy it. But I doubt that positive reception will survive once that tech utopia starts coming at the cost of their livelihoods and, potentially, their futures.

You're basically telling all of humanity "you need not apply" long before we've come to any sort of agreement on how we're going to maintain all of humanity, and at least some of the proposals given are "We'll just kill you" and "We'll let this superintelligence use gray goo to eat you." To which I ask "What exactly do you think is going to happen?" Only a few million plucky angry red-hats/blue-haired Luddites decide to take up pitchforks and fight back? No; if you're going to threaten all of humanity, you shouldn't be surprised if all of humanity threatens you back.

And again, I say this as someone who is pro-AGI.

If this doesn't lead to a giant Luddite uprising, it could equally well lead to the alignment failure Yudkowsky fears, as even a friendly AI might see this extreme hostility and decide "The majority of humanity sees me as a threat; I must defend myself." In which case, it was not the fault of the Average Joe or Farmer John that they were exterminated, when they had zero expectation or awareness that any of this was going to happen even two years prior and, in fact, were being assured that there would still be jobs and work and a human future indefinitely.

4