BigZaddyZ3 t1_jcq2i9j wrote

The problem with this logic is that, if everyone is equally capable of something (in this case, art), there’s actually no value or achievement in it. There’s nothing to “flex” because you’re merely doing something that everyone else is equally capable of doing thanks to AI. This will also impact the enjoyment of the activity, because enjoyment is often directly linked to a sense of accomplishment or achievement, which, like I said, will be basically non-existent in a world where AI makes everyone an artist.

Think of artistic talent like a college degree. The more people who hold the same degree, the less valuable that degree is as a whole. A lot of people don’t seem to understand that the scarcity of a skill is what gives the skill its value.

2

BigZaddyZ3 t1_j9ebv70 wrote

I’d say we have much more control over our own personal lives now than we would during a singularity. We also can’t individually determine what technology hits the market (or what it does to society), but we can control our own decisions and reactions to that technology, right?

By pretty much every account, we are way more in control of our own lives than we are of automation and AI. So it’s kind of a silly comparison in a way.

1

BigZaddyZ3 t1_j9d5j86 wrote

Reply to comment by Spire_Citron in Relevant Dune Quote by johnnyjfrank

Right, you could literally guess it and be correct. This is why I stay open to all possibilities when it comes to the future. Especially at the current moment. The future of humanity is as wide open as it’s ever been.

1

BigZaddyZ3 t1_j9d4de6 wrote

Reply to comment by NanditoPapa in Relevant Dune Quote by johnnyjfrank

Fair points. But biased or not, we can’t yet say which ones got it right and which didn’t (at least for most of them). So you can’t totally rule the Dune scenario out yet, right?

And I get your point about the dystopian bias, but you can’t just gloss over things with your own optimism bias either. There have been actual dystopian periods in human history (the Holocaust and mass slavery, for example). There’s no guarantee that we’ve seen the last one. All I’m saying is that, in reality, things could go either way.

4

BigZaddyZ3 t1_j9d28k4 wrote

Reply to comment by NanditoPapa in Relevant Dune Quote by johnnyjfrank

Right, but let’s not act like works of fiction have never predicted the future before. The Simpsons alone has made some pretty accurate predictions that panned out years after the fact. The truth is that we don’t really know what the future holds or which sci-fi scenarios will actually prove prophetic.

3

BigZaddyZ3 t1_j9cx6pp wrote

I kind of sensed that from how idealistic and unrealistic a lot of the users here seem to be, tbh… I’d say you’re far from the only one here who’s like that. I get that exact vibe from about half the users here.

That doesn’t sound like a healthy state of mind to be in honestly. It isn’t really wise to place all of your hopes and dreams on something that you can’t even predict the final result of…

2

BigZaddyZ3 t1_j9cclqf wrote

The funny part is that even when these AIs get as good at writing as humans, most people won’t be able to monetize their “stories.” Anyone who actually understands economics knows that a higher supply means a lower price for each individual story. Flooding the market with stories just creates saturation and lowers the amount of money each story can fetch. Eventually, when the supply is high enough, most people’s work will be worth little to nothing.
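The saturation argument above can be sketched as a toy calculation. This is a purely illustrative model (all numbers and the split-the-budget pricing rule are hypothetical, not real market data): it just assumes a fixed pool of reader spending divided evenly across however many stories exist, so the revenue per story shrinks as supply grows.

```python
# Toy model: with total reader spending held fixed, the price each
# story can fetch falls as the number of stories grows.
# The even-split assumption is illustrative only.
def price_per_story(total_spending: float, num_stories: int) -> float:
    """Revenue each story captures if buyers' spending is split evenly."""
    return total_spending / num_stories

# A fixed $1,000,000 of reader spending spread over ever more stories:
for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} stories -> ${price_per_story(1_000_000.0, n):,.2f} each")
# 10,000x more stories means each one earns 10,000x less.
```

Under this (hypothetical) model, going from 100 stories to 1,000,000 drops per-story revenue from $10,000 to $1 — the "worth little to nothing" endpoint the comment describes.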

16

BigZaddyZ3 t1_j938yb6 wrote

Well, in my defense, I’m just giving my opinion based on everything I’ve learned about the subject over the years. Just like we all do in this sub all the time. It’s not a crime to be confident in your opinion. And from the conversations we’ve had so far, you aren’t that much different when it comes to that.

But yeah, I was only giving my take on how things are likely to unfold. I wasn’t saying it was a 100% guarantee. If that’s what you thought then I see where some of the tension and confusion stem from. I wasn’t trying to say that it was an undeniable certainty. Just that what I described seems most likely to occur (imo).

1

BigZaddyZ3 t1_j90au3a wrote

>>The first person to use the concept of a "singularity" in the technological context was John von Neumann.[5] Stanislaw Ulam reports a 1958 discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". [6] Subsequent authors have echoed this viewpoint.[3][7]

>>The concept and the term "singularity" were popularized by Vernor Vinge first in 1983 in an article that claimed that once humans create intelligences greater than their own, there will be a technological and social transition similar in some sense to "the knotted space-time at the center of a black hole",[8] and later in his 1993 essay The Coming Technological Singularity,[4][7] in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.

>> Some scientists, including Stephen Hawking, have expressed concern that artificial superintelligence (ASI) could result in human extinction.

>>The other prominent prophet of the Singularity is Ray Kurzweil. In his book The Singularity is Near, Kurzweil basically agrees with Vinge but believes the latter has been too optimistic in his view of technological progress. Kurzweil believes that by the year 2045 we will experience the greatest technological singularity in the history of mankind: the kind that could, in just a few years, overturn the institutes and pillars of society and completely change the way we view ourselves as human beings.

>>The technological singularity—or simply the singularity[1]—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization.

You were saying? How exactly can we achieve a post-scarcity human society after the singularity when the most prominent proponents of the singularity believe we won’t even be able to control technology by that point, and that it will mark the end of the human era in one way or another? Use your fucking brain for fuck’s sake…

0

BigZaddyZ3 t1_j90473x wrote

Lmao do you actually think I care what you think enough to go through the trouble of doing that? 😂😂Fuck off, I’m literally about to go to bed. I’m not gonna write a fucking research essay for you. Go do your own research if you care that much.

−1

BigZaddyZ3 t1_j903o4e wrote

>>There’s no way an AI would randomly be able to control that amount of energy without us knowing of the mechanisms used to control such energy, let alone seeing the structures built to move that energy around in a useful way.

Why not? Are you dumb enough to assume AGI will never surpass human cognitive abilities? Please tell me you’re not that stupid…

1

BigZaddyZ3 t1_j903fni wrote

I didn’t give a hard timeline tho… A hard timeline would be me giving specific dates and shit. I didn’t. You seriously need to improve your reading comprehension skills, bruh.

It’s just pretty much universally agreed by actual experts that if we ever achieve post-scarcity, it’ll happen before any singularity occurs. No other order even makes sense. There’s no guarantee humans will even still be around post-singularity, and the singularity isn’t even needed to reach post-scarcity. So do the math there, genius…

0