Neurogence t1_isyub9h wrote
Reply to comment by 0913856742 in Since Humans Need Not Apply video there has not much been videos which supports CGP Grey's claim by RavenWolf1
Ray Kurzweil (arguably one of the main proponents of the singularity) recently stated that AI will create more jobs for humans.
People love capitalism too much to imagine a world without work.
RavenWolf1 OP t1_isyyan8 wrote
I was pretty shocked that he thought that way too.
RikerT_USS_Lolipop t1_isz7uwu wrote
He is very much a "don't tax the rich, just grow the pie instead" type. Any time someone asks him about growing wealth inequality, he falls back to that. So if your conclusion is that equalizing wealth is bad, then you're going to work backwards and believe that the systemic failures of capitalism don't exist. And how can you support that idiotic notion? By believing technology isn't causing the game to be continuously and increasingly rigged against the little guy.
It's a human response. And humans are kinda shit.
BearStorms t1_iszgn9o wrote
Yep, me too. It is obviously a wrong argument anyway IF you believe the singularity will happen at some point. With superintelligence on tap, humans will be just a bunch of moody toddlers in comparison. Why would you let us do anything at all?
RavenWolf1 OP t1_it00blz wrote
I think he said that because he works at Google. It seems like all tech people say the same thing. Maybe they are scared that people will start to blame tech giants if they say that AI will take jobs. That is the only reason I can think of for why all the tech people say this.
kmtrp t1_it03qv4 wrote
Yeah! I saw OpenAI's CEO say the same crap, something like "this will augment productivity, it'll be a companion to all developers...". Man, you are talking about software that can code without a human! WTF?
So I am shocked and disappointed at the lack of honesty. The people working on these projects know that kind of talk is full of shit, right?
BearStorms t1_it0ar10 wrote
>this will augment productivity, it'll be a companion to all developers...
Well, it will, only now you will need 1 dev instead of 100. Ask illustrators in like a year or 2...
FML, I thought software development would be the last job to go... I may be sooo wrong (I'm a dev).
kmtrp t1_it4khep wrote
Same thing for me, a former full-stack developer. Isn't it crazy? I mean paintings and drawings, and freaking programming? Especially given the state of front-end development? Incredible times.
BearStorms t1_it4m4wv wrote
Honestly, we'll see. Image generation is a problem where even a very imperfect result is perfectly acceptable. Coding is a much harder problem, and then you have to remember all kinds of regulation, etc. But it's coming for everyone eventually. Ironically, the physical blue-collar trades that work in very heterogeneous environments, like plumbing, are probably the safest...
FomalhautCalliclea t1_it4psvw wrote
Especially since Sam Altman (OpenAI's CEO) has been quite open and outspokenly optimistic about tech progress, talking about things like "free energy" (fusion) and AGI coming soon, more or less.
He has also spoken about UBI and the need to radically change our economy. I wonder if he (and others) have multiple opinions and faces that they show selectively depending on the context.
kmtrp t1_it4q6bf wrote
Most probably, it's an obvious CEO trait too.
FomalhautCalliclea t1_it4qpv2 wrote
I hope it's a "Charisma -100 / Perception +100" rather than "Charisma +100 / Perception -100" character trait.
BearStorms t1_it0aiu2 wrote
I think you are right and that makes it even scarier...
haptiK t1_isz4lqi wrote
> Ray Kurzweil
why does this guy's website suck so badly?
iNstein t1_iszv3yc wrote
Ray is right about AGI/ASI and the singularity. Beyond that, I consider his work to be self-serving and horribly wrong. Fortunately, none of this relies on Ray, so we will get our new world regardless of his poor mid-term predictions and misguided ideas about the society that will result.
Sashinii t1_isz8ahd wrote
Well said. There's no way there'll still be any economic systems post-singularity.
Sotamiro t1_isz9j4a wrote
They will still exist... in my simulations
RavenWolf1 OP t1_it00lpm wrote
Exactly! Like in games. I love all those strategy games and city building games. Those have to have economies!
Bakoro t1_iszogjh wrote
Economics will exist as long as there are people. Scarcity will always be a thing, it's essentially a law of the universe.
There is only so much beachfront property, only so many houses with an ocean view, only so many people who can live on the top of a hill.
One way or another there will have to be a way to decide who gets what limited resources, and who gets the new things first.
Even if you make everything a timeshare, so everyone takes turns having exclusive use of a thing, some people won't care about one thing but will want more of their favorite thing. Some things will be more popular than others.
"I'll trade you my week in Maui for a day in the glorgotron" you'll say, and I'd be like dang, that's a good deal, the glorgotron gives me a headache anyway...
It's just a matter of what people value, what people want exclusive access to, and what is limited. If nothing else, people's time will always be somewhat valuable into the distant future.
RavenWolf1 OP t1_it01dni wrote
>Economics will exist as long as there are people. Scarcity will always be a thing, it's essentially a law of the universe.
An economy, sure, but money isn't necessary. Economy does not mean money. But I agree. As long as humans value something, we create value for it. In human society something is always valuable, like beauty or friends, etc. Value is what gives us standing in society. We always have something which differentiates us from others. We assign value to things which others don't have.
Sure, we could have infinite energy and resources, but there will always be something which creates hierarchy in our world. We live in a society, after all.
Bakoro t1_it1bxzb wrote
>Economy sure but not money necessary. Economy does not mean money.
Money is a useful abstraction for value. How many chickens to a television, and how many televisions to a beach house, is a hard problem.
If you have resource tokens, it's basically the same thing: the right to requisition x food resources, y labor resources, and z land resources. Anything fungible which replaces direct barter ends up being similar.
If humans are still to exist, they'll have to be part of the equation in terms of directing the AI. Like, who decides what the AI spends its discretionary time on? If the AI doesn't have its own motivations and interests, or otherwise just allocates resources to human requests, that can be a kind of money in and of itself. Start off by giving everyone an equal share of AI requests, let the requests which generate the most positive feedback from the community yield more time for the person or group who made them, and let people trade AI time shares just like money.
I personally like the resource allocation model. It's basically money, only it ties value to quantifiable things. That's only viable when everything is highly mechanized and the energy and time costs are highly predictable, like in a society run mostly by AI. A rough sketch of the idea is below.
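Just to make that concrete, here's a toy sketch of how a tradable AI time-share ledger could work. Every name and number in it is invented purely for illustration; this is not a real allocation scheme, just the shape of the idea.

```python
# Toy sketch of the "tradable AI time share" idea above.
# All class names, numbers, and the feedback rule are made up for illustration.

class TimeShareLedger:
    def __init__(self, people, weekly_hours=10.0):
        # Everyone starts with an equal share of AI time.
        self.balance = {p: weekly_hours for p in people}

    def spend(self, person, hours):
        """Spend AI time on a request; returns True if the balance covers it."""
        if self.balance[person] < hours:
            return False
        self.balance[person] -= hours
        return True

    def reward_feedback(self, person, upvotes, bonus_per_vote=0.1):
        """Requests the community likes earn their author extra time."""
        self.balance[person] += upvotes * bonus_per_vote

    def trade(self, seller, buyer, hours):
        """People can trade time shares directly, like money."""
        if self.balance[seller] >= hours:
            self.balance[seller] -= hours
            self.balance[buyer] += hours


ledger = TimeShareLedger(["alice", "bob"])
ledger.spend("alice", 2.0)           # Alice uses 2 hours on a request
ledger.reward_feedback("alice", 50)  # 50 upvotes -> 5 bonus hours
ledger.trade("bob", "alice", 1.0)    # Bob sells Alice an hour of his share
print(ledger.balance)                # {'alice': 14.0, 'bob': 9.0}
```

The only design choice that matters here is that the "currency" is denominated in something quantifiable (hours of AI time) rather than an abstract unit, which is the point of the resource allocation model.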