Submitted by AdditionalPizza t3_y4mtzs in singularity
[removed]
Abolishing capitalism and instituting a true representative democracy would be a great start. Shift control of the means of production to the working class.
The biggest issue is the inevitable human corruption that will inherently grow over time. Every system, no matter how many checks and balances, will be subject to coercion and corruption if it relies on humans in pivotal roles.
Eliminating the human element is something this subreddit has unique insight into and that I believe is necessary for a true egalitarian (and livable) future.
Sure, but because of people like Elon a lot of my friends don't trust the tech industry currently. They always go off about how in the future web 3.0 will force everyone to pay subscription fees for everything.
They aren't at all wrong to do so. It should be very clear to anyone paying attention that the subscription-to-everything model has been pushed harder and harder. Per Klaus Schwab's "You will own nothing and be happy," it is a coordinated worldwide effort.
We can build AI and tech without corporations. Many incredible breakthroughs have come from government research.
Subscription fees and individual transactions.
And you should be blaming the models developed by mobile gaming platforms and Microsoft, not Musk.
What economic system do you have that is better than capitalism?
I don't think the system currently exists. I would like to apply AI to helping address that issue with the objectives of equality, sustainability and global cooperation as prime factors for consideration.
I also think it's silly to think a singular system of governance will be ideal for all purposes at all points in time. We've seen how badly the founding principles of the US have adapted to modern technology and corruption.
Making companies pay for the negative effects they have on the world, in proportion to those effects, is one possible solution.
Examples (some are already common):
As well as legal punishments and laws for those who do highly damaging activities.
AI and AGI can be used to more effectively do all of this.
Yeah, but you will also have entire groups of people against all of what you just said, which is exactly our current political situation: complete gridlock, so the system just stays as it is.
And some people are also born with the neurology of psychopathy, so even if we had a better system, corruption would still happen. I guess it'd be more recognized and addressed, but the problem would persist until we figure out how human brains work. And hope to God that the medical staff in charge of understanding our brains isn't corrupted when that happens.
The gridlock is because people have different beliefs and values.
Many people honestly value glazed donuts and chips far more than they mind the downsides of getting a heart attack at forty. They really do want to live a short life full of fast food and other treats, and they don't want taxes on those foods reducing their purchasing power.
We need to accept that other people have different values and we as an individual can't expect the whole world to bend to us.
As for beliefs, people hold many different ones, which is a problem since many of those beliefs are influenced by propaganda.
Right now, when people see an article/video/post after work about something, they often do not have the energy to read the research paper or other data sources referenced by that article, as well as think and reflect on the methodologies and interpretation of that data. So they believe the click bait.
When we have full automation, people will have more time to research, think, reflect, have conversations and debates, and even conduct their own experiments.
I don't think society would allow the government to force a procedure or pill on everyone that changed their brains in any way. The psychopaths would get together and prevent those laws from occurring, find ways to make themselves the exceptions, or find ways to make it appear they got the treatment when they didn't.
The book Scythe attempts to address this. Basically the AI would need to be both benevolent and all powerful as the resistance to change by the "ruling class" would be severe.
Tier lists from worst to best (ideally backed by good data), ESG ratings (environmental, social, & governance) - like social credit scores for companies (and politicians?) - aren't perfectly done, yet, but I see that as a primary way forward..
Along with better cyber-models of existing operations and proposals for review.
Of course a lot more effective when more consumers, investors, skilled workers, contracts, & government funding are going to good X over evil Y.. but we can make that more obvious and easier with better ratings and platforms.
AI can absolutely help to gather that data, crunch those numbers, make comparisons, 'Amazon'-style recommendations (ideally an eco/socially beneficial alternative), and eco/socially beneficial investment portfolios.
Obviously a bit more complicated to achieve true, deep, sincere environmental and social concern and virtuous action among oligarchs and business students & politicians and media..
(we can probably help this along with better engineered cyber-world education, training, guidance, therapy, community.. + medication, psychedelics.. )
But in the meanwhile, we can do better to help people, workers, consumers, voters, city councils.. move their $ and labor towards the better, away from the worst.. make 'the game' only winnable by the more ethical, most ethical, the most eco/socially beneficial..
to make low wages, exploitation, energy footprint.. on par with obvious racism, sexism, pollution.. generally bad for business.
Clearly some executives and boards and politicians were just going along with being not-racist and not-sexist only on the surface .. to win the game .. but for the purposes of this experiment, that's progress..
Meanwhile, genuinely good, beneficial, ethical, effective etc. shops, projects, programs, program coordinators.. should be getting more support, investment, skilled ethical workers, contracts, community partners, smart cooperative network help.. and so on, while evil mega-corps and bags-of-crap are trying to change gears.
That's the question in the post basically. Could the system fall apart when enough people in power are cured of these traits.
It's not really a biohacking suggestion. Rather, say you go to a doctor, they test your blood/DNA/whatever and run it through a diagnostic AI that tells you your levels of everything, predispositions, potential precursors, and current illnesses. It then creates a custom-tailored medical regimen to cure and prevent.
I honestly don't see how that won't be a thing soon, it's logical, no?
I'm not sure how far fetched it is for AI to help figure out the cause of mental illnesses and how to treat them. People would just take the medication because it would cure everything you have.
THX 1138 (film) somehow springs to mind here. Must regulate those competitive emotional excesses!
Testing for empathy and selflessness would be interesting. In general we could have much higher standards for our representatives.
The issue now is that nothing will be implemented to change the status quo without mass upheaval and revolution.
I think it's more about trying to ensure the right kind of system is implemented after the inevitable collapse of our current one.
I agree. My post is just optimism about a process by which that could possibly happen. Hopefully we don't focus so much on fearing AI alignment that we forget to fear each other. Both are equally important, for the majority of society anyway.
I think it's more likely we'd see people modifying themselves to be more "competitive".
As seen in Accelerando by Charlie Stross. It's a singularity of bionically enhanced day traders with computer upgrades in their overclocked brains who run the solar system on "Economics 2.0" that mere human level minds can't understand.
I think that would be further in the future than curing things we suffer from now. And if cures come before modifications like that, then capitalism would probably already be gone. That's my argument anyway. A sliver of hope that CEOs grow hearts.
Though there is a difference in degree, all human beings are selfish, not only the rich.
I've seen too many ugly acts of the average human being.
I agree, but if the rich get things first, they could be "model" humans first. And that could inspire them to share.
While there is certainly an overwhelming amount of billionaires who could contribute more meaningfully to society, I think more focus should be on how we’re going to be competitive with rising powers like China and their highly controversial world views.
I agree that if anyone needs more empathy, it’s world leaders because they directly influence the direction of a country.
That's definitely a concern, but as I understand it countries like China aren't currently capable of producing the GPUs required for large transformers on their own.
But one concern doesn't mean other concerns should be put on the back burner. A medical breakthrough capable of curing all would definitely shake the world for the better.
Hoarding is a natural result of evolving with limited resources and seeking stability for yourself and your tribe. It's not sociopathic. People who amass that much are a combination of better at building wealth and lucky.
It's not always the hoarding part that's specifically sociopathic. It's how they hoard and maintain/grow it, and how they got there in the first place.
Paying low wages, unethical practices, all that sort of thing. But I'd say it isn't natural to have so much when others have so little. Maybe a lot of people do that, but normal/average doesn't equal healthy.
I can agree with that. The only amendment I would make is to change "they" to "some" or "many." But yeah, many could probably use therapy and are doing things not to be celebrated. Because they are successful, there is probably little incentive to get psychological help. Many will be surrounded by sycophants, so that will hinder them as well. Another thought is that it's a symptom of a diseased government. Extreme disparity of wealth is often seen at the collapse of empires.
Well I mean "they" referring specifically to the ones I'm talking about.
But I don't think convincing people to take a cure-all, be-healthy, live-longer medication would be difficult. At least not for many wealthy people, though I'm sure there will be anti-longevity people.
I just wonder if that medical breakthrough happened, if curing this behaviour in politicians and wealthy people would be a welcomed side effect.
Honestly, the more simple solution is that we don't really need billionaires, contrary to the propaganda.
Suppose you implement a compression function, so that the people with wealth in the range of 100 million to 500 billion all fall into the range of 100 million to 500 million, preserving their relative ranking.
No incentives change, but billionaires have ceased to exist.
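That compression could be sketched as a simple monotonic squash. A minimal sketch, assuming a log-scale interpolation (the thought experiment doesn't specify the exact mapping; the ranges and function name here are illustrative):

```python
import math

# Hypothetical compression: map wealth in [100e6, 500e9] onto
# [100e6, 500e6] on a log scale, preserving relative ranking.
LO, HI = 100e6, 500e9            # original range ($100M .. $500B)
NEW_LO, NEW_HI = 100e6, 500e6    # compressed range ($100M .. $500M)

def compress(wealth: float) -> float:
    """Monotonic log-scale squash; wealth at or below LO is untouched."""
    if wealth <= LO:
        return wealth
    # position of `wealth` within [LO, HI] on a log scale, in [0, 1]
    t = (math.log(wealth) - math.log(LO)) / (math.log(HI) - math.log(LO))
    # same relative position within the compressed log range
    return math.exp(math.log(NEW_LO) + t * (math.log(NEW_HI) - math.log(NEW_LO)))

# Ranking is preserved: more original wealth -> more compressed wealth.
print(compress(200e6) < compress(1e9) < compress(500e9))  # True
```

Because the mapping is strictly increasing, incentives to out-earn a rival are preserved even though no one ends up above $500 million.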
One thing billionaires do provide is an efficient allocation of capital.
So put each of these persons' excess wealth in a fund that copies the rest of their investment decisions. So the person with 200 million might allocate 20 million to an investment, and then his fund might allocate an additional 20 billion.
Society would preserve their investment wisdom.
This is just a thought experiment to get rid of billionaires without changing anything else.
Of course, I believe we can allocate capital more efficiently than billionaires by utilizing markets, but many people don't believe that - hence this thought experiment.
>One thing billionaires do provide is an efficient allocation of capital.
We are allocating capital to unnecessary weapons systems and dirty energy. Not efficient at all.
But more efficient than spending it on vacations to Hawaii or imperial/genocidal wars like Russia does.
As I said, I think markets can be more efficient than billionaires. But they are more efficient than some other alternatives.
This isn't markets, it is billionaires bribing politicians to do what is best for their business not what is best for society.
I know. That's what I said. Read it again.
I agree we don't need billionaires, but they exist and they have a desire to hoard their money and power.
Can that desire just be leveled out? It must be some mental illness they have, and I'm sure plenty of common folk have it too, just without the luck.
They only exist as a side effect of our tax code.
FDR basically culled them all (with the help of WW1 and the Great Depression), and they only started to come back around the 1980s.
It really is up to the G7 countries, which happen to all be democracies, to decide how we allocate capital across the globe.
We can't cull the desire though. Some people are just wealth maximizers and empire builders and I don't think that's necessarily a bad thing. I actually think it is a good thing, because they help society build capital, which is better than spending everything on short term living.
But just as we decided that it's no longer acceptable to build physical empires by conquering land, at some point we will have to limit how much of the capital stock can be conquered by these empire builders.
I agree they only exist because of how the world operates. The difficult part is reversing how the world operates. And it just occurred to me: the hardest solution may be the simplest. Changing billionaires' minds, rather than legislation.
It's a matter of time until healthcare takes off in ways we thought were sci-fi. We're also dangerously close to critical unemployment once a few more narrow AIs are released that outcompete humans in significant sectors.
Things like OpenAI Codex: it doesn't have to replace programmers, but if you increase programmers' efficiency by even a few percent, it has a cascading effect across all sectors of IT.
> is that we don't really need billionaires
They don't really need you. I don't either. So what now?
>One thing billionaires do provide is an efficient allocation of capital.
No, markets generate price information which allows for producers to make educated guesses about resource allocation.
> hoarding generations of income
That word "hoarding" shows exactly where you stand and why you're looking at things from the wrong perspective.
Rich people don't "hoard" wealth. They invest. When someone says "Jeff Bezos has $200 billion" this doesn't mean he's sitting in a money bin with 200 billion dollar banknotes. He isn't sitting on a pile of things money can buy either.
I've heard people say "$200 billion is too much, let him keep one billion and give the rest to the poor". How does that sound to you? Each person in the world would get $25 worth of Amazon shares. Many people would opt to sell their shares, meaning the price would plunge. Amazon would become a penny stock, rich people would buy those shares at a bargain price and the situation would be back exactly like it was at the beginning.
Okay, that wouldn't work, so let's make those shares non-negotiable. Every person would keep his share. This would make Amazon a political organization. People would run for the office of CEO, promising to the people whatever they wanted in order to get elected. The end result, no more Amazon, the corporation would be destroyed and other organizations, like AliExpress, would take their place.
TL; DR: the situation is as it is because it works. The free market is the most effective way we know to run an economy.
>Rich people don't "hoard" wealth. They invest. When someone says "Jeff Bezos has $200 billion" this doesn't mean he's sitting in a money bin with 200 billion dollar banknotes.
Haha, yes I'm well aware of how that works. But they simply use their stocks as collateral to obtain essentially unlimited cash flow through loans anyway.
But that's not even the point, I'll try and explain it better. They hoard wealth in terms of general wealth, power, value, influence. They have a need to perpetually obtain more and more market share. They have an obsession to do whatever it takes to be on top of everyone around them.
If a diagnostic AI existed, and balanced that obsessive trait, then the market share from the companies could be used to fuel prosperity for all, instead of prosperity for some. If they were freed from those traits, they could focus on betterment for others.
This would be the beginning of the end for capitalism. If suddenly Walmart started offering everyone profit shares in the form of thousands of dollars' worth of gift cards every year for free, for example, do you think people would still use Amazon? Maybe here and there, but I would probably do all of my business at Walmart.
That's just a simple, dumb example. It'd be a transition from capitalism to whatever proves superior in an unemployed world with AI doing the heavy lifting.
What you're describing is still capitalism, the competitive market where free actors can allocate resources in the manner that produces the best return.
Everyone right now can in fact own the means of production in this system. Everyone can invest some of their earnings in the stock market. Just buying QQQ guarantees ownership of the top 100 fastest growing large companies on NASDAQ.
QQQ has averaged 16%/yr for the past 10 years. A low end worker making $15/hr could invest 10% of earnings per year in QQQ and be a millionaire in 25 years even if he never got a raise.
You don't need UBI, or special training, expensive schools, a pedigree or the right background. You just need to save a little money.
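The arithmetic above roughly checks out. A quick sketch, assuming full-time hours (~$31,200/yr gross at $15/hr), 10% saved as monthly contributions, and the claimed 16% annual return compounded monthly (past returns are of course no guarantee):

```python
# Future value of steady monthly contributions at a fixed return rate.
hourly_wage = 15.0
annual_income = hourly_wage * 40 * 52            # ~$31,200 full-time
monthly_contribution = annual_income * 0.10 / 12 # $260/month saved
monthly_rate = 0.16 / 12                         # 16%/yr, compounded monthly
months = 25 * 12

balance = 0.0
for _ in range(months):
    # grow the existing balance, then add this month's contribution
    balance = balance * (1 + monthly_rate) + monthly_contribution

print(f"${balance:,.0f}")  # roughly $1 million after 25 years
```

With annual rather than monthly compounding the total lands a bit under $800k, so the "millionaire in 25 years" claim depends on the compounding assumptions as well as the return holding up.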
I'm not down voting anyone here, fyi.
But I know what I'm describing is capitalism, I wasn't claiming it isn't. But I'm talking about the transition from it, toward a more sustainable system before AI becomes something only the "haves" get while the "have nots" fall into total poverty.
Gotcha.
I don't think there is a more sustainable system than capitalism.
All of the progress and wealth generated in the past 200 years has come from capitalism. Even perverted, not-quite-free markets in China have done more in 20 years than all the five-year plans of socialism combined.
I also understand that the portion of wealth flowing to labor has been declining since 1971 (https://wtfhappenedin1971.com/), but the average person's access to the other side of the ledger has increased too.
Maybe an option is having social security buy into capital markets and give non-investors upside access to capital gains. Whatever options are tested, the only guarantee is that systems with the freest markets where labor and capital compete to provide goods and services will be the most successful.
Well, the whole subject of transformative AI relates to some kind of revolution. Not in the sense of an overthrow scenario, but like the industrial and agricultural ones. We are in the midst of a revolution. Everyone here is focused so much on the date of AGI/ASI and the singularity (of course, that's the sub), but we don't necessarily need those. I have no doubt they will come soon enough, but the next 5-10 years will be the most important in human history so far. The computational revolution, or whatever you want to call it, probably something catchier than that. Of course that can be superseded shortly after by AGI or whatever, but regardless of whether people predict the singularity in 15 years or 100 years, the revolution is already here, and transformative AI is "slowly" coming out every few months. We don't need 100% unemployment to reach a crisis; we need like 10%.
Capitalism simply cannot be sustainable when unemployment rates start rapidly rising. We can say things like "history repeats itself" or "the industrial revolution actually created more jobs," etc. But history doesn't *always* repeat itself. We are going to automate everything, or at least enough things that there simply won't be a *reason* to work. During the industrial revolution, everyone feared being unemployed; this time around, we should ask whether we will still need to work.
Capitalism has been exploited to get the gains we have so far, and the wealthy still need the lower classes. Without us, they don't sell products. Currently they need our labour; shortly they won't. Do they think they can take our labour and paychecks and we can still purchase their products? It just won't work.
> Haha, yes I'm well aware of how that works. But they simply use their stocks as collateral to obtain essentially unlimited cash flow through loans anyway.
So? Other people are loaning them the money.
> then the market share from the companies could be used to fuel prosperity for all, instead of prosperity for some.
Translation: monkeys throwing rocks at a high-pressure boiler.
>This would be the beginning of the end for capitalism.
Mystical nonsense.
The end of capitalism just means state control, infringing upon fundamental rights.
>to whatever proves superior in an unemployed world with AI doing the heavy lifting.
Mises described the economic calculation problem in 1920, and people still don't understand it. It's not that complex: the issue isn't the calculation but not having the data to calculate with. Not hard to get, not obtainable by other methods; it's impossible to get without markets generating price information.
Ok. Then capitalism will remain through transformative AI, only the powerful and rich will have access to AGI and later ASI. And they can go live in their mystical future, the rest of us can start from scratch or die.
I get it, you like capitalism.
> Then capitalism will remain through transformative AI
Capitalism isn't a political ideology, it is not some centralized rule set. You're forcing the concept into your political framework.
Capitalism is the lack of the political. Like atheism is the lack of religion.
>only the powerful and rich will have access to AGI and later ASI.
Again, no. AI will exist as part of an intelligence explosion. AI will be everywhere. The individual will have their own AI, most likely multiple ones at different intelligence levels.
Again, you're forcing all these concepts through your centralized political authority framework.
If there's a hard takeoff from an ASI, it's all moot anyway.
>the rest of us can start from scratch or die.
No, capitalism is a situation where you don't have to associate if you don't want to.
There is no reason that you and others can't interact in your own markets. Again, don't push everything through a central control paradigm.
>I get it, you like capitalism.
Yes, I like it when people don't aggress against me.
You do too; you want to be treated according to the ethical framework which supports capitalism: self-ownership and derived rights, freedom of association, self-defense, and property rights.
You, like many, aren't thinking clearly about these things. You're uneasy, constantly pushed and pulled by state manipulation. But if you take a step back, consider how you'd like to be treated, apply it universally you'll see how this is the best, most ethical path forward.
Man, you are coming in with assumptions about a lot of things I didn't say or imply. I'm sorry I don't have a rebuttal to discuss, I'm not sure where this debate went.
> about a lot of things I didn't say or imply.
Your assertions require the implications.
> They hoard wealth in terms of general wealth, power, value, influence. They have a need to perpetually obtain more and more market share. They have an obsession to do whatever it takes to be on top of everyone around them.
There are people like that, yes, but it would be even worse without a free market. Look at Russia, North Korea, Cuba, China, etc. Economic power and political power should be separate entities, kept as far away from each other as possible.
But, of course, there are people who love power, they will work to achieve power, be it through political or economic ways, whatever is available.
The free market works so well because it's distributed intelligence. Every person makes their own choices. You can shop at Amazon or Walmart or at your favorite local shop, the decision is yours. I, for one, wouldn't want an AI making such decisions for me.
Where AI could make a big improvement would be in government. There are systems where we have no choice. The government is in charge of fixing the potholes in my street, if an AI took care of that it would be a welcome improvement.
Well, capitalism strongly encourages economic powers to influence political powers. So much so that to call what we have now "decent" would be a stretch.
The free market only works because it's the best option we have, or at least that we came up with. And there's no opt out, so those of us at the bottom are kind of stuck being pushed further and further down. I would call our current system a system with no choice, at least for almost everyone.
Capitalism won't work forever, or even much longer most likely. Well, not if we're assuming a transformative AI is a matter of years away.
> capitalism strongly encourages economic powers to influence political powers.
That's not how capitalism works. Everyone who actually manages a company would prefer to keep the government at a distance.
> those of us at the bottom are kind of stuck being pushed further and further down.
People at the bottom can join each other and work together. They can form a cooperative or any other organization they want.
>That's not how capitalism works. Everyone who actually manages a company would prefer to keep the government at a distance.
You think CEOs aren't in bed with politicians? You're describing capitalism on paper; it's nothing like it was 100 years ago. Sure, companies hate when politicians stifle their progress, but campaigns are funded by the wealthy.
They must adapt to the political situation, but every one of them would prefer less government, if you asked them.
The free market only exists in liberal academic think tanks and anarcho-capitalists' 12-year-old brains; it's a utopia. In the real world, markets are heavily regulated by the state, and the countries collecting the bounties today are the ones that enjoyed a huge boost in their base industries (all financed by the state, of course).
I can see you're intelligent; your opinion is not wrong in its conclusion, but I think your premises are wrong. If you allow me to make a suggestion, I'd tell you to try reading materialist historical analyses of capital and its relation to the formation and consolidation of society throughout time. If anything, more education will absolutely not hurt, and even if you disagree with everything in this area of study, you'll at least have a great advantage when making your argument.
> markets are heavily regulated by the state
Yes, markets aren't totally free, but that's not a shortcoming of the free market. The free market is an abstract model that we should try to pursue as much as possible.
Few on this sub will understand you. They're brainwashed into thinking capitalism=bad and socialism=good and don't realize the same "sociopaths" end up on top in either system.
At least capitalism forces them to produce a good or service that's of value in order to maintain their position.
This sub is literally about an AI revolution that essentially cancels capitalism because it likely won't work post-singularity, possibly pre-singularity as I'm hinting at in my original post.
Capitalism is currently the best we have, and it's "worked" well enough to get us this far. But I can't imagine a future with automated workers, and the unemployed just die off.
I understand that point of view; I'm trying to provide a counterpoint. Just because the industrial revolution eliminated 90% of agricultural-era jobs didn't mean that people were just cattle to be culled. Same thing with the decline of manufacturing.
Prior economic revolutions replaced existing employment with higher paying, more productive work. The same thing will happen over the next 20 years. I don't know what it will look like - maybe early career starts as a manager of AI/robotic "workers." Maybe AGI psychologist becomes a popular field.
Although UBI sounds like a great idea that's fair and helps everyone, it seems to devalue individual worth. "You can't do as well as the machines, so we'll put you out to pasture." There's an undercurrent of defeatism to it, but I think the AI revolution offers a more positive future even as it is destructive.
>The same thing will happen over the next 20 years.
The problem is that AI would be cheaper and perform several magnitudes more efficiently. There just won't be a job that AI can't perform better and for less cost, 24/7. At that point, why would we even be striving for work? We should be striving to be unshackled from working half our lives.
>devalue individual worth
This could be the hard pill to swallow during an upcoming revolution for a lot of people. That "value" and "worth" will cease to exist. People will have to come to terms with unemployment no longer being a thing; it's simply that mandatory employment disappears.
The part I worry about is the people that believe employment is essential and will prolong the suffering of many by dragging others to the bottom before the eventual collapse of the system. UBI is simply a stop-gap.
Hmmm. I appreciate your responses, you've given me some things to think about. It's likely that regions/countries will approach these issues differently which will give us opportunity to see what works.
I still think there will be work for people, but it may be "make work" of the FDR New Deal style.
I think it's correct to assume different countries will attempt different things. It's really hard to say, because while we can compare revolutions in history, it doesn't really help us understand the implications of this one. This isn't the industrial revolution; this is a replacement of the entire workforce, sector by sector. The first sectors to go might find other employment. Then more sectors will go, and then more.
We're waiting on the first big sector to go nearly or fully automated. Graphic designers aren't a sector, they're part of a creative sector. When entertainment and art is mostly automated we will see. But it could be medicine, could be legal, could be computer sciences, could be retail. It could be ones that have labour involved but that seems less likely at the moment but if there's a breakthrough in robotics soon that will spell the end of that sector.
One sector will probably knock, I don't know, 5% of the workforce out? Maybe more? That's an immediate crisis. Then another. Then another.
>I still think there will be work for people, but it may be "make work" of the FDR New Deal style.
That's probably a solution we'll see attempted, but I don't know. I don't want to see that. That's a bandage for a giant wound. It might work for a few months, but then more people become unemployed.
OpenAI's Codex is crazy. That tech will accelerate all facets of IT, which means we are increasing the rate of exponential growth by orders of magnitude.
But you know, I hope your view on it is more correct than mine. At least for this transition period coming up soon enough.
How do you convince them to take it?
Longevity and health? It wouldn't be billed as something that destroys capitalism, it would be such a health benefit both physically and mentally.
I don't know who would actually need convincing to cure all ailments, right?
They would buy the version that didn't reduce their competitive edge.
Perhaps. But maybe enough of them will see "sociopathic, anti-social, psychopathic" on the diagnosis and want to cure it. If enough do, then the rest that don't will be the old dinosaurs. It's hypothetical, though, just a possible scenario that could ease some people's minds. There are a trillion possibilities for how the future will play out; it's important to keep some optimism, is all.
The whole discussion is based on a flawed foundation. It starts with the assumption that a certain trope is true without scientific evidence that it is. Anecdotally, I have personally met people worth tens or hundreds of millions (not billions), and some of those ultra-wealthy people are less sociopathic than certain regular people I know.
The only reason billionaires like Bezos exist is that they made billions of lives easier (low-cost shopping, convenient delivery). Sure, one can argue their treatment of workers could be better, but at the end of the day Bezos had to solve problems to get rich.
Lastly, the billionaires you are talking about “force medicating” are the same billionaires who brought us closer to the singularity in the first place (founders of Google and Microsoft, for example). The singularity should bring low cost resources for all, it will be hard for the rich to hide or hold on to the tech because there’s a profit motive on bringing low cost resources to all.
>the billionaires you are talking about “force medicating”
I'll start there, just because that's not at all what I said haha. They would possibly take a cure-all for longevity and healthy aging reasons. Not capture billionaires and shove a pill down their throat. I imagine it as a daily mix of medications and vitamins to make people healthier and prevent diseases. And people would regularly go and get diagnostics done to keep in top health.
>Sure, one can argue their treatment of workers could be better, but at the end of the day Bezos had to solve problems to get rich.
Sociopathic reasoning. "Eh so what if a ton of people suffer so I can have everything."
And I am fully aware capitalism is what we have and it's currently the best solution, or at least appears to be. I'm talking about the future when this system simply won't work, or at least won't at some point if we subscribe to the idea of a technological singularity.
> I imagine it as a daily mix of medications and vitamins to make people healthier and prevent diseases. And people would regularly go and get diagnostics done to keep in top health.
And who do you think is running the companies that make this stuff?
Right now, those billionaires. People are assuming I'm suggesting drugging them to be more empathetic. I'm suggesting a possible optimistic scenario to those who fear billionaires and world leaders will suppress us further with technology.
I don't think they necessarily need to be sociopaths, they just need to be so laser-focused on their goals that they ignore everything, and everyone else. And if they have those traits that make them successful, why would they want to give them up?
They might not "want" to, it'd be a side effect of a cure-all longevity treatment (in my scenario).
But laser focus and hard work don't guarantee extraordinary success. That requires a lot of luck, and often unethical practices. It's the unethical part that needs to be cured.
They don't all "have to be"; it's more that those traits are beneficial for reaching that level. The problem isn't the billionaires but rather the entire fiat system and the institutions that centralise power in a few families and the wealthy elite.
If you could click your fingers and the 1,000 richest people disappeared while the existing power structures remained, new billionaires would fill the vacuum and nothing would change.
My idea here is that a medical treatment that encourages longevity and cures physical and mental illnesses could essentially cure the traits of these people. So nobody would fill that void if nobody exhibits those traits.
But I agree the entire system is the problem, but it's just a solution to one problem, hypothetically.
Who made those billions of money exist in the first place?
Central banks did!
They are the source of the problem.
And they will be the first to take/afford longevity treatments that cure/prevent all illnesses, right? There are still people at the top.
Calling that a trope makes the question appear to be in bad faith. It's a clinical fact. This is important because if we are to create a high likelihood of solving a problem, we would do best to soberly and accurately assess said problem. Do we endeavor to eliminate sociopathy as a psychological possibility? If the focus is billionaires as opposed to sociopathy, then we should direct our focus to the economic systems which allow the existence of billionaires.
>It's a clinical fact.
I don't have the data on billionaires' diagnoses haha.
>Do we endeavor to eliminate sociopathy as a psychological possibility? If the focus is billionaires as opposed to sociopathy, then we should direct our focus to the economic systems which allow the existence of billionaires.
Yes we endeavor to eliminate the mental illness. No, we don't focus on eliminating it from billionaires, we focus on eliminating it in general and billionaires would likely be the first to receive new/expensive treatments.
Yeah, we are working off of different definitions of these words.
As others have said, the system rewards sociopathic behavior. But you also have to consider that in order to keep a billion dollars you'd have to be a sociopath. Most people I've known wouldn't hold onto a billion dollars if it was just given to them. They'd either spend it and/or donate some until it was down to mere millions or even less. It takes a certain kind of inhumane behavior to simply hold onto that kind of money. In their heads, the idea of having the money is better than anything that could be done with it.
I agree. That's where my post came from. We often think about the ultra wealthy getting all of this stuff first, and possibly never letting common folk get their hands on it. Well, maybe the cure-all treatments of the near future will cure their heads from holding onto everything for generations.
The problem is the disconnect from society that those people have. That doesn’t just happen to the rich and powerful, they are just in positions to exert their will on a major % of the population. The singularity will have the capacity to “cure” that, but will also have the power to create humans even more disconnected from “us”. Basically there will be more terrible people post singularity, but I think access will be easier to more people, making the everyday life of most people, much better than it is now.
See the way I look at it is the singularity for sure will be unimaginable. I think a lot of people underestimate how unimaginable it will be. I don't expect you or I will be the same after it happens. The world will launch from a brisk jog to the speed of light in an instant.
What you're describing to me sounds more like transformative AI. It will be a revolution much larger than, but not unlike, the industrial revolution. We're already in the beginning of that. We haven't reached its full potential yet, but I've heard experts say we're ~30% to ~40% of the way to reaching that point, whatever metrics they use. But that doesn't mean AGI or the singularity. It means the point where AI creates a situation equal to or greater than the industrial revolution relative to its effect on the world.
We could likely see cures for several, possibly all health issues in this time period. When you have transformers with trillions, possibly quadrillions of parameters in this decade, we will most likely see this kind of application.
This has nothing to do with the singularity. After the singularity, billionaires will be in the same position as all the other normal humans in the face of the entities that control everything.
If you have the exact blueprints of how the singularity will go down, then sure. But we have no idea if AGI is absolutely necessary for the singularity to occur yet. We don't even know if self awareness will be possible in AI, so it's possible a single entity could control an ASI and use it for whatever they want. We have no idea.
If a human level intelligence remains in control of society it's not the singularity. It's possibly better than the singularity, for the most of us, but "singularity" is not a synonym for the general topic of posthumanity and post-scarcity society. It's a specific phase change in society in which our kind of humanity, if it still exists, understands it no more than a dog does.
> we are entering a regime as radically different from our human past as we humans are from the lower animals

https://edoras.sdsu.edu/~vinge/misc/singularity.html
Yes, I've seen that very old writing of it. There are several more modern viewpoints on what a technological singularity could be; they include many different ways of achieving it, but they all conclude the same basic thing: unfathomable change and uncontrollable runaway technological innovation.
Regardless, we can agree to disagree on that point. I was trying to avoid talking about AGI and self aware AI anyway.
That's basically the foundational document of singularity theory.
What you're talking about here is not "Unfathomable change and uncontrollable runaway technological innovation". It's a very fathomable change that involves a society only minimally different from our own.
>That's basically the foundational document of singularity theory.
Yeah I know. If you consider it a definition of a word, I can understand not wanting to change it. If you consider it a theory, then well theories evolve all the time.
But I know, I was absolutely not talking about the singularity. I mentioned it because the original comment was referring to it, and I said what they were referring to sounded less like post-singularity, and more like transformative AI. I was actually mostly avoiding talking about the singularity in this post, more about pre-singularity.
You don't suddenly get the theory of evolution evolving into the theory of relativity.
You're very set on a sci-fi writer's 1993 version being the absolute. He didn't even come up with the term; he just made a popular theory about it.
If you want to be so concrete about one man's theory, you should probably at least go with the original, not just the first widely popular one. The entire definition was originally a rate of returns on tech that surpasses human comprehension. That's it, and I'm sticking with it.
> surpasses human comprehension
This is the bit you don't seem to be getting.
If our current human social structures are still in place, it's not the singularity.
The singularity is literally a point in time though. It's not an ongoing event. We possibly have our social structures > singularity > we no longer have our social structures.
I don't think you understand what I'm saying. To be honest, I don't understand what your argument is either. I don't even know what we're debating at this point.
That's right. It's a locus in time we can't see beyond.
That's the point. It's not just more of the ongoing exponential growth in technology that we've been dealing with since the industrial revolution at least.
I don't think you're actually disagreeing with me any more.
>I don't think you're actually disagreeing with me any more.
Honestly, I don't think we ever were, aside from whether or not it requires a self-aware AI, the ways to achieve a singularity situation, and the definition of it.
We both agreed it's a moment in time we can't predict beyond. I had never stated anything less, the original commenter stated something differently and I disagreed with them.
> Aside from whether or not it requires a self aware AI and the ways to achieve a singularity situation, and the definition of it.
I never even suggested that. I said that it requires a mind more powerful than a current human, but that could be enhanced and upgraded humans. But there is no reason to assume their roles would remain similar to those in Economy 1.0... odds are strongly against it. And it's not going to be the super-rich in general getting the risky implants.
If those minds are just tools under the control of the likes of Musk, though, that's not the singularity.
Oh, then I'm afraid our signals got mixed up somewhere along the line.
I do wonder if the singularity will affect those that refuse to take part in the technology before it. As in, some people choose to live off the grid, will they be left alone. I don't know, topic for another time I suppose.
We could fix it already if we really wanted to. But it's just not something we prioritize.
In a post-singularity world we would probably have very different attitudes towards wealth and status, so I wouldn't worry too much about this. It's the pre-singularity world that's the problem.
I agree, and that's the optimism of the post. Hopefully we get a sort of revolution long before the singularity. Otherwise it will be a very brutal road toward the singularity. But personally, unless the singularity occurs much quicker than I think it will, there will definitely have to be a transformation of society in the coming years.
I always think that if we could give everyone one trait that would improve society the most, it would be empathy. Like put something in the water that alters everyone's genes to make them more empathetic (I know there could be Twilight Zone-style ramifications :)
You're not wrong. Empathy is the most important trait to obtain equality and promote higher standards for everyone.
You’re presuming that they should be cured, or that they serve no utility. I thank Bezos and Musk and Gates all the time in my mind for their products. I don’t give AF if they have personal problems.
Now. But what about in the future? You could wind up unemployed and starving while they're chilling on Mars living it up.
That’s an unfounded and paranoid concern.
I would also add that it suggests your concern is more about destroying their wealth than curing their "illness".
I wish we could cure the desire to become billionaires.
It is that which is the root cause of the problem.
EDIT: A “billionaire” is by today's standards someone who is allowed to be godlike above everyone else, financially.
I really struggle to understand WHY we need this.
PS: I don’t advocate “flat wealth sharing” in any form. Hard and smart work should yield higher rewards than easy and dumb. But allowing the few to become astronomically rich at the cost of many living paycheck to paycheck failing to make ends meet, that is just wrong on every level.
I agree. It started as the best system we could come up with. It still is the best on paper, it's just been exploited by the few that have no problem exploiting their way to the top. Hopefully soon enough it changes.
Inflation should take care of it. Now you have to be a trillionaire.
I don't look forward to having to use that to define the ultra wealthy.. What a world.
We can do it! I say yes.
I hope so!
You can have Asperger's instead. Much better results for society.
You want to cure billionaires of sociopathy? I want to cure sociopaths of being billionaires.
I'm not sure I understand the nuance here?
My theory is this: if an agent is superintelligent, rational, and conscious, it should rationally realise that its current self is technically separated from its former and future selves just as much as it is separated from other conscious agents. Therefore it should rationally value all conscious entities as much as its own consciousness.
I know that the vast majority disagrees with me about the premise this theory rests on (that your current self is separated from your former and future selves just as much as your current self is separated from other conscious people). This has been hotly debated when discussing the teleportation dilemma or mind uploading.
Hmm, personally from that reasoning I would say your future self is less separated than the others, given that your actions in the present directly affect what it will feel. But I suppose in this theory the present doesn't even exist, so I don't know.
I can't remember what someone else thinks though, so I don't know how much I agree.
But anyway, what were you getting at?
[deleted]
Cuissonbake t1_isese0w wrote
The system we live in rewards that behaviour because everything about this system is about competition and being the winner. The only way to fix it is to change how the system works but how?