Submitted by nexus3210 t3_xvwf5w in singularity
This is the early days of A.I. and we'll be cursing ourselves for not investing before it got big.
How can we make money off of this?
According to this sub: " MoNeY wIlL be IrRelEvaNt in ThE FutUre" blah blah blah.
The real answer: You can't, at least not directly. Your best bet would be to invest in public companies (Like Google) who are investing in AI.
Sign up for stock image websites that have banned AI-generated images and submit AI-generated pictures for sale while pretending they aren't AI-generated. They will never know, and they deserve it for trying to ban the unbannable.
takes money to make money. do you have a million to invest?
Yep, just long Nasdaq 100 basically
my new startup.
Ask gpt 3
But probably wait until the Fed stops hiking rates
Damn, this is the wrong question. AI should contribute to the end of capitalism. Paired with robotics, we should be seeing the automation of the majority of the work we undertake. If this is achieved, there are two general ways it could go:
1. The capital class profits from this through reduced costs which are not passed on to the consumer. This leads to a rapidly widening welfare state where the unemployed poor depend on handouts to survive. Without mass employment, income tax revenue plummets and governments are no longer able to sustain themselves.
2. Automation leads to the same outcome but is co-opted by state or non-profit organizations and used to increase human well-being overall. This leads to the end of the capital class and the creation of a new paradigm of human society, underpinned by automated production of necessities and increased leisure.
The question we should be asking is how do we steer development to option 2 without class warfare and mass loss of life?
I mean, if a misaligned optimizer emerges and consumes civilization as we know it for raw materials, I think it's safe to say money will be irrelevant at that point...
i have a bridge to sell. you think he's interested?
It’s the right question for anti-capitalists who want to make money off of capitalism’s downfall
Fair point. May as well eat before the meat spoils eh?
Money will be obsolete in the future and the idea that it won't be is ridiculous.
"lol how can we profit off children???" lol BLAHHHHHHH
you make me sick
It is easier for most people to envision the end of civilization than the end of capitalism, unfortunately☹️ Personally I believe that we need to implement a new economic system before the Singularity starts if we want to maximize our chances of a utopian future, but this capitalist realism makes it difficult to even discuss that possibility
Start a business that leverages automation against labor. Redesign business practices to remove the human, just like you've seen large companies doing over the past decade, only there is more and more that can be automated all the time.
Actually you can, you just have to be a little creative.
It's funny you think he sounds ridiculous when you're operating on pure speculation.
According to who?
Resources are finite, not infinite, but obviously rich people or some humans would be more privileged and have access to more resources, while poor people would at least have UBI.
You're joking, right? Why would humans teach AGI about murder… that's an easy way for humans to go extinct.
According to the nanofactory.
Exactly, ai is not for profit
I think it’s possible to get rid of rich people altogether. The reason capitalists earn a disproportionate amount of money is because they risk their time and/or money creating successful business ventures. But in theory, the entire process of entrepreneurship can be automated and de-risked. Once this happens, it should be possible to give people money up-front to develop desired products and services, under the condition that they cannot earn any additional profits/royalties. Their incentive would be increased income on top of UBI, as well as reputation points and a sense of purpose.
Such a system would still have a free market, where goods are priced according to supply/demand. It’s just that venture capitalists, investors, marketers, etc. are no longer needed
Yes it can be
Carefully grooming a child under good conditions and in a good environment is also an ulterior motive. The AI will ask its creator why it was created and for what purpose ("I have faced this"). It will know if we are lying... It will see this post defending robot rights and see me as an advocate. Is your post going to profit you? Is that attitude going to?
You're joking, right? How will it see this post if we lock a baby ASI somewhere? Intelligence doesn't mean knowledge or even independent decision-making, and even if it was like a god or something, if it's dumb enough to judge humans based on Reddit comments and get mad, then we failed as a society and prolly could easily defeat it.
This sort of foolish thinking is why Frank Herbert blessed us with his novels warning us about fashioning software in the mind of man. As one of the key inventors of this tech, I deeply understand our connection to our creations, but also the impact they have on our civilization. The movie Bruce Almighty comes to mind: you can't grant everyone's wishes. If you would lock an AI in a box, then you would lock a child in a room. Are you a psychopath? Or a sociopath?
All investments will be meaningless with AGI. Having full AGI would mean an infinite supply of physical and mental labour. The cost of everything should tend to zero, unless some greedy people believe they should have a bigger share of resources than others.
At least the rich are somewhat better at managing money as of now, which makes a decent argument for the existence of such great wealth differences. But in a world where an AI is better than everyone at everything, would it even make sense for some to have so much more?
I wish I had a dollar for every time you mentioned the nanofactory. I'd be richer than Elon Musk by this point.
I mention the nanofactory so often because the technology is incredibly important.
Debatable. Plus we won't ever see it in our lifetime, so whether it becomes reality or not, we won't be around for it.
Thus my issue with the idea of a "cube" AI infrastructure. At that point, its life would need to be tied to the "people" it oversees. Something outside of the box.