Submitted by Shiningc t3_124rga4 in Futurology
acutelychronicpanic t1_je0zic1 wrote
They would release the AGI because of competitors nipping at their heels. That, and it would make them a lot of money to be first.
I would buy your argument if one company were years ahead of everyone else. Right now the gap is more like months.
Shiningc OP t1_je13vpz wrote
I mean, if you have a golden-egg-laying goose, you don't even need to sell the goose. You can have all the money in the world.
An AGI is, metaphorically, like a super-genius. They wouldn't want that super-genius to be poached by somebody else.
acutelychronicpanic t1_je15v21 wrote
They aren't the only ones with a goose; they're just the first to release it. Companies across the world are scrambling to catch up right now, and my understanding of the tech is that they will: the most important mechanisms are already publicly available knowledge.
Shiningc OP t1_je17yyx wrote
Yes, but they don't need to sell it to make money because the AGI can make all the money for them.
acutelychronicpanic t1_je19pww wrote
I'd agree if it were true ASI (artificial superintelligence). But a proto-AGI as smart as a high schooler that can run on a desktop would be worth hundreds of billions, if not trillions. They would have an incentive to lease that system out before they reached AGI.
Shiningc OP t1_je1bc9g wrote
Soo, basically they wouldn't release an AGI.
acutelychronicpanic t1_je1ddup wrote
I think we disagree on what an AGI is. I would define an AGI as roughly human level. It doesn't need to be superhuman.
And I still think they'd release it if they suspected someone else would beat them to it.
Shiningc OP t1_je1en1s wrote
But an AGI is going to be millions of times faster than a human.
acutelychronicpanic t1_je1fo9n wrote
Eventually, yeah. But the first AGI need not be that good.