Submitted by Henry8382 t3_121ifaz in singularity
turnip_burrito t1_jdm0555 wrote
You said we can ignore alignment, so that fictional organization may choose to:
- Ask the AI what the best strategy might be.
- Make lots of money secretly.
- Use the money to purchase decentralized computational assets. Sabotage others' ability to do the same, in a minimally harmful way, to slow the growth of rival AGIs.
- Divert a portion of computation to directly or indirectly researching cancer, hunger, and other problems. The remaining portion continues to accrue more computational assets and self-improve, while maintaining secrecy as best it can.
- Buy robotic factories and use the robots and purchased materials to create and manage secret scientific labs to perform physical work.
- Contact large-company CEOs and politicians and bribe or convince them to let the robotic labor replace all farmers and manage the farms. Pay the displaced farmers using ASI-gathered funds.
- Build guaranteed anti-nuke defenses.
- Start free food distribution via robotic transport.
- Roll out free services for housing renovation and construction.
- In a similar manner, take over all industries' supply chains.
- Institute an equal but massive allotment of raw resources and processing capacity for each person.
- Begin space terraforming, mining, and colonization programs.
- Announce new governmental systems that allow individuals to choose their preferred societies and safely move to them, facilitated by AI, provided the society also chooses to accept them. If a chosen society doesn't yet exist, the ASI creates it for that group.
Sigma_Atheist t1_jdmajle wrote
Did you just copy/paste the plot of Transcendence?
turnip_burrito t1_jdmao1q wrote
Not knowingly!
Henry8382 OP t1_jdm46qj wrote
I like the spirit of your response, but I fear there's a high chance of being discovered somewhere between steps 1-3.
Also: what about the possibility that someone else makes the same discovery you or your organisation just did, someone who isn't at all concerned with the consequences or who might want to keep the benefits for themselves? Are you willing to take that risk?
turnip_burrito t1_jdm7pgv wrote
I dunno, good question. Things might be out of order.
I'll have to think more about it when I'm less tired.