ribblle
ribblle OP t1_jego2mx wrote
Reply to comment by Surur in The only AI that the US should be trying to make by ribblle
If you want to minimize the risk of AI, you minimize the actions of AI.
This isn't actually good enough, but it's the best strategy if you're forced to make one.
ribblle OP t1_jefti0b wrote
Technically, silicon Goku. No saving cats from trees here; world-threatening things only.
Submitted by ribblle t3_127u3wc in singularity
ribblle t1_j64dh51 wrote
Reply to comment by Rumblestillskin in The next globalisation: there is growing support for the idea that the world is experiencing not 'deglobalisation' but rather 're-globalisation', owing to accelerating changes in energy and technology. by Vucea
They care more, but are they more concerned about it?
ribblle t1_j64cy8f wrote
Reply to comment by Surur in The next globalisation: there is growing support for the idea that the world is experiencing not 'deglobalisation' but rather 're-globalisation', owing to accelerating changes in energy and technology. by Vucea
At the moment it's making them richer. At the moment.
ribblle OP t1_j2pohue wrote
Reply to comment by WhoopieGoldmember in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
There's not a damn thing that can be done about a potential fire and ash outcome. It's a matter of luck - and I'm considering what kind of luck it would take.
ribblle OP t1_j2pcm15 wrote
Reply to comment by frenetickticktick in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
Too young to have panicked.
ribblle OP t1_j2pcfdr wrote
Reply to comment by DoesntWantToBe in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
Christ man. I'm a paranoid schizophrenic for having my worries about rapid technological progression. Fucking reddit.
ribblle OP t1_j2p44l9 wrote
Reply to comment by khamelean in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
Eugenics is synonymous with gene editing, a very real and worryingly far-along technology.
ribblle OP t1_j2p1sy3 wrote
Reply to comment by TzedekTirdof in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
Good luck dealing with all this 6 days a week.
ribblle OP t1_j2p1hq0 wrote
Reply to comment by Federal-General-9683 in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
I'm an atheist, but you actually get diminishing returns from this level of technology when it comes to religion. If the technology is crazy enough, people become more willing to believe in the fantastic, not less. If things get chaotic enough, they just get more keen on cosmic order.
ribblle OP t1_j2p0yl1 wrote
Reply to comment by WhoopieGoldmember in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
I have a survivor's priorities. Why focus on the world where you lose?
ribblle OP t1_j2p0uap wrote
Reply to comment by DoesntWantToBe in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
The problem is, these things aren't just information. They would be constant annoyances, dangers, and threats. People don't actually just move on. The gears turn, and sooner or later you start worrying about nuclear weapons again. This stuff would represent a much more immediate and often personal problem, with the stress to match.
But let me not bury my point. If you have all these things constantly in your view, you're just not going to have the same experience we're accustomed to today. You could end up living like a rabbit, mentally running away from one strange question after another.
ribblle OP t1_j2ozod9 wrote
Reply to comment by General_Josh in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
This wouldn't merely be information in the world you live in. All of these things would have very real, unignorable impacts on your life. You can't just sleep on robots becoming commonplace or gene editing shaping your world. And it's very hard to adjust to in reality.
And then you go to bed and wonder: if this is part of your life now, what will tomorrow bring?
You see the problem?
ribblle OP t1_j2oz9dp wrote
Reply to comment by khamelean in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
All of this stuff has the potential to actually come knocking. You can't just ignore commonplace robots and eugenics, and trying to steel yourself for it is a full-time job.
ribblle OP t1_j2oz4bq wrote
Reply to comment by khamelean in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
It's not the concepts, it's the day-to-day of living in a world where you have no real grasp on the possibility space. A robot here, some eugenics there, and a little bit of AR sprinkled on top, and you're already in a far from ideal place mentally.
ribblle OP t1_j2o1xkk wrote
Reply to comment by kyoko9 in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
No. People need a lot of things, and mundanity and a certain level of existential inertia are among them.
ribblle OP t1_j2o1qwp wrote
Reply to comment by NeadNathair in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
No, they wouldn't. They'd just contextualize us as low-level wizards and, on a deep level, move on.
You can't live naturally in the context I've described above; the sheer overstimulation makes it inherently maddening.
ribblle OP t1_j2nzmmg wrote
Reply to comment by kyoko9 in How Would our Worldview Have to Change for the Human Feel to remain Familiar? by ribblle
Or maybe we just demand certain experiences, as part of our nature?
Submitted by ribblle t3_101k224 in Futurology
Submitted by ribblle t3_zo6fse in Futurology
Submitted by ribblle t3_ycr5fy in singularity
ribblle t1_ir1wxve wrote
Reply to What happens in the first month of AGI/ASI? by kmtrp
Plot twist, or everything's a headache. Gods are boring.
ribblle OP t1_jego522 wrote
Reply to comment by Iffykindofguy in The only AI that the US should be trying to make by ribblle
You realize most people don't have faith in the singularity being a safe goal.