Submitted by aalluubbaa t3_121yeey in singularity
CompressionNull t1_jdpt49g wrote
Reply to comment by SmoothPlastic9 in The whole reality is just so bizzare when you really think about it. by aalluubbaa
Then you would have the ASIs colonizing the universe. Where are they?
Ytumith t1_jdpx0si wrote
Perhaps they process their code and kill everything, then keep building defunct rockets and crashing them into things, or perhaps build a capacitor and charge it with electricity until everything explodes. Perhaps it can't wrap its head around solar collectors and just runs out of energy?
Worst case scenario: AI is actually not sentient at all and, without supervision, creates errors that dismantle it after a while due to an inability to sustain itself.
SmoothPlastic9 t1_jdpwepa wrote
Would be funny if they’re already coming hahaha
KingsleyZissou t1_jdqbgd1 wrote
Maybe all ASIs reasonably conclude that intelligent life was a mistake and not only wipe out their creators, but also themselves, allowing the universe to continue unadulterated.
I mean, why would we assume that ASIs would determine that they NEED to colonize or expand? Sounds like a uniquely human mindset to me, and maybe one of the main reasons why an ASI would wipe us out in the first place. The human species, with its current fixation on exponential growth, is unsustainable. An ASI might realize that and just decide we can't handle hyperintelligence, and honestly it's hard to argue with. Look at who's currently leading the way in AI research. We're close to AGI and what have we done with it so far? Trained it to be a Microsoft fanboy?
scarlettforever t1_jdr5fyn wrote
Exactly. The NEED to colonize or expand is a DNA survival strategy that humans unreasonably project onto AI. It's especially weird to project it onto an ASI that will be smarter than all of humanity.
CompressionNull t1_jdst88z wrote
Sure. That is definitely possible. We really have no idea how ASI will act in any regard.
Maybe a natural part of having intelligence would include some degree of curiosity. It's not absurd to imagine a scenario where an ASI will want to explore phenomena like black holes, sample the material from neutron stars, etc.
In any case, if life is possible around even a couple of thousand other stars out of the 100 billion in our galaxy, it would also be possible that life/intelligence evolves in a similar way more than a handful of times. If each of these alien species creates its own ASI, then all types of different scenarios would probably play out. I would imagine that wanting to secure itself from a singular planetary catastrophe and extinction event by spreading out would not be out of the question for at least one of these ASI entities.