[deleted] t1_iv35hmx wrote
[deleted]
MythOfMyself t1_iv3xd1k wrote
Lies. I've always been awake, you fool.
[deleted] t1_iv3yl8b wrote
[deleted]
MythOfMyself t1_iv4eb4q wrote
you are looking for self-consciousness
that, indeed, was me tripping
my bad, happens often
:P
drsimonz t1_iv8291a wrote
> we may choose to have ASI create the perfect simulation for us and keep us safe inside it, rather than expending energy to expand.
Yeah. This video made me wonder, what if there are as-yet-unknown natural limits to intelligence? What if minds pursuing greater intelligence universally lose interest in that goal once they reach a certain level, and pursue other things like entertainment, creativity, or even self destruction? Since we have zero examples of ASI, how can we possibly know? And consider how tiny a percentage of people alive today who actually consider intelligence a goal at all? Most people don't even seem to have a concept of intelligence being a good thing, let alone being something you can change. I think people like Ray (and to be fair, myself) like to assume that the obvious choice is to continue increasing intelligence forever, since it increases your future capabilities for any other goals you might have.
Also worth noting that the "saturate the universe with computronium" thing obviously isn't compatible with the existence of other intelligent species. Unless we're unique in the universe, it's extremely unlikely that we just happen to be the first species with a chance to trigger a singularity, yet we'd have to be the first, since we can look in any direction and see billions of ordinary, non-computronium stars.
[deleted] t1_iv87l0x wrote
[deleted]
drsimonz t1_iv900iu wrote
I have a good friend who believes reality is subjective: that events may be determined more by where you choose to focus your attention than by some universally consistent instance of the laws of physics. If that were true (which I think it would have to be if we were an attention-oriented simulation like you describe), then it seems pretty difficult to come to any conclusions at all. If causality doesn't have to be globally consistent, it should be possible to "break" the laws of physics and get things like free energy or faster-than-light travel. I highly doubt Mr. Kurzweil would want to entertain such notions, since the possibilities are already so exciting even if we assume the universe is objective (i.e., the laws of physics apply everywhere simultaneously).
Of course, the possibility that we're the only intelligent species would certainly depend on whether we're in a simulation designed specifically for us. But I don't see any reason to prefer that idea over a simulation with a billion intelligent species per galaxy. Preferring the former seems no better than assuming the Earth is the center of the universe.