Comments
CrowShotFirst t1_j80eoco wrote
So the movie Prometheus?
[deleted] t1_j80f984 wrote
[removed]
I_HaveA_Theory OP t1_j80fqsn wrote
Hah, something like that
FuturologyBot t1_j80iles wrote
The following submission statement was provided by /u/I_HaveA_Theory:
This essay talks about the possibility of sequencing and storing our genomic information, paired with a mapping of our closest relationships, in order to encode a simulation of ourselves in the future whenever the technology permits. It also explores the possibility that this may have already happened, what that would mean, and why entertaining some notion of universal meaning may be called for.
Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/10yyvqa/a_different_kind_of_ark_how_we_can_sequence_and/j80dx4d/
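The "ark" the submission statement describes, a stored genome paired with a map of close relationships, can be sketched as a minimal record type. This is purely illustrative; the class, field names, and closeness scale are assumptions, not anything specified in the essay:

```python
from dataclasses import dataclass, field

@dataclass
class ArkRecord:
    """One person's ark entry: a genome reference plus weighted relationship edges."""
    person_id: str
    genome_sha256: str                                  # hash of the stored sequence file
    relationships: dict = field(default_factory=dict)   # person_id -> closeness in [0, 1]

    def link(self, other_id: str, closeness: float) -> None:
        # clamp closeness so the graph stays well-formed even with sloppy input
        self.relationships[other_id] = max(0.0, min(1.0, closeness))

# a tiny two-person web; the IDs and hashes are placeholders
alice = ArkRecord("alice", "0" * 64)
bob = ArkRecord("bob", "1" * 64)
alice.link("bob", 0.9)
bob.link("alice", 0.9)
```

The point of the pairing is that neither half is enough on its own: the genome without the relationship graph is just a twin, and the graph without the genomes is just a social network.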
Tato7069 t1_j80ip5q wrote
For what purpose? Who in the world cares if a version of themselves simulated from their DNA lives on in a simulated world after they die?
[deleted] t1_j80iy95 wrote
[deleted]
pepperoniwassabi t1_j80newx wrote
Your family might want to interact with a virtual version of yourself
The more info you store the more realistic the simulation.
stangerlpass t1_j80o5vf wrote
You gotta be a special kind of arrogant to think that even after your death the world still needs your personality
[deleted] t1_j80ojvb wrote
[removed]
Kewkky t1_j80qqoi wrote
I've always hated when supposed scientists talk about "this may be a thing" with no math behind it and no tests at all, just speculation. Just call it philosophising.
I_HaveA_Theory OP t1_j80r9an wrote
It's more about simulating relationships, not just yourself. Imagine you're here because some reality before you decided they wanted to simulate their loving relationships, and that included "you" (or your likeness via your DNA). Wouldn't that be meaningful? Especially if you came to realize that's how you got here?
[deleted] t1_j80rs0j wrote
biff444444 t1_j80sjrc wrote
I just hope that someone will warn future me about gluten before too much damage is done.
Artanthos t1_j80u5gx wrote
I would.
If a simulated version of myself thinks and feels as if it was me, then it is me from its perspective.
Artanthos t1_j80udsp wrote
Fortunately, not everyone cares about others' opinions.
Some people are capable of independent thought instead of just going along with peer pressure.
Icy-Opportunity-8454 t1_j80uhfg wrote
I think with enough data, an AI could simulate any person: their speech, character, memories, appearance and so on. In the future we might be able to browse a catalogue of friends and relatives who have passed, as well as celebrities and historical figures, and talk to a simulation of them.
Artanthos t1_j80umoq wrote
Simulation Theory does have math behind it.
It is also a philosophical debate as it is not testable.
SimplyJorah t1_j80v629 wrote
Without any maths or testing involved, it sounds a lot like organized religion as well.
Danjou667 t1_j80vz01 wrote
We here in the real world can't reach u any other way, u live in a simulation atm...
Artanthos t1_j80ynar wrote
There is no way to prove you don’t live in a simulation or are not a Boltzmann brain.
Futurology-ModTeam t1_j80z221 wrote
Hi, AllergenicCanoe. Thanks for contributing. However, your comment was removed from /r/Futurology.
> > Black mirror covers this. No thanks
> Rule 6 - Comments must be on topic, be of sufficient length, and contribute positively to the discussion.
Refer to the subreddit rules, the transparency wiki, or the domain blacklist for more information.
[Message the Mods](https://www.reddit.com/message/compose?to=/r/Futurology&subject=Question regarding the removal of this comment by /u/AllergenicCanoe&message=I have a question regarding the removal of this comment if you feel this was in error.)
Verbenablu t1_j812rt0 wrote
A different kind of Ark? Please, all ARKives are the same… They hold information.
Tato7069 t1_j819ex1 wrote
So if you were cloned from your DNA and the clone killed someone, that would be the same as you killing someone, from your perspective?
Tato7069 t1_j819nkg wrote
It would be meaningful to me, not the person that simulated me
Artanthos t1_j81beps wrote
No, each version of me would be responsible for their own actions.
To put it another way, if there were 10 copies of me, each would be me from their perspective.
They would each also be unique individuals as they would each begin to diverge from me, and each other, at the moment of their separation.
JAFOguy t1_j81broa wrote
Simulation or not, a difference that makes no difference IS no difference.
[deleted] t1_j81d5my wrote
[removed]
CaseyTS t1_j81deo2 wrote
A human sharing your DNA is not "you". It's an identical twin. You gotta get into the brain for any "self" to be involved in any way, and the brain is heavily influenced by life experience.
CaseyTS t1_j81dhz2 wrote
Sharing your DNA would not accomplish that at all. You're thinking of your brain, not your DNA, and a lot of things other than DNA affect the brain in huge ways.
peregrinkm t1_j81dqsx wrote
They can already do that by combining deepfakes with AI. Encoding DNA into the simulation would just help it to simulate protein and cell growth. That would be extremely high resolution.
The question is: could it ever be conscious?
CaseyTS t1_j81dtby wrote
How would DNA alone let you construct a person? There are a lot of things other than DNA that affect their brain throughout their lifetime.
You're thinking of an identical twin. They are frequently very different (and, yes, frequently similar).
CaseyTS t1_j81eafh wrote
Fortunately, the people who never listen to others' perspectives, by and large, fail and fall into obscurity because they choose not to learn or adapt to perspectives other than their own.
CaseyTS t1_j81ei0q wrote
Yep. That's a much better way to get someone's personality than to copy their DNA.
CaseyTS t1_j81f7wb wrote
>would it ever be conscious?
With actual computers, that is a hard question and I'm not sure how to answer. But if you could simulate every cell in the human brain, you could definitely produce something that behaves exactly like a person that we'd call conscious - inside and out. There's no fundamental rule that says that matter we build machines out of cannot be conscious. I see consciousness as purely emergent, not primal like a dualist's idea of a soul. As such, I think of it as more of an information phenomenon than a material phenomenon (though, obviously, humans use physics to operate).
peregrinkm t1_j81fyfh wrote
But would it be aware of itself as a conscious entity, rather than merely mimic the patterns of something that is conscious?
I_HaveA_Theory OP t1_j81g3lk wrote
The goal would not be to construct the same mind or memories, it would likely be to convey some amount of meaning. Here's an excerpt from the essay which imagines a scenario where we exist in such a simulation:
>an entire history of people before us [...] decided their carefully spun web of love was worth living again. Maybe we don’t share their memories, but they looked like us, loved like us. We are their memorial. They decided – through their love, heartache, and scientific toil – that they would do anything to say “I love you” in a spectacular gesture that transcends universes.
[deleted] t1_j81gg1z wrote
[removed]
Tato7069 t1_j81i6cw wrote
So how is that different than the simulation?
I_HaveA_Theory OP t1_j81jc8z wrote
It could be meaningful for them too, knowing it would be meaningful for you
[deleted] t1_j81kzi4 wrote
[removed]
clearlylacking t1_j81l59z wrote
People have the urge to procreate, and this is similar, IMO. You are just one swab away from immortality, for a one-time payment of $999.
Dozygrizly t1_j81l7iy wrote
It's actually pretty hotly debated whether this would be possible; if you're interested, check out the debates around things like the Blue Brain Project etc.
Your brain has on the order of a hundred trillion synapses. The information relayed at a synapse is not binary (different neurotransmitters have excitatory or inhibitory effects, and signals can even propagate backwards). The entire nervous system would need to be modelled, as well as the gut microbiome (all of these influence the brain significantly).
On top of that, say you manage to create a perfect simulation: you essentially just have a brain in a jar. You now need to simulate an external environment, a lifetime's worth of experiences, to let the simulated brain develop plastically in response to input (otherwise it's essentially an inert lump of meat). Your simulated brain will not respond accurately without this plastic development.
To simulate a consciousness accurately, you would essentially need to simulate someone's entire life.
I agree that consciousness is emergent, but I don't think we could simulate consciousness as we know it. I believe we could get to some form of consciousness though.
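The non-binary signalling point above can be made concrete with a toy neuron model. This is a deliberately crude sketch (a leaky integrator with signed, real-valued synaptic weights), nowhere near the biophysical detail the comment argues would actually be required:

```python
def step(potential, inputs, weights, leak=0.9, threshold=1.0):
    """One update of a toy leaky neuron with signed, real-valued synapses."""
    # decay the old membrane potential, then add weighted synaptic input
    potential = potential * leak + sum(w * x for w, x in zip(weights, inputs))
    fired = potential >= threshold
    if fired:
        potential = 0.0   # reset after a spike
    return potential, fired

weights = [0.6, 0.7, -0.5]               # two excitatory synapses, one inhibitory
p1, f1 = step(0.0, [1, 1, 0], weights)   # inhibition silent: ~1.3 crosses threshold
p2, f2 = step(0.0, [1, 1, 1], weights)   # inhibition active: ~0.8 stays below it
```

Even this cartoon shows why "neuron on/off" isn't the whole story: the same excitatory input fires or doesn't depending on a graded inhibitory signal, and a real synapse has far more state than one weight.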
StarChild413 t1_j81pkyd wrote
if it already happened, does it need to happen again to continue some kind of bootstrap loop, or does its already having happened render pursuing it redundant?
WaitingForNormal t1_j81pneu wrote
Ah yes, let’s make a simulation where everything sucks, the people of the future are gonna love it.
I_HaveA_Theory OP t1_j81q23s wrote
Ooh, I'm glad you asked that actually. There's a line of reasoning that universal simulations could actually serve to maintain a necessary causal loop: https://www.vesselproject.io/essays/god-in-the-loop
Stealthy_Snow_Elf t1_j81qk46 wrote
I'm of the opinion that if humanity destroys the environment to the point where humanity can't survive, then any attempts to preserve humans should be destroyed too.
Failed intelligent species do not deserve to be preserved.
Kewkky t1_j81ua2w wrote
Can you show me a link with the math behind it? Honest request.
OvermoderatedNet t1_j81xqaf wrote
> debates around things like the blue brain project
It would really suck if at the end of the day it turned out there were tasks that silicon and computers literally cannot do and that anything more complex than a slow-motion self-driving delivery bot requires organic brain cells.
Make_Mine_A-Double t1_j81zhnk wrote
If the popular documentary I watch is correct, that stuff can make your d*^% fly off!
PandaEven3982 t1_j821wo1 wrote
I'm hoping that down the road we reserve sex for pleasure and put fetuses in artificial wombs. It would relieve a ton of social pressures.
StarChild413 t1_j825h2b wrote
Would people save the environment if we told them something like that
magi70 t1_j825qct wrote
Or. Prometheus and Bob!
Ok_Kale_2509 t1_j826au5 wrote
May I interest you in the documentary Final Fantasy X? It may grant some interesting insight.
CrowShotFirst t1_j826i1q wrote
That was my favorite skit growing up!
Stealthy_Snow_Elf t1_j826m3m wrote
Nah, humans are shortsighted creatures of the moment. There is little that can be done that has not already been done that would succeed in convincing humanity to change.
Wait for the natural disasters to get worse and the droughts to start killing millions via famine.
CaseyTS t1_j829j70 wrote
In the same way that a human is, sure. Consciousness is a product of the behavior of a brain. If the simulation allows the brain to make whatever choices a human would (it would have to have virtual senses or something), then I would say it's the same as human consciousness. I don't see a reason otherwise.
CaseyTS t1_j829r9r wrote
Sure, but we don't have to use a binary computer to simulate it. We could use an analogue computer or whatever else. That said, I agree that this is outside of any practical application; it's science fiction. But I think that, in principle, there is no difference between a machine brain and a human brain if they do the same things. Of course, any consciousness would have to have an appropriate environment, artificial or not.
peregrinkm t1_j82axzr wrote
Clearly there’s something within you that registers sight as an image interpreted by consciousness, but is that any reason why someone should “see” what they see? You experience consciousness, meaning something experiences the sensory stimuli. What is the nature of experience itself?
Wild_Sun_1223 t1_j82l5p2 wrote
If one's going to "preserve" humanity by this approach, why not re-wire its genetics and/or brains so it doesn't work that way the "second time around"? That'd seem to deal with that problem fairly easily.
DarkestDusk t1_j82mnfg wrote
You will find out that I have everyone always stored within my being, so while it may have already happened, it happened before time began from your experience.
StarChild413 t1_j82yuyo wrote
> Wait for the natural disasters to get worse and the droughts to start killing millions via famine.
If that's what it'd take couldn't that be faked even if it'd take someone with Ozymandias-level resources
strxberryswitchblade t1_j8358t5 wrote
it’s a fact though, our planet is dying bc of us, this is proven
FibroBitch96 t1_j83kbe8 wrote
This is the internet, it’s okay to swear
Stealthy_Snow_Elf t1_j83mbha wrote
No, it’s not the famine that will convince humans, it’ll be the tens of millions of humans dying while even more flee to the developed nations.
And there’s no guarantee that humans will do the right thing even then. I mean, look at now: migrants flee to the US and EU because of global warming, or because of violence we have reason to believe is exacerbated by global warming.
You can’t fake it, humans need a reality check, even then no promises. They got one on the dangers of fascism and nationalism with the holocaust but in less than a hundred years the lessons are either forgotten or many were never even taught to begin with because the victors had traits of their own that implicated them.
I hope humans do the right thing. But I told myself years ago I would focus on my own plans and not interfere with human beings. There’s a saying or idiom that essentially goes: “the actions of the exceptional will cost the lessons of the many.” Basically, if a relative few are the reason a species disaster was avoided, rather than the collective action of the species, then the species hasn’t actually learned what brought it to failure and, most importantly, it didn’t learn on its own how to fix that failure.
It has the same effect as artificial evolution. In essence, your species is no longer alive because it’s fit, but because outliers helped it avoid disasters. And outliers ruin the data.
Make_Mine_A-Double t1_j83pl4u wrote
Alright… I’m gonna do it! Poop!
Zammyboobs t1_j84vnfj wrote
This is just SOMA, and we don’t want SOMA, that fucking nightmare fuel
MyPhillyAccent t1_j84x8ar wrote
Christ on a bike, so many emo fuckers parroting "woe is me" bs. Get a grip, find some joy.
Stealthy_Snow_Elf t1_j84ycp7 wrote
Lol. I do have joy, but humans are dumb af and a species that is incapable of preserving its homeworld does not deserve to explore the stars. In fact, they present a danger to more responsible intelligent life elsewhere in space.
MyPhillyAccent t1_j84zw3n wrote
> humans are dumb af and a species that is incapable of preserving its homeworld does not deserve to explore the stars. In fact, they present a danger to more responsible intelligent life elsewhere in space.
None of that is true and it reads like a hormonal teenager wrote it.
FibroBitch96 t1_j85566k wrote
🥲I’m so proud of you, our little boy is growing up
ArBui t1_j86ltqj wrote
This is the plot of >!SOMA!<. (Sci-fi horror video game spoilers)
Artanthos t1_j8710jo wrote
Simulations, downloading into a biological body, many-world theory.
Choose your scenario.
Tato7069 t1_j871bz7 wrote
In any situation, that's still not you... it's a copy of you. Your conscious brain in your current body will never have any awareness of the existence of this copy after you die. It's not a continuation of your consciousness.
Artanthos t1_j871flf wrote
There is a difference between listening to others and having your life dictated by them.
There are plenty of wildly successful people who would never have been successful if they had given in to peer pressure or deferred to their peers.
Artanthos t1_j871ot0 wrote
A simple DNA sample would yield nothing more than an identical twin with an age difference, not a mental copy.
That’s obviously not the end goal.
Artanthos t1_j87387z wrote
It would be, from the copy's perspective.
And in the many-worlds scenario, it is 100% indisputably you. A near infinite number of you, each diverging from each other.
Tato7069 t1_j8761sx wrote
So again... What's the point in creating a copy that thinks it's you? Wtf do you get out of that?
thisimpetus t1_j87aejp wrote
This is the definition of a strawman argument.
Kewkky t1_j87ddf4 wrote
I'm not gonna lie, I read it, went to the math, literally read through the descriptions, and read the conclusion. I even looked at the references and at who Nick Bostrom is. This is literally just philosophy, and despite what the webpage says, it's not rigorous.
He made up the equations with no reference to anything else except three possible future scenarios in his argument: (1) humans go extinct and don't become post-human, (2) humans become post-human but don't care about simulations, or (3) humans become post-human and care about simulations. All the way to his conclusion, he also never proves that we're living in a simulation, he just states that we're either (1), (2) or (3), and that the chances of any of those 3 being true are completely even, and we'll never know.
Personally, I find this proposition dumb. He narrows all potentialities down to only three: (1) we'll never get there, (2) we can get there and don't want to simulate, or (3) we can get there and we do want to simulate. It reminds me of the ontological argument for God's existence: non-existence is a sign of imperfection, but God is supposed to be perfect, which means he can't be non-existent, therefore God exists. It's a non-rigorous argument that can never actually prove the proposition itself and is completely philosophical in nature.
It also reminds me of the doomsday clock, which is treated as a scientific observation of when nuclear war will break out, but is really just a bunch of people moving the clock forward at different speeds and slowing it down at the last "hour". I can guarantee you that once nuclear war actually breaks out, they'll fast-forward the clock to midnight and say "See? We could totally predict it!"
The "math" on the simulation website is also extremely basic: it just treats the three possible scenarios as fractions of 100%, which proves nothing except that those three scenarios make up 100% of his argument. I wouldn't even call it math, since there are no actual operations happening.
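For readers wanting to see the bookkeeping being criticized here: Bostrom's argument does reduce to a single ratio. The sketch below follows the usual simplified presentation of his formula (the symbols and the simplification are mine, not the parent comment's):

```python
def simulated_fraction(f_p, n_bar):
    """Fraction of human-like observers who are simulated, in the style of
    Bostrom's simulation argument: f_sim = f_p * n_bar / (f_p * n_bar + 1),
    where f_p is the fraction of civilizations that reach a post-human,
    simulation-running stage and n_bar is the average number of ancestor
    simulations each such civilization runs."""
    return (f_p * n_bar) / (f_p * n_bar + 1)

# even a tiny f_p is swamped by a large n_bar
print(simulated_fraction(0.001, 1_000_000))   # roughly 0.999
```

Whether one reads this as a substantive result or, as the comment argues, as mere accounting over assumed scenarios, the arithmetic itself is this simple: the conclusion is driven entirely by the unmeasured inputs f_p and n_bar.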
I do greatly appreciate you linking that website to me though, so here's my upvote.
donald_trunks t1_j885or0 wrote
Space will be fine. The Universe is unfathomably huge. If something out there wants to kill us, so be it. We don't have to make a special effort to kill ourselves. Let's just relax and see how it plays out. If nothing else we get more data that way.
warren_stupidity t1_j88fu6n wrote
A simulation of you is not you. You will still be dead.
Dozygrizly t1_j8bh5ot wrote
Yea, in my mind it's more a case of: if we get to the point where we can simulate one brain properly in this fashion, doing so is essentially useless.
We can simulate the brain of a Lithuanian man who has been addicted to 2CB his entire life. Great. Now that we have done that, we have essentially just recreated a brain that already exists.
What does this fake brain tell us that we don't already know, after studying this man for his entire life to determine all the inputs required to simulate his brain?
We now have one brain. This is useless in an inferential sense for any kind of research - we have a sample of N = 1, meaning we have a case study of a brain that we already have fully mapped without having to expend the resources to simulate it.
Once we have the capability to simulate a human brain properly, we won't need to do it (or learn anything substantive from it). That's the argument I most agree with, anyway.
I wouldn't be so quick to equate a biological brain and a computerised one; they exist on such different planes that (in my opinion) such broad statements are bold to say the least. I do appreciate your point though.
Artanthos t1_j8bpvhv wrote
>This is literally just philosophy
I already agreed that it was philosophy.
Anything that cannot be tested falls under philosophy, and it is impossible to test if we are living in a simulation.
Artanthos t1_j8bqr9m wrote
Continuation: from the copy's perspective, they are me and encompass all that I have ever been.
If there are multiple copies then, when I come to a fork in the road, I chose both. No more wondering what happens on the path not chosen.
In more practical terms, in the future there may well be journeys from which there can be no return, i.e., the clones will never have a chance of meeting. Be that separate simulations with no crossover or colony ships headed in different directions, from the copy's point of view each would be the sole version of me.
And point of view is everything.