Comments


I_HaveA_Theory OP t1_j80dx4d wrote

This essay talks about the possibility of sequencing and storing our genomic information, paired with a mapping of our closest relationships, in order to encode a simulation of ourselves in the future, whenever the technology permits it. It also explores the possibility that this may have already happened, what that would mean, and why entertaining some notion of universal meaning may be called for.
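
If it helps picture it, here's a purely hypothetical sketch of what a single "ark" record might contain; the field names, paths, and structure are my own illustration, not something specified in the essay:

```python
# Purely hypothetical sketch of one "ark" record: a pointer to a sequenced
# genome plus a map of that person's closest relationships.
# All identifiers and paths below are made up for illustration.
ark_record = {
    "person_id": "person-001",
    "genome_uri": "storage://ark/genomes/person-001.fasta",  # assumed storage location
    "relationships": {
        "partner": "person-002",
        "daughter": "person-017",
    },
}

# The idea, as summarized above: archive enough of these records, and a future
# civilisation with the right technology could reconstruct not just individuals
# but the web of relationships between them.
```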

1

FuturologyBot t1_j80iles wrote

The following submission statement was provided by /u/I_HaveA_Theory:


This essay talks about the possibility of sequencing and storing our genomic information, paired with a mapping of our closest relationships, in order to encode a simulation of ourselves in the future, whenever the technology permits it. It also explores the possibility that this may have already happened, what that would mean, and why entertaining some notion of universal meaning may be called for.


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/10yyvqa/a_different_kind_of_ark_how_we_can_sequence_and/j80dx4d/

1

Tato7069 t1_j80ip5q wrote

For what purpose? Who in the world cares if a version of themselves simulated from their DNA lives on in a simulated world after they die?

42

Kewkky t1_j80qqoi wrote

I've always hated it when supposed scientists talk about "this may be a thing" with no math behind it, no tests at all, just pure speculation. Just call it philosophising.

23

I_HaveA_Theory OP t1_j80r9an wrote

It's more about simulating relationships, not just yourself. Imagine you're here because the people of some reality before this one decided they wanted to simulate their loving relationships, and that included "you" (or your likeness via your DNA). Wouldn't that be meaningful? Especially if you came to realize that's how you got here?

7

biff444444 t1_j80sjrc wrote

I just hope that someone will warn future me about gluten before too much damage is done.

33

Icy-Opportunity-8454 t1_j80uhfg wrote

I think that with enough data, an AI could simulate any person: their speech, character, memories, appearance, and so on. In the future we might be able to browse through a catalogue of friends and relatives who have passed, as well as celebrities and historical figures, and talk to a simulation of them.

1

Futurology-ModTeam t1_j80z221 wrote

Hi, AllergenicCanoe. Thanks for contributing. However, your comment was removed from /r/Futurology.


> > Black mirror covers this. No thanks


> Rule 6 - Comments must be on topic, be of sufficient length, and contribute positively to the discussion.

Refer to the subreddit rules, the transparency wiki, or the domain blacklist for more information.

[Message the Mods](https://www.reddit.com/message/compose?to=/r/Futurology&subject=Question regarding the removal of this comment by /u/AllergenicCanoe&message=I have a question regarding the removal of this comment) if you feel this was in error.

1

Verbenablu t1_j812rt0 wrote

A different kind of Ark? Please, all ARKives are the same…. They hold information.

1

Artanthos t1_j81beps wrote

No, each version of me would be responsible for their own actions.

To put it another way, if there were 10 copies of me, each would be me from their perspective.

They would each also be unique individuals as they would each begin to diverge from me, and each other, at the moment of their separation.

1

JAFOguy t1_j81broa wrote

Simulation or not, a difference that makes no difference IS no difference.

1

CaseyTS t1_j81deo2 wrote

A human sharing your DNA is not "you". It's an identical twin. You gotta get into the brain for any "self" to be involved in any way, and the brain is heavily influenced by life experience.

30

peregrinkm t1_j81dqsx wrote

They can already do that by combining deepfakes with AI. Encoding DNA into the simulation would just help it to simulate protein and cell growth. That would be extremely high resolution.

The question is: could it ever be conscious?

2

CaseyTS t1_j81dtby wrote

How would DNA alone let you construct a person? There are a lot of things other than DNA that affect their brain throughout their lifetime.

You're thinking of an identical twin. They are frequently very different (and, yes, frequently similar).

3

CaseyTS t1_j81eafh wrote

Fortunately, the people who never listen to others' perspectives, by and large, fail and fall into obscurity because they choose not to learn or adapt to perspectives other than their own.

1

CaseyTS t1_j81f7wb wrote

>would it ever be conscious?

With actual computers, that is a hard question and I'm not sure how to answer. But if you could simulate every cell in the human brain, you could definitely produce something that behaves exactly like a person that we'd call conscious - inside and out. There's no fundamental rule that says that matter we build machines out of cannot be conscious. I see consciousness as purely emergent, not primal like a dualist's idea of a soul. As such, I think of it as more of an information phenomenon than a material phenomenon (though, obviously, humans use physics to operate).

15

I_HaveA_Theory OP t1_j81g3lk wrote

The goal would not be to construct the same mind or memories; it would likely be to convey some amount of meaning. Here's an excerpt from the essay which imagines a scenario where we exist in such a simulation:

>an entire history of people before us [...] decided their carefully spun web of love was worth living again. Maybe we don’t share their memories, but they looked like us, loved like us. We are their memorial. They decided – through their love, heartache, and scientific toil – that they would do anything to say “I love you” in a spectacular gesture that transcends universes.

3

Dozygrizly t1_j81l7iy wrote

It's actually pretty hotly debated whether this would be possible, if you're interested check out the debates around things like the blue brain project etc.

Your brain has tens of billions of neurons and on the order of a hundred trillion synapses. The information being relayed at synapses is not binary (different neurotransmitters have excitatory or inhibitory effects, which can even propagate backwards). The effect of the entire nervous system would need to be modelled, as well as the gut microbiome (these all influence it significantly).
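
For a sense of scale, here's a rough back-of-envelope sketch; the synapse count and the per-synapse state size are assumed round numbers for illustration, not measurements:

```python
# Rough back-of-envelope estimate of the raw state needed just to snapshot
# every synapse once. Both numbers below are assumptions, not measurements.
synapse_count = 1e14       # ~100 trillion synapses, order of magnitude only
bytes_per_synapse = 64     # assumed: strength, transmitter type, timing state, etc.

snapshot_bytes = synapse_count * bytes_per_synapse
print(f"~{snapshot_bytes / 1e15:.1f} petabytes for a single static snapshot")
# And that's before simulating any dynamics, the rest of the nervous system,
# the gut microbiome, or a lifetime of sensory input.
```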

On top of that, say you manage to create a perfect simulation: you essentially just have a brain in a jar. So you now need to simulate an external environment and a lifetime's worth of experiences to allow the simulated brain to develop plastically in response to input (otherwise it's essentially an inert lump of meat). Your simulated brain will not respond accurately without this plastic development.

To simulate a consciousness accurately, you would essentially need to simulate someone's entire life.

I agree that consciousness is emergent, but I don't think we could simulate consciousness as we know it. I believe we could get to some form of consciousness though.

3

StarChild413 t1_j81pkyd wrote

If it already happened, does it need to happen again to continue some kind of bootstrap loop, or does its already having happened render pursuing it redundant?

2

WaitingForNormal t1_j81pneu wrote

Ah yes, let's make a simulation where everything sucks. The people of the future are gonna love it.

1

Stealthy_Snow_Elf t1_j81qk46 wrote

I'm of the opinion that if humanity should destroy the environment to the point where humanity can't survive, then any attempts to preserve humans should be destroyed.

Failed intelligent species do not deserve to be preserved.

2

OvermoderatedNet t1_j81xqaf wrote

> debates around things like the blue brain project

It would really suck if at the end of the day it turned out there were tasks that silicon and computers literally cannot do and that anything more complex than a slow-motion self-driving delivery bot requires organic brain cells.

1

PandaEven3982 t1_j821wo1 wrote

I'm hoping that down the road we reserve sex for pleasure and put foetuses in artificial wombs. It would relieve a ton of social pressures.

2

Stealthy_Snow_Elf t1_j826m3m wrote

Nah, humans are shortsighted creatures of the present. There is little that can be done that has not already been done that would succeed in convincing humanity to change.

Wait for the natural disasters to get worse and the droughts to start killing millions via famine.

2

CaseyTS t1_j829j70 wrote

In the same way that a human is, sure. Consciousness is a product of the behavior of a brain. If the simulation allows the brain to make whatever choices a human would (it would have to have virtual senses or something), then I would say it's the same as human consciousness. I don't see a reason otherwise.

3

CaseyTS t1_j829r9r wrote

Sure, but we don't have to use a binary computer to simulate it. We could use an analogue computer or whatever else. That said, I agree that this is outside of any practical application; it's science fiction. But I think that, in principle, there is no difference between a machine brain and a human brain if they do the same things. Of course, any consciousness would have to have an appropriate environment, artificial or not.

2

peregrinkm t1_j82axzr wrote

Clearly there's something within you that registers sight as an image interpreted by consciousness, but does that explain why someone should "see" what they see? You experience consciousness, meaning something experiences the sensory stimuli. What is the nature of experience itself?

0

DarkestDusk t1_j82mnfg wrote

You will find out that I have everyone always stored within my being, so while it may have already happened, it happened before time began from your experience.

1

Stealthy_Snow_Elf t1_j83mbha wrote

No, it's not the famine that will convince humans; it'll be the tens of millions of humans dying while even more flee to the developed nations.

And there's no guarantee that humans will do the right thing even then. I mean, look at now: migrants flee to the US and EU because of global warming, or because of violence we have reason to believe is exacerbated by global warming.

You can't fake it; humans need a reality check, and even then there are no promises. They got one on the dangers of fascism and nationalism with the Holocaust, but in less than a hundred years the lessons are either forgotten or were never even taught to begin with, because the victors had traits of their own that implicated them.

I hope humans do the right thing. But I told myself years ago I would focus on my own plans and not interfere with human beings. There's a saying or some idiom that essentially goes: "the actions of the exceptional will cost the lessons of the many." Basically, if a relative few are the reason a species-level disaster was avoided, and not the collective action of the species, then the species hasn't actually learned what brought it to failure and, most importantly, it didn't learn on its own how to fix that failure.

It has the same effect as artificial evolution. In essence, your species is no longer alive because it's fit, but because outliers helped you avoid disasters. And outliers ruin the data.

1

Zammyboobs t1_j84vnfj wrote

This is just SOMA, and we don’t want SOMA, that fucking nightmare fuel

3

Stealthy_Snow_Elf t1_j84ycp7 wrote

Lol. I do have joy, but humans are dumb af and a species that is incapable of preserving its homeworld does not deserve to explore the stars. In fact, they present a danger to more responsible intelligent life elsewhere in space.

1

MyPhillyAccent t1_j84zw3n wrote

> humans are dumb af and a species that is incapable of preserving its homeworld does not deserve to explore the stars. In fact, they present a danger to more responsible intelligent life elsewhere in space.

None of that is true and it reads like a hormonal teenager wrote it.

1

ArBui t1_j86ltqj wrote

This is the plot of >!SOMA!<. (Sci-fi horror video game spoilers)

1

Tato7069 t1_j871bz7 wrote

In any situation, that's still not you... it's a copy of you. Your conscious brain in your current body will never have any awareness of the existence of this copy after you die. It's not a continuation of your consciousness.

1

Artanthos t1_j871flf wrote

There is a difference between listening to others and having your life dictated by them.

There are plenty of wildly successful people who would not have been successful if they had given in to peer pressure or listened to their peers.

1

Kewkky t1_j87ddf4 wrote

I'm not gonna lie: I read it, went to the math, literally read through the descriptions, and read the conclusion. I even looked at the references and who Nick Bostrom was. This is literally just philosophy, and despite what it says on the actual webpage, it's not rigorous.

He made up the equations with no reference to anything else except three possible future scenarios in his argument: (1) humans go extinct and don't become post-human, (2) humans become post-human but don't care about simulations, or (3) humans become post-human and care about simulations. All the way to his conclusion, he also never proves that we're living in a simulation; he just states that we're in either (1), (2), or (3), and that the chances of any of those three being true are completely even, and that we'll never know.

Personally, I find this proposition dumb. He narrows down all potentialities to only three: (1) we'll never get there, (2) we can get there but don't want to simulate, or (3) we can get there and want to simulate. It reminds me of the argument for God existing: non-existence is a sign of imperfection, but God is supposed to be perfect, which means he can't be non-existent, and therefore God exists. It's a non-rigorous argument that can never actually prove the proposition itself and is completely philosophical in nature.

It also reminds me of the Doomsday Clock, in that it's treated as a scientific observation of when nuclear war will break out, when all it is is a bunch of people moving the clock forward at different speeds and slowing it down at the last "hour". I can guarantee you that once nuclear war actually breaks out, they'll fast-forward the clock to midnight and say, "See? We could totally predict it!"

The "math" on the simulation website is also extremely basic: it just treats the three possible scenarios as fractions of 100%, which proves nothing except that those three scenarios make up 100% of his argument. I wouldn't consider it math, since there are no actual operations happening.
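
To be concrete, the entire "model" amounts to something like the following; the numbers are arbitrary placeholders I made up, not values from the paper:

```python
# The three scenarios treated as fractions of a whole, exactly as described above.
# The values are arbitrary placeholders; the only constraint is that they sum to 1.
p_never_posthuman = 0.3    # (1) humans go extinct before becoming post-human
p_no_simulations = 0.3     # (2) post-human, but uninterested in simulations
p_runs_simulations = 0.4   # (3) post-human and running simulations

assert abs(p_never_posthuman + p_no_simulations + p_runs_simulations - 1.0) < 1e-9
# Nothing here tells you which scenario actually obtains; it's just a partition of 100%.
```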

I do greatly appreciate you linking that website to me though, so here's my upvote.

1

donald_trunks t1_j885or0 wrote

Space will be fine. The Universe is unfathomably huge. If something out there wants to kill us, so be it. We don't have to make a special effort to kill ourselves. Let's just relax and see how it plays out. If nothing else we get more data that way.

1

Dozygrizly t1_j8bh5ot wrote

Yea, in my mind it's more a case of: if we get to the point where we can simulate one brain properly in this fashion, doing so is essentially useless.

We can simulate the brain of a Lithuanian man who has been addicted to 2CB his entire life. Great. Now that we have done that, we have essentially just recreated a brain that already exists.

What does this fake brain tell us that we don't already know, after studying this man for his entire life to determine all the inputs required to simulate his brain?

We now have one brain. This is useless in an inferential sense for any kind of research - we have a sample of N = 1, meaning we have a case study of a brain that we already have fully mapped without having to expend the resources to simulate it.

Once we have the capability to simulate a human brain properly, we won't need to do it (or learn anything substantive from it). This is the argument I most agree with, anyway.

I wouldn't be so quick to equate a biological brain and a computerised one; they exist on such different planes that (in my opinion) such broad statements are bold to say the least. I do appreciate your point though.

1

Artanthos t1_j8bpvhv wrote

>This is literally just philosophy

I already agreed that it was philosophy.

Anything that cannot be tested falls under philosophy, and it is impossible to test if we are living in a simulation.

1

Artanthos t1_j8bqr9m wrote

Continuation: from the copies' perspective, they are me and encompass all that I have ever been.

If there are multiple copies, then when I come to a fork in the road, I choose both. No more wondering what happens on the path not chosen.

In more practical terms, in the future there may well be journeys from which there can be no return, i.e., the clones will never have a chance of meeting, be that separate simulations with no crossover or colony ships headed in different directions. From each copy's point of view, it would be the sole version of me.

And point of view is everything.

0