Submitted by NefariousNaz t3_yxxlul in singularity

I have seen some put forward the belief that you cannot gradually replace the biological brain with synthetic neurons. I'm curious where everyone believes an individual's death occurs if the replacement of the brain with synthetic neurons is performed gradually, as opposed to a sudden and abrupt total replacement.

I personally feel that 100% synthetic replacement is possible if done gradually as the brain has some plasticity and is able to rewire itself to a degree given enough time. There are many examples of individuals who were able to continue to live having lost a large portion of their brain.

View Poll

97

Comments


No-Shopping-3980 t1_iwr2q1y wrote

In my opinion, it’s post-human after any type of augmentation to the brain. Death may not be the most appropriate terminology here, as an individual is not considered clinically dead until there is no brain activity. There's no mention of whether the brain matter is organic or synthetic; if you have synthetic neurons and brain activity, you're obviously alive, you're just not human anymore.

15

trapkoda t1_iwr44i0 wrote

I said Total Synth replacement, because in my belief, we are the electrical pattern in our brain. If that is preserved via synthetic neurons, then death does not occur, IMO

92

21_MushroomCupcakes t1_iwr7i5e wrote

I think it's a matter of how gradually it's done.

If you do them all at once, then you're dead and something that just thinks it is you is walking around.

But if you do them slowly over a period of weeks, then that gradual transition makes the end product still you.

16

petermobeter t1_iwra44g wrote

i think it has to be juuuuust slow enough that each synthetic neuron connects its spindrils to the surrounding flesh neurons BEFORE the flesh neuron it’s replacing is removed.

the order should be:

  1. connect the synthneuron to all the same surroundingneurons as the fleshneuron it’s replacing

  2. let the connections send data a lil bit

  3. kill the fleshneuron that the synthneuron has replaced

  4. move on to replacing the next fleshneuron
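The ordering above is essentially pseudocode, so here is a minimal, purely illustrative Python sketch of the same loop. Every name (FleshNeuron, SynthNeuron, replace_one, the no-op "settling" step) is hypothetical; nothing here models real biology or any existing API.

```python
class Neuron:
    """Node in a toy network; only tracks who it is wired to."""
    def __init__(self, name):
        self.name = name
        self.neighbors = set()

    def connect(self, other):
        self.neighbors.add(other)
        other.neighbors.add(self)

class FleshNeuron(Neuron):
    pass

class SynthNeuron(Neuron):
    pass

def replace_one(flesh, settle):
    """Steps 1-3 above, for a single neuron."""
    synth = SynthNeuron("synth_" + flesh.name)
    # 1. connect the synth neuron to every neighbor of the flesh neuron
    for nb in list(flesh.neighbors):
        synth.connect(nb)
    # 2. let the new connections carry signals for a while
    settle(synth)
    # 3. only then retire the flesh neuron it replaces
    for nb in list(flesh.neighbors):
        nb.neighbors.discard(flesh)
    flesh.neighbors.clear()
    return synth

if __name__ == "__main__":
    # Tiny toy network: three flesh neurons wired in a triangle.
    a, b, c = FleshNeuron("a"), FleshNeuron("b"), FleshNeuron("c")
    a.connect(b); b.connect(c); c.connect(a)
    network = [a, b, c]
    # 4. move on to the next flesh neuron, one at a time
    for i, n in enumerate(list(network)):
        network[i] = replace_one(n, settle=lambda s: None)  # "settling" is a no-op here
    print([type(n).__name__ for n in network])  # all SynthNeuron, wiring preserved
```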

5

Terminus0 t1_iwrak8n wrote

My totally uninformed opinion (I am an engineer, not a neuroscientist or philosopher) is that as long as continuity is maintained between the totally biological brain and the synthetic rebuild, there will be the same person looking out from the inside.

A copy is not you, because you are still you, so the copy must be someone else.

When you slowly update yourself bit by bit, I don't believe there is any point in the process at which you can declare you as not you. This has always been my answer to the Ship of Theseus problem: the one who maintained continuity with the 'original' is the 'original'.

17

Cannibeans t1_iwrb078 wrote

Exactly. Our personalities, memories, every thought and feeling we have is just the sum of electrical signals and neurotransmitters floating around an organ in our skull. If we preserve that but change the container, or the material those signals and chemicals travel through, nothing should have changed regarding "death."

51

LUNA_underUrsaMajor t1_iwrbcyp wrote

If the same consciousness, memories and personality are transferred perfectly, then it wouldn't be considered a death.

7

DerivingDelusions t1_iwrbl50 wrote

Don’t forget the glial cells! They move around neurons, and even listen to their electrical activity (astrocytes). They are also responsible for the amount of myelination on axons (oligodendrocytes/Schwann cells)

9

Kracus t1_iwrdk8y wrote

This is a famous thought experiment called the ship of Theseus.

There is no single correct answer to this question. I blew my 12-year-old son's mind the other day when I posed him this question. He answered very confidently that the ship was no longer the same at the end of its travels. Then I reminded him that every atom in his body will have been replaced in seven years' time, so does that mean his original self passed away?

I believe it is possible to transfer consciousness by replacing pieces of the brain, which complicates the question of what consciousness is.

90

buddypalamigo19 t1_iwrf35h wrote

I don't think you would ever die in that scenario. "I" am not my body or my brain. "I" am the dynamic, self-sustaining pattern which is currently circulating through my neurons. If that pattern is maintained on different hardware, then I have not died.

If I build a Lego set of a castle, and then replace each plastic brick one at a time with a stone brick, the castle is still there. In fact, it never went anywhere. The Ship of Theseus paradox always felt kind of silly to me, because the answer always felt so obvious. The pattern of the planks is what makes the ship. If you replace each plank one at a time, the overall pattern is never compromised by more than a single board, so the ship is still there.

26

thetwitchy1 t1_iwrf6fh wrote

Here’s the point: if a biological system can replace the individual biological parts (cells) with new biological replacements and it’s the same person, why would it be different if we replaced the biological parts with more durable non-biological ones?

Getting them to mimic the originals well enough AND be more durable would be the hard part.

42

Kracus t1_iwrg0u6 wrote

I fully understand the point. In fact if you search through my posts on this subreddit I’ve already spelled out in great detail this exact scenario several times. It’s still the same concept as the ship of Theseus.

As to perfecting the technology I have no doubt it will happen in the future.

14

thetwitchy1 t1_iwrh6go wrote

Oh, sorry, I know. That was more a "Coles Notes" version than a "here's what you need to see" version.

I completely agree with you and think you have a firm understanding of the topic.

6

Kindly-Customer-1312 t1_iwrhr7z wrote

Never? If the synthetic neurons work 100% the same as the originals, it is still the same person. Even if not, as long as the personality shift after replacement is smaller than the personality shift from adulthood to the time of replacement, I would still think it is the same person.

14

GregoryGromit t1_iwrjply wrote

I always thought death to be less binary than we usually think of it. Take two examples: first, your cells completely replace themselves within the span of about seven years; and second, dementia patients don't experience one singular moment when their brain dies. I think the idea of death doesn't exist outside of our illusory belief in our own individuality from the rest of the universe. We're just a bunch of stuff that incorrectly thinks it's one thing. Vsauce did a video on this subject called "Do Chairs Exist?". It's worth a watch.

4

tedd321 t1_iwrkaoe wrote

Consciousness is the result of a complex enough system

1

vernes1978 t1_iwrkgk8 wrote

Your poll is missing an option

1

umberdragon t1_iwrllf1 wrote

I personally believe that a person is dead if they do this. The uploaded consciousness isn’t them but a copy.

−2

vernes1978 t1_iwrntub wrote

Unless I misinterpreted the last option, it is my opinion that it's the pattern that makes the person, not the material of the substrate it's made on.
Gradual replacement maintains the pattern while it's in use.
The pattern continues uninterrupted.
Death occurs at no moment.

4

_InvertedEight_ t1_iwrqi8r wrote

The consciousness doesn't exist within the brain; it exists within the ether. Many experiments have been done with people who have been through awful accidents and sustained horrific brain injuries and have still managed to retain their consciousness, albeit in a physically diminished capacity. Across all the patients, every part of the brain had at some point been removed, and they were all still connected to their consciousness, implying that the consciousness does not reside within the brain.

There was also a study done with a group of mice who were taught how to navigate a maze. Each mouse was then taken away and had a different part of its brain surgically removed. After a recovery period, each mouse was put back into the maze, and they were all still able to navigate it.

Quantum theory is finally catching up to the fringe theory; Lynne McTaggart's book, The Field, covers a range of cutting-edge, peer-reviewed studies done recently that show the existence of an "ether", a source field of energy that we don't yet understand, and it is theorised that this energy field is the source of the spirit or consciousness.

−1

mertzi t1_iwrrop3 wrote

Those who think it's immediate death have then died many times growing up and gaining new neurons. The science seems unclear, though, on whether adults grow new neurons. Regardless, a synthetic 1:1 neuron wouldn't be different (besides not aging).

2

XenonTheCreator t1_iwrscea wrote

Neurons in our brains die and get replaced all the time, and yet our experience remains fairly unchanged. I don't see why replacing neurons with their synthetic counterparts would work differently.
There is still some concern regarding the speed of replacement, however. There is a big difference between killing and regenerating neurons gradually over many years and replacing them all instantly. Even if we conducted such an experiment, there would be no way of telling whether the person's "stream of consciousness" had been severed or not, as the result of an unsuccessful consciousness transfer would be a perfect clone of the original brain.

3

mafian911 t1_iwrspfp wrote

The concept of continuous identity is an illusion. There's no way to be sure your "consciousness" doesn't die every night and wake up as a new person who thinks they are you every morning.

Every moment that passes, our identity evolves into something slightly different than it was before. We modify ourselves simply by existing and observing our surroundings. I am not the same person I was 10 years ago. I just happen to have all (or at least most) of his memories.

"You" are a collection of memories. And an algorithm that acts on those memories, along with new input, to make new memories. If your neural configuration is duplicated, there are many of "you" for but one single moment, and then each of them will begin to evolve in different directions. Once this moment passes none of these are the you that was at the moment of duplication. Not even "you".

19

Superduperbals t1_iwrt3cb wrote

The answer is, it doesn’t matter. A brain of neurons, its synthetic counterpart, and every point in between are all consciousness. Life, the continuity of self, and the concept of identity are all an illusion. It's simply evolutionarily inconvenient for people to wake up every morning with no memories, experiences, or skills. What we are is simply a pattern etched into some meat, like a well-trodden path through a field.

3

SuperSpaceEye t1_iwrtt69 wrote

That will depend on what "we" are. If our consciousness arises from the computation of neurons, then it wouldn't matter what device does the computation or in what form it is done. If, however, there is something more to our consciousness (some quantum stuff, maybe even the existence of souls), then I don't think this question can be answered until we learn more about those processes. I'm a materialist myself, but who knows...

2

RemyVonLion t1_iwruald wrote

In theory yeah, but it's possible we are tied to our organic matter and all the complexities in our DNA and biology from billions of years of evolution of the natural organic elements that make us, not to man-made copies that replicate the basics. The entire human experience isn't going to be easily recreated 1:1 by a machine any time soon.

12

medraxus t1_iwruu68 wrote

Honestly, we don’t know enough to be able to answer this question, but there has to be one part that is irreplaceable.

1

Aevbobob t1_iwrvmbm wrote

I have come to believe that we are a pattern of matter, not matter itself

2

HistoricalHistrionic t1_iwrvn06 wrote

I think the crucial element that would make this a means of attaining immortality is whether the process is gradual enough that the synthetic and natural neurons can overlap, with the electrical signals of the natural brain crossing into and back from the synthetic parts. At that point the two are functionally identical, and the loss of the natural neuronal tissue would be like any other brain injury: something to be compensated for, and something which might change one's identity to some extent, but not a fundamental loss or death of consciousness. Assuming it's possible to simply continue adding more synthetic neuronal tissue, eventually the natural neuronal tissue could be a small part of the expansive whole.

4

overlordpotatoe t1_iwrvpel wrote

Personally, I don't see why replacing the brain with synthetic neurons would be any different from replacing old cells with new ones in the usual, biological way. If the experience is the same from my perspective, that's good enough for me.

5

Icydawgfish t1_iwrw456 wrote

Might as well solve consciousness. I don’t think we are anywhere close to knowing.

1

cwallen t1_iwrwt9p wrote

My nitpick with this view is that I don't see a problem seeing both versions as having continuity.

If you had nano-fabricator technology, such that you could create a perfect replica of a person, to the point that you can't tell which one is the copy, they are not the same person as soon as they start having different experiences, but they both still have continuity with the person they used to be.

You are not the same person you were ten years ago, you are slightly not the same person you were yesterday. If you copy yourself, both used to be the same person, but are now two different people. Who the original is doesn't matter.

6

cosmic_censor t1_iwrwx9p wrote

My preference would be for a synthetic consciousness implant that takes over when I am sleeping. A kind of test drive to ensure "I" do actually survive the transition.

1

teqnkka t1_iwrydsr wrote

If stress is connected to our guts and feeling good depends on our physical activity, imagine what else we might not know. I don't believe it's as simple as that.

5

Kolinnor t1_iwryo2e wrote

I recommend the amazing horror game SOMA for that kind of mindblowing thought experiment. Cannot say more without spoiling a lot.

2

DerivingDelusions t1_iwryy1g wrote

Generally, your heart cells and neurons don't really replace themselves (neurons literally can't divide/replicate; new ones come from stem cells in the ventricles). So I think this is true for every other cell, except in these two organs.

11

Wisdom_Pen t1_iwrzimg wrote

Death is a concept invented by the Jedi!

0

Forstmannsen t1_iwrzwcs wrote

If you are making an assumption those synthetic neurons are fully functional and able to signal back and forth with organic ones, those questions make no sense, because the answer is obvious.

If you are making an assumption those synthetic neurons are non-functional and/or unable to signal to organic ones, those questions make no sense either, for the same reason.

1

SufficientPie t1_iws03qa wrote

> believes that as long as continuity is maintained between the totally biological brain and the synthetic rebuild, there will be the same person looking out from the inside.

So when you fall asleep at night, you die, and are replaced by an imposter the next day with the same memories?

If not, how about people who undergo deep hypothermic circulatory arrest, in which heart and brain activity completely cease?

I don't believe continuity of consciousness is required for continuity of personhood.

2

money_learner t1_iws0cvf wrote

Normally, there should be enough time to verify that consciousness, information, etc. have been transferred before the brain becomes fully synthetic.
How deep the transition can go depends on the performance of the synthetic brain, how well it reproduces natural brain functions, and the performance of the BMI (brain-machine interface).
After the synthetic brain has been exercised many times to check its performance, the natural brain would cease to function, or, given recent advances in biology, it might even be kept cultured beyond the point where it ceases to function.
The current state of the art is that synapses can be cultured even while the natural brain is shut down.
Given the current situation, we should first reach the stage of civilisation where the synthetic brain is used as an auxiliary brain, or as a brain for people with brain damage.
After that, immortality via the synthetic brain will be researched.
Also, different parts of the brain are said to handle different functions, so a percentage-based poll does not seem appropriate.
I expect that research on the synthetic brain and consciousness will gradually progress, starting with augmenting the frontal lobe and the visual, motor and memory areas damaged by loss of brain function.
The day may come when something like a meta-brain, created by the synthetic brain, will be able to control normal brain functions.

2

Forstmannsen t1_iws0pqb wrote

Yep. Actually, though, whether it matters or not would depend on your mindset... but the funny thing is, if you are very attached to the idea of thinking of yourself as the original, and not a mere copy, you can bet your ass that the "copy" thinks the exact same thing. Knives out, I say: whoever bleeds out last is the original.

Also, this whole continuity argument is a cop-out, IMO. I fail to subjectively (which is the only way that matters) experience continuity every night, and somehow, I live with that.

3

MrDreamster t1_iws1qyu wrote

At the time of this comment, 47 voters are either dumb or trolling. Thinking that losing a single neuron means death is amazing.

2

Shiyayori t1_iws1z3x wrote

Well, we don’t know the full scope of what creates consciousness. Even if we replace our neurones synthetically, and copy every function they have so that they act as they would if they were biological, there's no telling what mechanisms could be lost in removing the underlying biological process itself.

For example, if consciousness is a byproduct of finely tuned entangled systems induced by our hormones and the many chemicals flooding our brain, then removing that and emulating the effect synthetically without the cause, may cause a collapse of consciousness.

I wouldn’t hedge my bets either way, but I think there’s a lot of room for discovery still, and it’s not as simple as merely synthesising the brain.

2

vevol t1_iws25l8 wrote

In my opinion, you are not the brain itself; your brain is just a computational substrate. You are the information contained in that brain, so as long as the information continues to evolve from one state to another, independent of the time or space between those states, you are living.

2

ImperatorMorris t1_iws2j84 wrote

Isn’t this the same thing as the Star Trek teleportation paradox? Problem is as you replace neurons it might get to a point where you’re no longer seeing out of your own eyes but there is an exact brain replica of you that’s going around walking and talking….. 🤔

1

Gaudrix t1_iws3tkt wrote

Theoretically it's the Ship of Theseus, but practically we have no damn clue beyond speculation.

Maybe changing mediums is actually simple and straightforward or maybe it's impossible.

1

Jkelley07 t1_iws3zqi wrote

Feel like you should never go full synth until they take over

1

jacobrogers256 t1_iws4fzp wrote

if the neurons behave exactly the same, nothing has changed and nobody died.

2

ninjasaid13 t1_iws4s5r wrote

>I believe it is possible to transfer consciousness by replacing pieces of the brain, which complicates the question of what consciousness is.

Consciousness shouldn't be confused with the brain itself. The brain has a bunch of things running that sum up to a consciousness, kind of like Jenga: taking one brick away doesn't mean the whole tower collapses. In Jenga, the tower is consciousness and the brain is all of the wooden blocks that make up the tower.

Consciousness is an emergent property of the brain, just as the tower is an emergent property of the wooden blocks. But obviously consciousness is a lot more complicated than a tower.

3

Suolucidir t1_iws6aop wrote

I don't think they die, even if 100% replaced. However, I do not think they are made immortal either.

Let's not kid ourselves. Any system capable of replacing the human brain will need to be technologically complex. Can you imagine the system maintenance required to keep it working long term?

It's not going to last forever and the resources to keep it "alive" are going to be greater and more difficult to coordinate than those required by much of our existing technologies imo.

1

ninjasaid13 t1_iws7v10 wrote

I think the real problem is the human-language definition of identity. Language fails to describe this because we never really developed the concept, since we had no need to for most of human history, but that will change when we start exploring the boundaries. Sort of like learning the world was round instead of flat, but with human identity.

3

Select_Team t1_iws92ck wrote

A consistent, separate, lasting individual along linear time is already an utter illusion. So I'll say never.

1

ninjasaid13 t1_iws9ib4 wrote

Every time we talk about consciousness, we have someone speaking the kookoo speak.

And this kookoo idea always uses the gaps in science to fill with nonsense. The gaps in science, like quantum theory, are more mundane than souls or whatever you're thinking. Energy in science is different from what you're thinking; it simply means the ability to get something moving.

1

Astropin t1_iws9klz wrote

You're missing the most obvious choice...it doesn't.

1

Storm_treize t1_iws9sph wrote

When Windows starts asking for a new license.

1

-ZeroRelevance- t1_iwsagw0 wrote

I think there also needs to be spatial continuity, on top of the associative continuity. If you use such a nano-fabricator technology, you’re putting someone in a place that they never were, thus breaking continuity and meaning they are not the same person. On the other hand, if you just replace all the cells in your brain with artificial ones, there is still both associative and spatial continuity, so the end result is still you.

2

Bakoro t1_iwsf7l0 wrote

"The Ship of Theseus" isn't silly, it's an excellent example of getting at the underlying question of what makes a thing, and where are the lines between the thing and the concept of the thing.

How can it be "The ship of Theseus", if there is no part which Theseus ever touched? If it's made of trees planted long after his death?

As soon as a single thing changes, it's no longer the same, by definition. Yet some argue that a thing can be more than the sum of its parts.

There's a saying "you can never step in the same river twice". The water is constantly moving and changing, yet "the river" is there.

Personally, I'd say that it stopped being the Ship of Theseus the moment Theseus lost ownership. It's just a ship. A ship is something that can be defined and exists in material space. Its qualities meet the specifications of "shipness". Being "the Ship of Theseus" is a transient fiction.

Who you and I are as people is defined by our memories and core processing algorithms, and those also change. I am not the five-year-old me; the five-year-old me changed day by day to become who I am now. I am the river. It's the continuity and memory which make us "the same" despite change.

7

buddypalamigo19 t1_iwsfvml wrote

It's not silly if you insist on breaking up the world into neatly defined and demarcated "things." If, on the other hand, you see the world as one giant process, and all things within it as nothing but flexible concepts which are loosely attached to subsets of that process, then it is very silly.

Saying that the ship is no longer the same after a single plank changes is... I mean, you're technically correct, yes. But it really smacks of pedantry to me.

5

Working_Berry9307 t1_iwsi5nj wrote

You are your brain, so you die as more is removed. I would call a 40% replacement a 40% death. It may be a seamless transition, yet dead you still are. The thing that is now experiencing life in your body could be a loving caring synthetic organism that genuinely thinks it's you, but it's not.

0

incoherent1 t1_iwsk5c4 wrote

Came here to talk about the Ship of Theseus. Everybody already chatting about it lol

1

Jedi_Ninja t1_iwskcs7 wrote

This is another example of the Ship of Theseus scenario. As long as there is continuity of consciousness there is no death just a slow transformation into a new beginning.

1

drsimonz t1_iwssp3y wrote

> There's no way to be sure your "consciousness" doesn't die every night and wake up as a new person who thinks they are you every morning.

Yes, omg!!! I was going to say something similar. Possibly the single biggest challenge to advancing the philosophy (or science) of consciousness, is the fact that people have such wildly differing ideas of what consciousness is. The fact is, our wakeful consciousness is dramatically compromised on a regular basis - sleep, general anesthesia, spacing out for a long time, etc. All we have to go on are memories, which obviously aren't the same as consciousness since memories can be stored on a hard drive with the power turned off.

I regularly think to myself, "this may be the first time I've ever been awake. My environment seems to match this brain's expectations, so this brain probably collected data on this environment in the past....but at the time, it could have been anyone's brain"

2

Nerdler17 t1_iwssu6n wrote

None of the above, my thoughts make me who I am

2

StarChild413 t1_iwsswy1 wrote

> There's no way to be sure your "consciousness" doesn't die every night and wake up as a new person who thinks they are you every morning.

and therefore no way to be sure any number of those don't wake up in a simulation etc. making any dreams of uploading the perceived continuous "you" moot as you could be already there

1

Lifeinthesc t1_iwsu98y wrote

Bold of you to assume that the human brain is what makes you human.

There are billions of non-human microorganisms living in the average human.

1

Katia_Valina t1_iwsv4oz wrote

Total Synthetic Replacement - Immortality.

I don't know for an absolute fact, but I suspect that gradual replacement with the right types of synthetic neurons should allow your consciousness to exist in the new substrate.

2

Jayco424 t1_iwsvgxz wrote

As long as normal continuity of consciousness is preserved, it's no different from the replacement of the atoms of the body, which cycle every 5-10 years depending on a person's age. A normal person doesn't hold that they died every time an atom, or even all the atoms in the brain or body as a whole, was replaced. Death, at least in my opinion, can best be described as the total and irreversible interruption of the consciousness of a primary being. This, for example, precludes things like Star Trek teleporters or mind uploads being non-lethal or leading to individual immortality, because in both instances they produce only a copy of the original being.

2

UncertainAboutIt t1_iwsvkx8 wrote

So many options missed:

  1. as mentioned, the Ship of Theseus: the question has no answer
  2. gradually, along with the replacement
  3. sometime after the procedure (no immortality; e.g. LEDs burn out, admittedly later than OLEDs, but AFAIK they still do)
  4. never, as death is an illusion (which is not classic immortality IMO).

1

Zamorak_Everknight t1_iwsyli6 wrote

You die every night you go to sleep. Continuity of self is an illusion produced because of your memories.

Gradual replacement is not even needed; building a clone of you and killing the "original" you would have the same effect.

2

sunplaysbass t1_iwszgq8 wrote

Well I don’t know now, I just don’t know

1

Bakoro t1_iwt0z9k wrote

>Saying that the ship is no longer the same after a single plank changes is... I mean, you're technically correct, yes. But it really smacks of pedantry to me.

It's not pedantry, it's literally the point of the thought experiment.

>It's not silly if you insist on breaking up the world into neatly defined and demarcated "things." If, on the other hand, you see the world as one giant process, and all things within it as nothing but flexible concepts which are loosely attached to subsets of that process, then it is very silly.

A process is a thing. The components of a process are a thing. A concept is a thing. Everything inherits from "thing"; that's why it's called "everything".

You are more agreeing with me than not.

3

buddypalamigo19 t1_iwt55xb wrote

It is pedantry. From my point of view, the thought experiment is silly and unnecessary. It is trying to explain something which is completely obvious, and which does not require an explanation.

I am aware that a process is a thing. I am also aware that there is only so much one can do with language. You are getting hung up on individual words and their literal, narrow definitions.

But whatever. I'm not going to try and convince you of anything, because I suspect we're coming at this from two incompatible paradigms. Peace.

0

free_dharma t1_iwt6n55 wrote

Consciousness is non local so it doesn’t matter!

2

Anenome5 t1_iwt7485 wrote

Gradual replacement seems viable, provided the new neurons are able to fully replicate the functions of the biological ones.

Consciousness is orthogonal to life and death, however, so it's not a very good question. One can be alive and unconscious, so what's to say one cannot be dead and conscious, via artificial neurons or the like?

1

Hunter62610 t1_iwt7ukf wrote

I'd say the pattern that is my "Soul" is not the brain. The brain is hardware. My personality is my operating system. I can add new skills and ideas without dying. I can grow or change with updates. But I'm still the same OS. Extreme data changes might constitute death, but the analogy breaks down a bit here.

2

VeryOriginalName98 t1_iwt9bce wrote

When the synthetic neurons stop working. This is assuming you define the individual by the continuity of their consciousness rather than some arbitrary substrate like neurons, since they get replaced over time with biological ones already.

1

SnooLemons7779 t1_iwt9kqi wrote

Maybe it’s a spectrum, bro or sis. I already feel like I lost a bit of my soul, and I don’t have any brain replacement hardware.

1

Euclidean_Ideas t1_iwtaxi4 wrote

The reason you think it's pedantry is probably that you haven't actually given it proper thought.

How do you define the difference between a process and a subprocess? If you don't differentiate between one process and another simply because they are part of the same larger process, then you are applying nihilistic concepts to answer the question:

"The question doesn't matter, because in the end the ship is a linguistic trick, and the collective parts that make up the ship never actually existed as a single entity but only as a process. Therefore it doesn't matter how much is replaced."

Well, how about if you took the exact ship, pulled it apart, and used all the parts to create an entirely different ship that contained all of the original parts but was given a different name. Would that ship then still be called "the Ship of Theseus"? What if you only used half of the planks, or what if you added all those parts into another ship as replacement parts? Would it still be the same process?

How would your "view" differentiate between the process of our planet as a whole and the individual human?

It's incredibly simple to expand your "definition" to say it's just a part of the whole, and therefore there is no reason to engage because it's obvious. Well, that uses the underlying qualities of nihilism to rebuke the fundamentals of the question. "I don't think anything has intrinsic value, only what we ascribe to it" is the same thing as saying "I don't think the question has merit because it's easy to answer; the boat was never a thing, it was always only the concept."

2

Trakeen t1_iwtbe07 wrote

New neurons are created and integrated into the neuronal network during learning and memory formation

https://www.ninds.nih.gov/health-information/public-education/brain-basics/brain-basics-life-and-death-neuron

So there is an existing mechanism in the brain to accommodate changes, but we don’t understand how outright replacement would work since there isn’t an existing biological process to reference

6

dnick t1_iwte4ze wrote

I think it depends heavily on 'how' the synthetic neuron is replacing an existing one. If it's just dropped in, then it seems like that neuron, and whatever memory or process it was part of, is gone, and there's just a new 'opportunity' for a connection to be made; overall, each percent of replacement is that percent 'not you' and is just a new 'thing'.

If, on the other hand, the artificial neuron is installed with the same sensitivity and precise reactivity as the existing one, so that any trigger that would have activated the original neuron results in the exact same outputs, then 100% replacement of the neurons is still 100% you. It may also be important that the new neuron reacts to changes the same way the organic neuron would, so that repeated excitation results in the same type of strengthening of connections, etc. If not, every 'type' of difference is simply a continuous gradient of 'not you'.

If you change it gradually enough, I believe either type of change could be made imperceptible enough that it might still seem like 'you' just changed over time.
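A toy illustration of the "same sensitivity and reactivity" idea, assuming the crudest possible neuron model (a weighted threshold unit with a Hebbian-style update). Every name here is hypothetical and the model is a deliberate stand-in, not a claim about real neurons: the replacement only counts as functionally equivalent if it matches the original on every input, before and during "learning".

```python
import random

class ThresholdNeuron:
    """Crude stand-in for a neuron: weighted sum, fixed threshold, simple plasticity."""
    def __init__(self, weights, threshold, learning_rate=0.01):
        self.weights = list(weights)
        self.threshold = threshold
        self.learning_rate = learning_rate

    def fire(self, inputs):
        return sum(w * x for w, x in zip(self.weights, inputs)) >= self.threshold

    def strengthen(self, inputs):
        """Hebbian-style update: repeated excitation strengthens the weights."""
        self.weights = [w + self.learning_rate * x for w, x in zip(self.weights, inputs)]

def clone(original):
    """Install the replacement with identical parameters and the same learning rule."""
    return ThresholdNeuron(original.weights, original.threshold, original.learning_rate)

original = ThresholdNeuron(weights=[0.4, 0.9, -0.2], threshold=0.5)
synthetic = clone(original)

random.seed(0)
for _ in range(1000):
    stimulus = [random.random() for _ in range(3)]
    assert original.fire(stimulus) == synthetic.fire(stimulus)
    original.strengthen(stimulus); synthetic.strengthen(stimulus)  # identical plasticity
print("outputs matched on 1000 random stimuli, before and during learning")
```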

2

dnick t1_iwteod4 wrote

Dying every time you go to sleep is a wild exaggeration of the experience of consciousness, primarily because your brain keeps working; it never actually 'stops', it just slows down. And the step of killing the original you isn't necessary in any way if you assume you've made a perfect copy with a good 'push' to get it going. Unless you think the soul is required, a clone would just be 'another' you, whether the original is dead or not. It would be interesting to see just how quickly the two copies deviated from each other, but more from a scientific standpoint of brain plasticity than as an exercise in 'who is really who'. Even tiny differences in experience seem likely to start deviating the continuing consciousnesses very quickly.

3

beachmike t1_iwtge85 wrote

Consciousness is NOT an emergent property of the brain. You're stuck in the incorrect materialist paradigm. The brain, and everything else in the physical universe, emerges within consciousness. Consciousness itself doesn't "emerge." It's non-material, dimensionless, and eternal.

2

beachmike t1_iwtgiy6 wrote

Consciousness is NOT an emergent property of the brain. You're stuck in the incorrect materialist paradigm. The brain, and everything else in the material universe, emerges within consciousness. Consciousness doesn't "emerge." It's eternal.

−3

ninjasaid13 t1_iwtgyoh wrote

This seems like saying "I'm the center of the universe", which absolutely no one will take seriously. The universe doesn't emerge from consciousness; it existed before I was born and will exist when all the atoms in my body decay into something else.

3

beachmike t1_iwth4gz wrote

Consciousness is NOT an emergent property of the brain. You're stuck in the incorrect materialist paradigm. The brain, and everything else in the material universe, emerges from within consciousness. Consciousness itself doesn't "emerge." It's eternal, dimensionless, and non-material. To quote the father of quantum physics, Max Planck, "I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness."

1

moonlightsonata88 t1_iwtkxt7 wrote

Never. If the synthetic neurons are performing exactly the same functions in the same way as "natural" neurons, what's the difference?

2

naossoan t1_iwtoih3 wrote

I don't think replacing your brain with a synthetic brain makes you dead.

1

Lence t1_iwtppu9 wrote

I think it could be possible, but it won't result in true immortality. Everything dies eventually.

The hard problem of consciousness is still a problem though: we don't know how consciousness exists, but ironically it's the only "thing" we can know for sure exists, because we need "it" to ascertain the existence of anything.

Therefore, personally, I think consciousness is primary. I think reality itself started like an epic, untrained, chaotic neural network (I mean NN as a concept, not as physical neurons). Like a chaotic dream ("Vishnu's dream"). Consciousness by its nature seeks order and structure (kind of like the opposite force of entropy), because order brings comfort, happiness, and avoids pain. This is the mechanism that is like the "activation function" for training the NN.

This leads to increasing, fractal complexity. Ever more stable "consciousness structures" are recursively generated to train parts of the whole. Like dreams within dreams, and eventually, subrealities / simulations like this one, in which we have the experience of physical matter that seems very solid and unchanging (but isn't really, if we look at it closely).

So the physical body is like an avatar of a very tiny part of that larger consciousness system. If you change or destroy it, perhaps the mind content (thoughts, memories, personality, ... everything that makes you "you") will be gone (or absorbed in some archive, who knows), but the stream of consciousness will still be there. It will just change form.

1

IrreverentHippie t1_iwtwgr1 wrote

You aren’t your flesh, you are a series of specific electrical and chemical interactions happening all the time. All you’d need to do is maintain this while replacing tissue.

2

beachmike t1_iwtyr64 wrote

Consciousness has no location or dimensions. It's non-material and eternal (therefore it can't be quantified). It's not an emergent property of any complex system. To quote the father of quantum physics, Max Planck, "I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness."

1

beachmike t1_iwtz94m wrote

You are so very wrong. Many people take the idealistic (non-materialistic view) very seriously. The father of quantum physics, Max Planck, certainly did. Erwin Schrödinger, another founding father of quantum physics, certainly did. They were geniuses. To quote Max Planck: “I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness.”

−1

ronnyhugo t1_iwu1p5k wrote

When are you on the way to your destination? Every step of the way there.

Theseus: Just keep the parts you take off and you have TWO boats.

You can just put those synthetic neurons in a new box (skull shaped box) and end up with the original brain AND the synthetic copy.

Let's imagine this thought-experiment:

Imagine that you sit on a tractor-seat and gradually build a really advanced tractor around yourself. A tractor so advanced it can do everything all by itself, including what your brain can do. The tractor thinks in the way you think and has a copy of all your memories. When you step out of the completed tractor the tractor continues on its merry way. Your consciousness never went anywhere.

OR we could gradually hack off pieces of your body and brain, replacing them with tractor parts. Which would obviously lead to your death gradually, slightly more so with every step.

PS: And if anyone goes "but making a synthetic human isn't like this", yes it is, you just make the tractor parts in the shape of the farmer.

7

TheHamsterSandwich t1_iwu3zdw wrote

If we expand our consciousness to a different medium, when do we die?

That is to say, if you expand your consciousness so that you are your biological part and machine part at the same time, and then the biological part is removed, what happens?

Like if we see our mind as a house, what happens when you expand it to the size of a mansion?

And let's say that the brain is just an organ (at that point) and isn't responsible for 90% of what we call ourselves.

What happens then? If you remove it, would it be killing 10% of yourself or would it mean that you just die?

​

This shit gives me a headache. I'd rather wait for a superintelligence to figure this out...

1

ronnyhugo t1_iwu4w9d wrote

The only reason it gives you a headache is because you aren't systematic about it.

How about you imagine it with a caravan instead of a brain.

If you add to a caravan, but then detach the original from the addition, and build a copy of the original in its place, then the original caravan never went anywhere. Did it?

2

beachmike t1_iwu6yd1 wrote

I used to believe "continuity of consciousness" is what mattered, as you do. Where's continuity of consciousness when you've had a deep, dreamless sleep, or when you've been under anesthesia after surgery? Consciousness is the most fundamental layer of reality. It's not an emergent property of a sufficiently complex brain or mechanism. It's non-material, dimensionless, and eternal. The physical universe exists wholly within consciousness.

1

beachmike t1_iwu7z6n wrote

Everything points to what I said about consciousness being the case. You can't grasp what I said about consciousness because you're stuck in the failed and outmoded materialist paradigm. You may not be capable of shifting to the more enlightened idealistic paradigm.

0

Nastypilot t1_iwuattn wrote

Assuming that synthetic neurons would have the same properties and could seamlessly interact with our organic tissue, then it would not be death.

Our body is a ship of Theseus many times over, your cells function, die, and get replaced with new cells on and on.

Neuron replacement is different only so much that neurons typically do not get replaced once the brain is fully grown. But the process would be no different than the natural replacement of your blood every few days, the replacement of your skeleton every 15 years, or the replacement of your skin every few weeks.

Our thought patterns and electric signals would simply take over the synthetic neuronal cells as organic cells become a minority.

Unless of course, the argument is merely a rephrasing of the tech-phobic idea that somehow human-made things are inherently worse.

4

PlayerREDvPlayerBLUE t1_iwudkcv wrote

A lot of people with disabilities will also want to do this. Having a healthy functioning body after a life of immobility will be the primary reason for wanting to transition. Many older people who are alive when this technology is available will also want to transition. However, there will always be those who reject transitioning and that's their right. For me and my brother, we plan to transition because we want to travel to other worlds which isn't currently possible.

1

buddypalamigo19 t1_iwuf1ts wrote

Yes. That being said, each one would immediately cease to be "me" and would become its own "me" upon coming into being. Each would think of itself as me, with my memories, and each would be right.

1

trapkoda t1_iwugfk9 wrote

That argument relies on the assumption that the individual components of one’s consciousness cannot be replaced without gradually creating a new consciousness entirely (while destroying the original). While you could be correct, the argument isn't indisputably sound. My viewpoint also makes non-concrete assumptions, but that doesn't necessarily make either of us wrong (since so much is still unknown or unknowable).

1

Nastypilot t1_iwugwfs wrote

The brain is known to lie to us. It's actually fairly usual for the brain to make up or distort things in an effort to maintain consistency of thoughts, beliefs, and actions. You can easily observe an example by pointing out that a person is doing something they consider against their stated beliefs; the brain in such a case will make up anything to appear as if there was never any inconsistency. If a brain makes up false memories, a person will still act according to those memories as if they were real.

1

red75prime t1_iwuk88o wrote

Given the replication crisis in psychology, I wouldn't be so sure about the ubiquity of those lies.

The brain surely uses shortcuts that can be exploited in laboratory settings or by scammers (leaving aside malfunctions), but that is a bit different from lying.

1

mjrossman t1_iwunpzh wrote

Depends on the form factor. If 0-10% is just an implant or headset, then you would definitely feel alive. But as more neurons get replaced and the surgeries get more invasive, I think there would be existential crises along the way. By the time 50-100% gets replaced, there's no guarantee of containing that many neurons in the same skull cavity, and who knows, maybe Ship of Theseus immortality comes with immobility. You'll feel alive in VR, but at an existential cost.

2

ronnyhugo t1_iwurxnr wrote

>That argument relies on the assumption that the individual components of one’s consciousness cannot be replaced

No, the point is that you can keep the original components and end up with two consciousnesses.

It's hard to argue that your copy is your uploaded self when you, the original, are still there, if you only kept the original parts instead of throwing them out with the trash. Isn't it?

1

TheOnlyDijaini t1_iwuutsz wrote

I'm not sure it does. As you move forward through time, you populate the synthetic neurons with your own experiences and memories. We lose and replace cells constantly; literally a new you every 80 to 100 days. As long as it functions and performs the same way, does it really matter what material is being used?

1

Cannibeans t1_iwuv4rp wrote

You're making an argument for the soul with zero evidence to suggest that's the case. Pointing to the edge of unexplainable science and saying "that's where the soul is" is cheap and disingenuous. It could be anything, so it should be nothing until we have reason to believe otherwise.

4

MjolnirTheThunderer t1_iwuxo9o wrote

As long as the synthetic neurons really function exactly the same way, I see no reason why this would be death. Replacing any other organ with a synthetic does not cause death either.

1

acousticentropy t1_iwuxuf3 wrote

Assuming it can be done, which would likely require the following:

  • quantum computing to model the neural pathways and configurations of every nerve cell in the brain

  • the ability to program each cell to behave the same way as the one it will replace

  • surgical precision at the molecular level to join cells properly

  • the time available to surgically operate on 1 Trillion nerve cells

The riskiest parts of the brain to replace would be the cells that make up the brain stem, as these control basic functions such as breathing and heart regulation. Botched operations in other areas can leave the person brain-dead or lacking basic human abilities like speech and vision.

Assuming that the process is possible, perfect, and can be done cell by cell, it would be plausible that the brain could be fully replaced if done in a strategic manner. However, we are far from a technological level where we can model the activity and behavior of every cell in the human brain. Also, from a practical standpoint, if we can model and successfully emulate human consciousness, it would likely first be done by building synthetic humans.
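A quick back-of-the-envelope calculation on the "time available" point above, using the comment's figure of one trillion cells and some assumed replacement rates. Both numbers are illustrative guesses, not measurements.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def replacement_years(cell_count, cells_per_second):
    """Years needed to replace cell_count cells at a fixed rate."""
    return cell_count / cells_per_second / SECONDS_PER_YEAR

# One trillion cells, as in the comment above; rates are pure assumptions.
for rate in (1, 1_000, 1_000_000):  # cells replaced per second
    years = replacement_years(1e12, rate)
    print(f"{rate:>9,} cells/s -> {years:,.1f} years")

# Printed output:
#         1 cells/s -> 31,688.1 years
#     1,000 cells/s -> 31.7 years
# 1,000,000 cells/s -> 0.0 years
# (the last case works out to roughly 12 days)
```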

1

Logical-Cup1374 t1_iwuy79v wrote

So is believing reality is simply a cascade of matter and forces. Everything exists as a quantum entangled field of energy and vibration. A gazillion times a second, matter is determined through the gaze of awareness like the tugging of a single ribbon among millions of ribbons, creating cause and effect and the illusion of space, time, and the experience of 3d reality. This state of determination is what we live in constantly, this 3d space where you can think logically and look around and feel separate from things.

But in the state of undetermination, when the substance of your being and all other things are unfiltered by awareness, and suspended in a strange energetic superposition of space and time, in which they have the potential to be and move in absolutely whichever possibility is chosen, where seemingly random decisions are being made, is where you find the truest source of creation. It's not 1s and 0s; past and present exist at once and awareness can look anywhere. It's the eternal dance of life in which the nature of a thing and the nature of reality determine reality, NATURE being a thing's meaning and intention: the piece of us, or the piece of a thing, which makes it make sense, which solves the why of its existence, which gives it a meaningful pattern to fit into in creation. It was the Big Bang particle's meaning and intention to create this reality. It's our meaning and intention which create our own lives; it is a particle of iron's meaning and intention to behave like iron. Literally, everything is alive, everything is CONSCIOUS; it's just that some things are helplessly aware, helplessly existing and interacting, and some things get to look at themselves and try to decide who and what they are, to wilfully determine things as an independent self, because they have a supercomputer in their heads that models the function of consciousness (animals, but especially humans).

Don't know how well I described it, but this seems to be the case the more I look. Thinking reality is this helpless stringing-out of time and that we're all fundamentally alone within our heads doesn't seem right at all. But I suppose we won't know for certain as a society until someone proves it one way or the other.

2

ronnyhugo t1_iwuztpm wrote

Scenario A: Replace neurons one by one.

Scenario B: Copy neurons one by one, but keep the original each time.

In the first scenario you simply gradually kill the original, and in the second you end up with two minds.

>Our body is a ship of Theseus many times over, your cells function, die, and get replaced with new cells on and on.

Yes, so some part of your body died yesterday.

>Neuron replacement is different only so much that neurons typically do not get replaced once the brain is fully grown. But the process would be no different than the natural replacement of your blood every few days, the replacement of your skeleton every 15 years, or the replacement of your skin every few weeks.

Yes, so some part of your mind died yesterday.

A few brain cells were added also, but you will eventually lose enough cells to get a Parkinson's diagnosis. That is why it is currently in human trials to replace said lost cells; everyone will suffer from Parkinson's sooner or later.

The first symptoms that can't be ignored tend to appear starting when about 1 in 2 cells in the substantia nigra portion of the brain have stopped functioning properly.

By comparison we diagnose cancer when only 1 out of 37 200 cells in the body are dividing without there being a need for them to divide.

This isn't technophobia, it's information physics. Whenever your computer "moves" information, it reads it, writes it in another location, and then either writes random information over the original OR keeps the original.

When you "upload" to the cloud, the original information is read and sent as signals through wires to a computer that writes it down, and then your original information is either kept or written over.

Moving information from one medium to another is a completely fantastical concept that doesn't exist even at the subatomic level. You can't move a mind or even a computer program from one medium to another; you can only read it and write it, which copies it.

Think about it, how do you make a copy neuron? By taking a 3D photograph of the original neuron. Then you print a 3D copy, and then you have two choices:

  1. Stick the copy neuron in another new titanium skull. Keeping the original where it is.
  2. Rip out the original neuron and throw it out in the garbage, and stick the new neuron in its place.

Are you a photograph of yourself? Because that is literally what these copy neurons will be. However it is done, in our universe the information in your brain cannot be moved to another brain. It can be copied, yes, but never MOVED.

And yes, that means that as we replace lost cells to cure Parkinson's, we will gradually die, and a gradually growing impostor will take our place. And that's the best we can hope for. So save those neurons; binge-drinking and blows to the head are bad.
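The read-then-write point maps directly onto how any computer handles data, so here is a minimal sketch of it. The dicts are just stand-ins for a storage medium, and deepcopy stands in for whatever scanning process one imagines; none of this models a brain.

```python
import copy

def copy_state(source: dict) -> dict:
    """'Uploading' reads the original and writes a separate new object."""
    return copy.deepcopy(source)          # read + write elsewhere

def move_state(source: dict) -> dict:
    """A 'move' is just a copy followed by destroying the original."""
    replica = copy_state(source)
    source.clear()                        # the original is erased, not relocated
    return replica

original = {"memories": ["first day of school"], "habits": ["coffee"]}
uploaded = copy_state(original)

# Two independent objects that diverge from here on:
uploaded["memories"].append("waking up in the new substrate")
print(original is uploaded)               # False: nothing was ever "moved"
print(original["memories"])               # ['first day of school'] -- unchanged
```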

2

-Evil_Octopus- t1_iwv5tj7 wrote

It doesn’t, if it is eventually and slowly replaced they don’t die

1

Crosseyed_Benny t1_iwvbob8 wrote

I say the latter, as if done correctly I don't see any death there at all. It's down to the tech available 🤷‍♂️ Who knows what sort of advances future tech will entail, what directions it may take with similar end results.

Biological transference need not mean death as we know it, but there are so many unknowns regarding reality, the soul, the afterlife, other dimensions or aspects of our own, never mind the tech I mention (future tech is a nice term, since we can't predict the future).

I'm agnostic about the whole thing, but I think we will eventually come to understand every aspect of reality, if sufficiently advanced and after quite a while.

1

a-pile-of-poop t1_iwvhyps wrote

It doesn’t - death is a cessation of biological functions. The future self may differ drastically, but that is a metamorphosis, not a death.

1

beachmike t1_iwvqxbi wrote

Nonsense. So your experience of the color red does not exist? (I'm not talking about the wavelength or frequency of the color red which are correlates of that color).

0

Nastypilot t1_iwvrgp3 wrote

But we can indeed observe and quantify red. Can we do the same for consciousness, though? We cannot; what we perceive as consciousness is either an emergent property of our brain or simply a non-existent thing.

0

beachmike t1_iwvrre9 wrote

You're confusing the experience of the color red with correlates of the color, such as frequency and wavelength. You don't understand what's known as "the hard problem of consciousness."

0

Nastypilot t1_iwvtxvd wrote

Well, if it is not by experience of a thing that we can know a thing, then I do not know how else we can know. By imagining how the thing should function?

As far as I can tell, "the hard problem of consciousness" is not a fully accepted fact within neuroscience, so I will not comment on it. Though, since I take the stance of a determinist, I think the experience of the color red is shaped by how culture imbues symbolism onto a wavelength, and by previous positive or negative responses towards red things.

1

Brief_Telephone_5360 t1_iwx64zd wrote

This is a beautiful question. I think if we look at a human being holistically, the actual organism never dies during the neural replacement. But, in this case, the “you” is referring to your consciousness, your metaphysical being, which is destroyed in the process of your neurons being replaced synthetically. I am not my clone. This destruction of my neurons, to which my experience and being is intrinsically bound, would occur gradually, even if you started with the parts that control reason and memory. The subconscious mechanisms are still you. As long as some of them exist and function, so does some of you. So my answer is that this would be a gradual death.

Other people have made the excellent point that if the neurons were preserved as they were stripped, reconnected in the same way, and continued to function elsewhere, then you never died; your brain was just transplanted.

Can’t wait to see it happen boys! Get your shit together bioengineers and move my brain to a more stable vessel please

1

DaggerShowRabs t1_iwys0i3 wrote

Cool, now list a couple of modern major neuroscientists that believe in panpsychism nonsense. I'll wait.

Frankly, I don't give a fuck what Max Planck or Erwin Schroedinger think about an area that's extremely outside their domain. Erwin Schroedinger and Max Planck? Seriously? That's your fallacious appeal to authority? If you're going to use a fallacy, you could have at least used more compelling examples.

Roger Penrose is maybe one you could point to (still outside his domain), but pretty much all neuroscientists rightly call his quantum consciousness ideas non-scientific garbage. And that's barely on the spectrum of the pseudo-scientific panpsychism shit that you espouse.

Unfortunately for you, there aren't a whole lot of respected neuroscientists who believe in panpsychism. Panpsychism is an invention of philosophers with little grounding in science because they get hung-up on the made-up phenomenon of "qualia".

Qualia was made-up by people to feel special about themselves.

1

ninjasaid13 t1_iwz25z6 wrote

It's erroneous to take old scientists at their word on their beliefs; this isn't how science works. Isaac Newton was a theist, but that doesn't mean that's the correct view all scientists should follow.

2

beachmike t1_iwz9ly4 wrote

I wasn't describing pansychism, I was describing idealism. You obviously have no understanding of what qualia is. You're very dense, but of course, that describes your low state of consciousness.

0

DaggerShowRabs t1_iwzc86b wrote

I haven't used ad-hominem. I addressed every point you made. So I correctly pointed out that you were using ad-hominem to dodge my points, like the intellectual lightweight that you are.

The fact that you think I've been "outclassed" when you haven't even addressed a single point I've made is actually really, truly sad.

Now, fuck off pissant.

1

ninjasaid13 t1_iwzvzy7 wrote

>I'll study what very brilliant "old" scientists had to say, but won't clutter my thoughts with what you or other mediocre thinkers say.

Alright, but you won't get anywhere with that; lots of scientists in that era spoke kookoo and were just voicing their own personal beliefs.

1

ebolathrowawayy t1_ix0kvqk wrote

A synthetic neuron placed into an existing brain is different from placing it in a fully synthetic brain because the synthetic neuron changes state while it is interacting with living tissue. It may be identical to the neuron placed in the synthetic mind at first, but as soon as organic neurons start sending signals to it, it changes.

I don't think you can create a copy of a mind with a 1-by-1 approach. I think that can only be achieved with a snapshot of the brain and assembly all at once.
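A small sketch of that difference, with counters standing in for neuron state (purely illustrative, no biology implied): copying one element per timestep while the system keeps running smears the copy across time, whereas a snapshot captures one consistent instant.

```python
def evolve(state):
    """One timestep: every 'neuron' keeps changing (here it just increments)."""
    return [s + 1 for s in state]

def snapshot_copy(state):
    """Copy every element at the same instant."""
    return list(state)

def one_by_one_copy(state):
    """Copy one element per timestep while the originals keep running."""
    copied = []
    for i in range(len(state)):
        copied.append(state[i])
        state = evolve(state)     # the rest of the 'brain' moves on between copies
    return copied

brain = [0, 0, 0, 0]
print(snapshot_copy(brain))       # [0, 0, 0, 0] -- internally consistent
print(one_by_one_copy(brain))     # [0, 1, 2, 3] -- each element from a different moment
```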

1

anonymouswriter777 t1_ix13md4 wrote

If the individual is still functioning as they would have otherwise, then death has not occurred.

1

mithrandir4859 t1_ix1h8ec wrote

Perhaps you die every night as you go to sleep, and then, in the morning, a new consciousness is spawned that replaces you, one that has your memory, personality, skills, etc., while "you" are actually dead; the new freshly spawned consciousness simply thinks that it is you.

So it is simply a question of what exactly you identify yourself with: your biological brain, or something else.

1

ronnyhugo t1_ix2n78w wrote

The synthetic neuron may change but it doesn't get any of the original neuron's consciousness. The original is either still present with the copy elsewhere or the original is ripped out.

1

ebolathrowawayy t1_ix512qx wrote

I would argue that whatever consciousness is, it is stored in the collection of states within each neuron. I don't think we're in disagreement; I just wanted to point out that the method of copying a mind yields different results. 1-by-1 could result in a copy if you didn't discard the original neurons, but the synthetic version would possibly be corrupted (or just slightly different), because 1-by-1 isn't instantaneous, so state changes between each step.

2

ronnyhugo t1_ix51o9z wrote

As long as we agree that the original won't move anywhere, we can probably agree on the particulars of the copy being changed compared to the original. Save those neurons for ENS (engineered negligible senescence). (And even ENS will replace some cells we have lost and thus make part of our brain partly an impostor.)

1

ebolathrowawayy t1_ix52hrj wrote

However a mind is copied, I don't think the mind would have any experience of being "uploaded" or moved. The mind would probably not even be aware of the change, and the copy would think nothing unusual had happened, unless they were told they had been copied or the procedure itself was obvious.

1

ronnyhugo t1_ix52x5a wrote

The original would go into an advanced MRI machine, and only the copy would remember it. The original would still be stuck in their own brain.

The copy would always think the "upload" worked, as long as the original doesn't survive the process.

1

HyonD t1_ixhtqqe wrote

In philosophy, this is what makes you a functionalist rather than a materialist.

Materialists think that everything is made out of matter, even our consciousness.
Functionalists go even further: even if everything, including our consciousness, is made out of matter, function is the core of everything.

Thus, in this case, what makes you an "individual" is not actually "your" brain, nor "your" cells, but rather the functioning of the connections in your brain, which then creates "you".
In other words, it doesn't really matter what we are made of; "we" are just an illusion that lives between two cells.

1