Submitted by [deleted] t3_11aedho in singularity

https://consc.net/papers/qualia.html

The fading qualia thought experiment imagines a scenario in which part of a person's brain is replaced by a simulation hooked up to the rest of the brain via a cybernetic interface, so that the biological part cannot tell that the other part is being simulated.

What the experiment illustrates is that if only biological substrates were conscious, then the biological part of the brain would be experiencing a state of consciousness that contradicts its physical state. Essentially, if a computer were able to compute the exact functions of neurons, then it would have to be conscious. Perhaps there is an as-yet-undiscovered element of neuronal activity that is incomputable, which would cause this hypothetical system to collapse and thus prevent the paradox.
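
To make the functional-equivalence step concrete, here is a minimal sketch in Python. The leaky integrate-and-fire model and its parameters are made up for illustration, not a claim about real neurons; the point is only that a replacement which reproduces the same input-to-spike mapping is indistinguishable to anything downstream.

```python
# Toy illustration of the functional-replacement idea behind fading qualia:
# if the replacement reproduces a neuron's input -> spike mapping exactly,
# nothing downstream can tell the difference. The model and numbers here
# are illustrative assumptions only.

class LeakyIntegrateFireNeuron:
    """Stand-in for the 'biological' neuron: a simple leaky integrate-and-fire model."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.v = 0.0              # membrane potential
        self.threshold = threshold
        self.leak = leak

    def step(self, input_current):
        self.v = self.v * self.leak + input_current
        if self.v >= self.threshold:
            self.v = 0.0          # reset after spiking
            return 1              # spike
        return 0                  # no spike


class SimulatedNeuron(LeakyIntegrateFireNeuron):
    """The 'silicon' replacement: computes the exact same function;
    here only the class name (the 'substrate') differs."""
    pass


inputs = [0.3, 0.4, 0.5, 0.1, 0.8, 0.2, 0.9, 0.0, 0.6, 0.7]
biological = LeakyIntegrateFireNeuron()
replacement = SimulatedNeuron()

spikes_bio = [biological.step(i) for i in inputs]
spikes_sim = [replacement.step(i) for i in inputs]

# Identical spike trains: a downstream neuron that only sees spikes
# has no way of knowing which unit produced them.
assert spikes_bio == spikes_sim
print(spikes_bio)
```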

To make an analogy, I think consciousness is like gravity. An observer cannot tell the difference between real and simulated gravity: the force experienced in a constantly accelerating room is exactly the same as the force caused by gravity. In the same way, it is logical to assume that any system that generates the same information exchange, at the resolution at which human consciousness emerges, must experience human consciousness.
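
A back-of-the-envelope version of the equivalence principle, with illustrative numbers only:

```python
# The scale reading (in newtons) is the same whether the room sits in a
# gravitational field g or accelerates upward at a = g in empty space.
mass_kg = 70.0
g = 9.81   # gravitational acceleration at Earth's surface, m/s^2
a = 9.81   # acceleration of the room, m/s^2

weight_in_gravity = mass_kg * g      # felt while standing on Earth
weight_in_accel_room = mass_kg * a   # felt in the accelerating room

print(weight_in_gravity, weight_in_accel_room)  # both print the same value (~686.7 N)
```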

20

Comments

SnooHabits1237 t1_j9ryix4 wrote

I'd love to see more about this experiment and others like it. It's fascinating. In my (limited) opinion, I think we will find out that consciousness itself is an abstraction, like how a chair is a chair because we say it is.

6

turnip_burrito t1_j9rj81h wrote

Maybe, who can tell?

4

LambdaAU t1_j9vukof wrote

This will actually be a testable hypothesis soon, so we could gain a lot of information toward understanding consciousness.

1

OtterPop16 t1_ja6jspy wrote

Can you test for qualia or subjective experience? I don't think you can. We can make inferences for living things by observing their nervous systems, particularly those similar enough to ours, like vertebrates and especially mammals.

But for other things? What does it feel like to be a nematode? Or a starfish with no central nervous system?

I mean, epistemologically, you can only know that you're having subjective experience.

1

LambdaAU t1_ja6tc7j wrote

You could have this experiment done on yourself and then document your experience. I suppose you would be able to tell whether or not your consciousness was expanded and make deductions based on that.

1

ChurchOfTheHolyGays t1_j9s1jii wrote

What if the brain can interface with the device on a hierarchical chain instead of as an equal? It may be able to delegate computing tasks to the interface while retaining everything related to consciousness in the biological part.

You would have to find the minimum biological brain size at which a person is alive and functioning well, then connect the interface and show that, with the device attached, you can reduce the biological brain to a size that would otherwise be impossible and still have the person retain consciousness. Then consciousness must be shared with the machine.

4

petermobeter t1_j9sgoe2 wrote

i have another question: if any dynamic system in the same shape as a brain is conscious, regardless of material…….. are all dynamic systems various degrees of “conscious” depending on their complexity? is the earth’s ecosystem conscious? is an anthill conscious? is the tokyo subway system conscious?

what do they require to be conscious systems, as opposed to dynamic systems that arent conscious? inputs & outputs? feedback loops?

edit: oh and also: why am i stuck inside the consciousness of my own brain instead of, say, the consciousness of a stray dog in mexico? my memories make me think im me…… but if i fall asleep, will i wake up as a stray dog in mexico due to that dog having memories that make it think it’s a dog? what holds me here in my brain day after day, sleep after sleep?

3

[deleted] OP t1_j9sr20w wrote

Well, this is the essence of the hard problem. We don't know. Adding to that dog-in-Mexico thought experiment, the opposite scenario is the split-brain experiments: when a person's brain is split in two, each half objectively behaves as a separate agent, yet somehow both identify as the same agent.

6

Surur t1_j9sloqf wrote

I personally believe any responsive system is conscious to a degree, reflected by its ability to sense, compute and respond. The more complex and rich that space is, the more conscious the system is.

For example, a light switch is conscious of its state, on and off, while the Tokyo subway system is not as conscious as a cell, since it has fewer inputs, fewer actions and fewer responses, but it is a lot more conscious than the light switch.
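
To make that ordering concrete, here is a toy score, nothing more than made-up numbers and a made-up formula, not a real measure of consciousness: count roughly how many distinguishable inputs a system can sense and how many distinct responses it can produce, and score it by the log of the possible pairings.

```python
import math

# Toy ranking only: every count below is invented purely to illustrate the
# claimed ordering light switch < Tokyo subway < single cell.
systems = {
    "light switch": {"inputs": 2, "responses": 2},          # flipped / not flipped -> on / off
    "tokyo subway": {"inputs": 10_000, "responses": 500},   # sensors and signals vs. train movements
    "single cell":  {"inputs": 1_000_000, "responses": 100_000},  # receptors vs. expression/behaviour
}

for name, s in systems.items():
    score = math.log2(s["inputs"] * s["responses"])  # bits of input/response pairings
    print(f"{name:>12}: ~{score:.1f} bits")
```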

3

turnip_burrito t1_ja01qzg wrote

What defines the spatial borders of a responsive system?

1

Surur t1_ja0585j wrote

Is that important? The spatial borders are the reach of your control.

1

turnip_burrito t1_ja0v8bq wrote

But for a human being, the spatial borders are smaller than our reach of control.

1

Surur t1_ja1iwc1 wrote

Is it really? When we control equipment we seem to adopt its borders pretty well. We can slip into roles, such as a person who controls a country, pretty easily.

1

turnip_burrito t1_ja1mps9 wrote

No, what I mean is you only feel stuff directly touching your nervous system. That's what I mean by spatial borders.

The spatial borders of what a person experiences are their nervous system. Why? And furthermore, what is the equivalent for a light bulb or a piece of carpet?

1

Surur t1_ja1p6jo wrote

But that is not true. As I explained, we are easily able to expand our spatial borders to include machines we control.

And that question is not reasonable to ask for something which has only two states, like a light switch, or none, like a carpet.

1

turnip_burrito t1_ja1pqh4 wrote

That's not quite what I mean by spatial borders. I don't mean stuff you are causally connected to. I mean something different. I'm not going to go into any more detail though since it's a bit boring.

1

petermobeter t1_j9s88ne wrote

question: does this mean that if someone perfectly recreates my brain’s patterns (in say….. silicon) a thousand years after i die, then my death will feel like a short nap, after which i wake up (in my thousand-years-hence body)?

or will the recreation of my brain a thousand years from now simply THINK it's a continuation of me after death, meanwhile my real stream-of-consciousness ended permanently when i died?

1

[deleted] OP t1_j9sapgr wrote

Here is a thought experiment for you. Imagine that at age 5 you were frozen, your mind was scanned, and your body was destroyed. Now imagine that, at this moment, someone created an entity that is exactly the same as you are now (memories and everything). Now ask yourself: is this objectively any different from your current existence?

You in this very moment are literally a "clone" that thinks it's the same being that existed many years ago. How you arrived at this belief of continuity is completely immaterial from the perspective of your brain, as the only way it can consciously experience continuity is through memory.

3

petermobeter t1_j9setmv wrote

i understand that, i get it, youre sayin the 2 scenarios i proposed are the same thing because If it THINKS it’s me, it IS me

but……. im just really worried that my stream of consciousness is gonna end permanently when i die, regardless of future technology enabling full-brain emulation of ancestors.

will i wake up after i die, or will “i” wake up after i die? please please please tell me it’s the former 🥺 or that the latter includes the former!

1

[deleted] OP t1_j9ss14m wrote

Objectively speaking, "you" simply does not exist in the first place. The "you" that exists now is a different being than the one that existed when you were 3. What I think is that the self is a delusion similar to Cotard's syndrome.

https://www.youtube.com/watch?v=YBOfgTP0nVg

When people suffer from Cotard's syndrome, they are convinced that they are dead despite evidence to the contrary. Selfhood seems to me a similar delusion, but one that we all share. We are convinced that we are discrete and immutable entities despite all evidence pointing to the contrary. The difference between selfhood and Cotard's delusion, though, is that belief in the self has evolutionary benefits, as it allows us to easily conceptualize the things we need to do to survive.

4

turnip_burrito t1_j9vv9m1 wrote

It may even be that we are also different second to second. 🤔

1

Glitched-Lies t1_j9u2xyu wrote

The scientists who have made contributions to the problem you describe, the problem of incomputability and simulation versus authentic consciousness, like Roger Penrose with his not-100%-convincing theory of Orchestrated Objective Reduction, have accordingly had it pointed out to them that it's a fallacy to say at what point something is incomputable versus not. However, considering how counterintuitive quantum mechanics is under empirical experiment, you might be able to reconcile this fallacy. And if anything about science means anything, then this is the first approach and the nearest neighbor to what would be objective truth on the matter.

What someone needs is a bridge between this problem and epistemology, not the ontologies of simulation versus authentic, and that means deep work on how to even approach a science of consciousness. And how could anyone come to a conclusion over this? I don't think it will happen within the next 50 years. However, I think, like most scientists, that there is a definitive answer which can be ruled "certain".

1

Glitched-Lies t1_j9u4hou wrote

Obviously, people like John Searle (who claims naturalism is key) say the key point is syntax versus semantics, but this also suffers from a bit of a fallacy in that it doesn't directly say what it means. A paradox, basically.

1

[deleted] OP t1_j9w62zm wrote

If the Penrose picture turns out to be correct, then attempting to replace quantum neurons with classical neurons would cause the system to crash.

1

Jawwwed t1_j9v235k wrote

Consciousness is relative

1