nexusgmail t1_jacwie7 wrote
Reply to comment by streetvoyager in Scientists unveil plan to create biocomputers powered by human brain cells | Scientists unveil a path to drive computing forward: organoid intelligence, where lab-grown brain organoids act as biological hardware by chrisdh79
Imagine if those cells were even somewhat aware, and were forced into repetitive number crunching with no means to understand the cause of their bondage, or to ever escape, or even die. It would make for quite the horrific reveal at the end of a horror movie.
Wandering-Zoroaster t1_jad2jtj wrote
I think you mean self-aware?
It’s an interesting question. That being said, whatever sentience they would or wouldn't have would depend on completely different circumstances than the ones that produced us humans, so it’s fair to say it probably wouldn't (behave like a human)/(have human desires)
nexusgmail t1_jaeiold wrote
Yes: self-aware.
I would argue that all living things have the same desires you might call "human", albeit simplified, and likely without the added complexity made necessary via the perception of tribe or familial group as an extension of self. Literally every single human desire is tied to survival via the neuronal survival-mechanism of the brain. Can you find a single thought you've had today that isn't (even loosely) related to survival/procreation? We are almost constantly attempting to seek out safety/security, comfort, and control; and to avoid danger, discomfort, or uncertainty. I'm not sure what "behave like a human" is specifically referring to, but I can certainly see animals following the same survival urges that we do.
I do agree that, in this imagined scenario, the sentience might develop differently than we can see in ourselves: having different parameters in which to define its sense of self/identity, and that its survival-mechanism movements might be calibrated via a difference in perspective and the definition of its own sense of identity.
I'm not, of course, saying this is all so, but I imagine it would be somewhat unethical, even arrogant, not to consider the possibility.
Strategy_pan t1_jadl7nw wrote
Maybe the cells would try to imagine a whole new universe just to entertain themselv... Oh wait.
SnoDragon t1_jadyn6t wrote
200 quatloos on the newcomer!
nexusgmail t1_jaej8rj wrote
I couldn't agree more! I imagine humans creating massive architectures of this organic technology, before going extinct and leaving it all in the hands of AI, who eventually abandon it and leave it to its own devices in this way. Universes within universes within awareness.
BuckyRB6 t1_jaf4l65 wrote
The Hive Mind is coming.
SeaworthinessFirm653 t1_jadd3a0 wrote
Consciousness is logically computable. Consciousness is defined by architecture, not by whether something is organic or responds to electric pulses. You can theoretically store consciousness on a computer as a program with sufficient input/output.
Worrying about nerve cells becoming conscious is a little bit of a misdirected concern. Advanced AI deep learning architectures are far more concerning.
Crazy-Car-5186 t1_jaddq0h wrote
Asserting a belief without offering testable points doesn't enrich the discussion.
SeaworthinessFirm653 t1_jadelan wrote
Consciousness is a function whose input is environmental stimulus and whose output is a cyclical thought and/or a physical action (muscle contraction). The more environmental-semantic information the entity encodes in its memory, the more “conscious” it is, but consciousness is not binary.
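As a toy sketch of that framing (purely illustrative; obviously nothing in it is conscious, and the names are just placeholders):

```python
# Toy sketch of "consciousness as a function": stimulus in, action out,
# with internal memory. Purely illustrative; nothing here is conscious.

class Entity:
    def __init__(self):
        self.memory = []  # encoded history of past stimuli

    def step(self, stimulus):
        self.memory.append(stimulus)      # input: encode the environment
        if stimulus == "danger":          # processing: if/then rules
            return "flee"                 # output: physical action
        return "continue"                 # output: default behavior

e = Entity()
print(e.step("food"), e.step("danger"), len(e.memory))  # continue flee 2
```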
Logic gates form if/then statements that, when assembled together, create a system of behavior that acts in somewhat logical ways. Biological neurons in the human brain form these too.
Consciousness inherently requires at least some memory, input, and processing. Every neuron in the human brain is technically computable because it’s just input and output of electrical signals.
A nerve cell is effectively just an analog neuron with a few extra properties, so it's not logical to assume that consciousness is just a bundle of nerve cells. It's a very architecture-dependent bundle of if/then clauses and memory that, when combined, simulates consciousness.
If a system can be described by if/then, then it is computable.
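To make the "if/then means computable" point concrete, here's a toy sketch using the classic McCulloch-Pitts idealization of a neuron as a threshold unit (an idealization, not a model of real nerve cells):

```python
# McCulloch-Pitts-style sketch: an idealized neuron as a threshold unit.
# Illustrates the textbook sense in which networks of such units are
# computable; real nerve cells are far messier.

def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0  # the "if/then" at the core

# Threshold units can implement logic gates:
def AND(a, b):
    return neuron([a, b], weights=[1, 1], threshold=2)

def OR(a, b):
    return neuron([a, b], weights=[1, 1], threshold=1)

def NOT(a):
    return neuron([a], weights=[-1], threshold=0)

# NAND is universal: any boolean function, and hence any finite
# if/then system, can be built from networks of these units.
def NAND(a, b):
    return NOT(AND(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", "AND:", AND(a, b), "OR:", OR(a, b), "NAND:", NAND(a, b))
```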
Also, if you cut a living brain in half, it ceases to be conscious. The reason is that the architecture becomes incoherent. When you are asleep (besides REM/dreaming), you are also unconscious.
Regardless, all my points come down to this: consciousness is computable through architecture, not simply through nerve cells. Biological human nerve cells are neither necessary nor sufficient for consciousness.
Sex4Vespene t1_jaefg4j wrote
As somebody with a degree in neuroscience: you are so out of your depth. I understand the logic behind how you got there, but it is wildly inaccurate.