SeaworthinessFirm653 t1_jadd3a0 wrote
Reply to comment by nexusgmail in Scientists unveil plan to create biocomputers powered by human brain cells | Scientists unveil a path to drive computing forward: organoid intelligence, where lab-grown brain organoids act as biological hardware by chrisdh79
Consciousness is logically computable. It is defined by architecture, not by whether something is organic or responds to electrical pulses. You could, in theory, store consciousness on a computer as a program with sufficient input/output.
Worrying about nerve cells becoming conscious is a somewhat misdirected concern. Advanced AI deep-learning architectures are far more concerning.
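To make the "program with sufficient input/output" framing concrete, here's a toy sketch (purely illustrative, not a claim about how consciousness actually works): a stateful loop that reads stimuli, updates memory, and emits outputs.

```python
class ToyAgent:
    """Illustrative only: a stateful input -> memory update -> output loop."""
    def __init__(self):
        self.memory = []  # stands in for whatever internal state the architecture needs

    def step(self, stimulus):
        self.memory.append(stimulus)          # encode the input
        return f"react to {self.memory[-1]}"  # produce some output based on stored state

agent = ToyAgent()
for stimulus in ["light", "sound", "touch"]:
    print(agent.step(stimulus))
```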
SeaworthinessFirm653 t1_iru86y6 wrote
Reply to comment by code_turtle in We'll build AI to use AI to create AI. by Defiant_Swann
Yes, I agree with that. I recall an analogy: even if you magnified the brain's neurons and connections until they covered an entire city block, the density of connections would still be too great to make any meaningful observations, even with our current technology.
I don't believe any optimism is required, though, to claim that we can be simulated. Unless we exist outside of the realm of physical things, that much is given. It's impossible to make good predictions about the future where the sample size is n = 0.
SeaworthinessFirm653 t1_irq7a9n wrote
Reply to comment by code_turtle in We'll build AI to use AI to create AI. by Defiant_Swann
Yes, I made my comment with the presumption that we are talking about AGI, not just a smart calculator bot making a slightly faster calculator bot. We have created some multi-modal AI that can accomplish different tasks, but those models are computationally inefficient and predictive rather than capable of true learning (just as GPT-3 doesn't actually think logically; it's just a very advanced language-prediction model).
As far as I am concerned, the difference between consciousness and AI is that AI is an advanced look-up table built from simple logic, while consciousness involves processing stored information for semantic meaning rather than following an algorithmic process over syntactic symbols. See Searle's Chinese room thought experiment.
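To make the "advanced look-up table" point concrete, here's a deliberately crude toy (not how modern models actually work): it maps symbols to symbols with no notion of what they mean, which is exactly the Chinese room intuition.

```python
# A purely syntactic responder: it "answers" questions it has never understood.
lookup_table = {
    "hello": "hi there",
    "how are you?": "doing great",
    "what is consciousness?": "a popular question",
}

def respond(prompt):
    # No semantics anywhere: just match the input string and emit the stored output.
    return lookup_table.get(prompt, "i do not know")

print(respond("what is consciousness?"))  # answers correctly, with zero understanding
```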
AI today uses low-level logic en masse to produce (relatively) high-level thinking. With increasingly advanced neural networks, image-generation models learn increasingly complex feature hierarchies: edges, simple shapes, composite shapes, and the fuzzier abstractions built on top of them. If we extend this idea to an AI that takes simple features such as moving shapes as input and is trained to predict where those shapes will be, we may be able to reapply that scalable logic until the AI can understand complex ideas, given sufficient inputs and sufficient training data. That is far-fetched by today's standards, but not unbelievably so given how quickly AI is advancing.
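A minimal sketch of that kind of position-prediction setup, assuming PyTorch and a supervised setting where frames are labeled with the shape's position (the layer sizes and names are made up for illustration):

```python
import torch
import torch.nn as nn

class ShapePositionPredictor(nn.Module):
    """Toy CNN: maps a 64x64 grayscale frame to a predicted (x, y) position."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # low-level features (edges)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # mid-level features (shapes)
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32 -> 16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # predicted (x, y)
        )

    def forward(self, frame):
        return self.head(self.features(frame))

model = ShapePositionPredictor()
frame = torch.rand(1, 1, 64, 64)  # one fake frame
predicted_xy = model(frame)
print(predicted_xy.shape)          # torch.Size([1, 2])
```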
If the human brain is made up of computations, then an elaborate series of computations is, by definition, what constitutes our consciousness, and thus it could be created with sufficiently advanced AI models. Switching to amplitude computing for efficiency, or to compressed memory models (current memory-cell models scale linearly in space rather than logarithmically), may allow us to break through this barrier.
edit: sorry for the ramble
SeaworthinessFirm653 t1_irq2vcf wrote
Reply to comment by danielv123 in We'll build AI to use AI to create AI. by Defiant_Swann
That's actually a fair point; I hadn't considered the actual rate of growth in detail.
Edit: There is also an additional facet. Suppose the AI in question is simply an AI designed to create superior AI, and it does reproduce cyclically. Even if you restricted it to its current computational power, it would still run vastly more efficiently than humans. A human brain runs on roughly 12 watts; a computer with access to enormously more energy could plausibly self-enhance to an extreme degree. Sure, there may be diminishing, roughly logarithmic returns at some point, as with virtually any system, but the gap between a human and a machine that has reached that point of diminishing returns would still be unimaginably wide. Humans were not designed to think; we were designed to be energy-efficient, merely decent thinkers. A machine that can iterate a million times faster and is designed purely to think will inevitably pass us by a very long margin, even if its growth ultimately falls short of a true exponential. The main caveat is that creating an AI that can produce an AI superior to itself, with that same goal of producing superior AI, is incredibly difficult.
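Back-of-the-envelope version of the energy argument (the 12 W figure is from above; the machine-side power budget is an arbitrary number picked purely for illustration):

```python
# Rough energy comparison, purely illustrative.
BRAIN_POWER_W = 12            # approximate power draw of a human brain (figure from above)
MACHINE_POWER_W = 1_000_000   # assumed power budget for a large compute cluster (made-up)

ratio = MACHINE_POWER_W / BRAIN_POWER_W
print(f"Energy budget ratio: ~{ratio:,.0f}x")  # ~83,333x more raw power to spend on thinking
```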
SeaworthinessFirm653 t1_irlmi91 wrote
Reply to comment by 205049 in We'll build AI to use AI to create AI. by Defiant_Swann
Extensive media coverage gives the impression that we are far more immoral in this era than in prior ones, when that is certainly not the case. Regardless, AI ethics continues to be a growing field.
SeaworthinessFirm653 t1_irlmegf wrote
Reply to comment by Basic_Description_56 in We'll build AI to use AI to create AI. by Defiant_Swann
Just as it is ridiculous to think that a constitution will prevent violations of human rights. Still, you would probably agree that a constitution is a good idea for securing rights despite its inevitable violations, no?
Ethics is a leverage point for utilizing AI safely. Your insult doesn't do you any favors.
SeaworthinessFirm653 t1_irlmaxf wrote
Reply to comment by code_turtle in We'll build AI to use AI to create AI. by Defiant_Swann
If AI is capable of producing AI superior to itself, then logically this creates a self-accelerating intelligence that will inevitably prove superior to us. AGI is implied once AI can produce better AI that can, in turn, produce better AI.
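A toy model of that self-accelerating loop (the improvement factor, starting capability, and reference level are all arbitrary assumptions, not claims about real systems):

```python
# Toy model: each generation designs a successor slightly better than itself.
capability = 1.0          # arbitrary starting capability
improvement_factor = 1.1  # assumed: each generation is 10% better at designing the next
human_level = 100.0       # arbitrary reference point

generation = 0
while capability < human_level:
    capability *= improvement_factor
    generation += 1

print(f"Passes the reference level at generation {generation}")  # ~49 with these numbers
```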
SeaworthinessFirm653 t1_irjdpyb wrote
Reply to comment by chaos021 in We'll build AI to use AI to create AI. by Defiant_Swann
AI ethics will be critical in the coming decades.
SeaworthinessFirm653 t1_irjc2m6 wrote
Reply to comment by Devadander in We'll build AI to use AI to create AI. by Defiant_Swann
Yes. The moment an AI is capable of producing an AI superior to itself, the singularity will have been reached. However, this is not as simple as it sounds.
SeaworthinessFirm653 t1_jadelan wrote
Reply to comment by Crazy-Car-5186 in Scientists unveil plan to create biocomputers powered by human brain cells | Scientists unveil a path to drive computing forward: organoid intelligence, where lab-grown brain organoids act as biological hardware by chrisdh79
Consciousness is a function whose input is environmental stimuli and whose output is cyclical thought and/or physical action (muscle contraction). The more environmental-semantic information an entity encodes in its memory, the more “conscious” it is; consciousness is not binary.
Logic gates form if/then statements that, when assembled together, create a system of behavior that acts in somewhat logical ways. The biological neurons of the human brain form these as well.
Consciousness inherently requires at least some memory, input, and processing. Every neuron in the human brain is technically computable because it’s just input and output of electrical signals.
A nerve cell is effectively just an analog neuron with a few extra properties. Consciousness is not simply a bundle of nerve cells; it’s a very architecture-dependent bundle of if/then clauses and memory that, when combined, gives rise to consciousness.
If a system can be described by if/then, then it is computable.
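A minimal sketch of that claim, using made-up weights: a threshold “neuron” is just an if/then over weighted inputs, and composing a few of them yields ordinary logic gates.

```python
def neuron(inputs, weights, threshold):
    """A crude threshold neuron: fire (1) if the weighted input sum crosses the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def AND(a, b):
    return neuron([a, b], weights=[1, 1], threshold=2)

def OR(a, b):
    return neuron([a, b], weights=[1, 1], threshold=1)

def NOT(a):
    return neuron([a], weights=[-1], threshold=0)

# NAND = NOT(AND(...)) is universal, so any if/then system can be built from these pieces.
print([AND(1, 1), OR(0, 1), NOT(1)])  # [1, 1, 0]
```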
Also, if you cut a living brain in half, it ceases to be conscious, because the architecture becomes incoherent. When you are asleep (outside of REM/dreaming), you are also unconscious.
Regardless, all of this is to say: consciousness is computable through architecture, not simply through nerve cells. Biological human nerve cells are neither necessary nor sufficient for consciousness.