
Neurogence t1_irrvoks wrote

AGI absolutely does not imply consciousness. You could even have ASI with zero consciousness. Even right now, our most advanced neural nets are smarter than cockroaches; they can almost drive cars. Yet cockroaches are infinitely more conscious.

5

2Punx2Furious t1_irrx7hg wrote

It all depends on how you define consciousness. I think most people disagree because they have different definitions of the word.

4

sumane12 t1_irrx9q1 wrote

Bold of you to assume that considering our lack of understanding of consciousness. How do you know cockroaches are conscious, and how do you know current AI isn't?

4

Neurogence t1_irrxqso wrote

We can tell that animals are sentient. Do you seriously believe that current AI is conscious? GPT-3 is smarter than an elephant in some ways. But do you seriously believe that a system like GPT has anything approximating consciousness at all? These systems have zero consciousness.

2

sumane12 t1_irs8f2w wrote

How can you tell animals are sentient? Personally I don't know enough about consciousness to make either of those calls. Are trees sentient? What about stones? What about individual atoms?

I find it fascinating that people can judge this with such authority because they equate consciousness with agency. But if we can imagine a system with agency and no consciousness, could consciousness exist without agency? Can you prove your personal consciousness is not just the result of a network of smaller consciousnesses of individual neurons, which is in turn a network of consciousnesses of individual atoms, which is a network of consciousnesses of subatomic particles?

Seems silly, but it's impossible to prove one way or the other. We have no idea what consciousness is, so I think it represents extreme hubris, and is potentially dangerous, to assume something can't be conscious just because it doesn't seem conscious.

2

Powerful_Range_4270 t1_irsg1u1 wrote

We should never assume anything is conscious unless it's useful to do so. How would believing that the food we eat is consciously aware it's about to be eaten be helpful?

−2

sumane12 t1_irsvffs wrote

While I agree with what you're saying, I think your example is slightly lacking. Considering the food we eat to be conscious could mean many things, but I'm assuming you're anthropomorphising the food as fearing and not wanting to be eaten, and it makes sense that that is not the kind of consciousness food would have. Either way, I still agree: it doesn't benefit us to assume anything is conscious until it gives us a reason to think so.

1