
Shiningc t1_je1i77y wrote

It's not even as smart as a toddler, as it doesn't have sentience or a mind. If it were a general intelligence, then it should be capable of having sentience or a mind.

1

skztr t1_je1s30l wrote

I am not familiar with any definition of intelligence for which sentience is a prerequisite. That's why we have a completely separate word, sentience, for that sort of thing. I agree that it doesn't have sentience, though I admit that belief rests on unfounded philosophical reasons and guesses.

1

Shiningc t1_je1sbqg wrote

AGI is a general intelligence, which means that it's capable of any kind of intelligence. Sentience is obviously a kind of intelligence, even though it happens automatically for us.

1

skztr t1_je1t60r wrote

I would very firmly disagree that sentience is a kind of intelligence.

I would also very firmly disagree with your definition of "general" intelligence: by that definition humans are not generally intelligent, since there are some forms of intelligence they are not capable of (and indeed, some which humans are not capable of but GPT-4 is).

Sentient life is a kind of intelligent life, but that doesn't mean that sentience is a type of intelligence.

Do you perhaps mean what I might phrase as "autonomous agency"?

(for what it's worth: I was not claiming that GPT is an AGI in this post, only that it has more capability)

1

Shiningc t1_je1tmp0 wrote

Humans are capable of any kind of intelligence. It's only a matter of knowing how.

We should ask: are there kinds of intelligent tasks that are not possible without sentience? I would guess that something like creativity is not possible without sentience. Self-recognition is also not possible without sentience.

1

skztr t1_je20sfk wrote

"Creativity" is only impossible without sentience if you define creativity in a way that requires it. If you define creativity as the ability to interpret and recombine information in a novel, never-before-seen way, then ChatGPT can already do that. We can argue about whether or not it's any good at it, but you definitely can't say it's incapable of being at least as novel in its outputs as a college student.

Self-recognition again only requires sentience if you define recognition in a way that requires it. The most basic form, "detecting that what is being seen is a representation of the thing which is doing the detecting", is definitely possible through pure mechanical intelligence without requiring a subjective experience. The extension of "because of the realisation that the thing being seen is a representation of the thing which is doing the detecting, realising that new information can be inferred about the thing which is doing the detecting" is, I assume, what you're getting at ("the dot test", "the mirror test", "the mark test"). This is understood to be a test for self-awareness, which is not the same thing as sentience, though it is often seen as a potential indicator for sentience.

I freely admit that in my attempts to form a sort of "mirror test" for ChatGPT, it was not able to correct for the "mark" I had left on it. (Though I will say the test was somewhat unfair given the way ChatGPT tokenizes text, that isn't a strong enough excuse to dismiss the result entirely.)
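For what it's worth, the tokenization issue can be sketched with a toy longest-match tokenizer (the vocabulary here is invented, and real BPE is more involved): once the text is chunked into multi-character pieces, a single-character "mark" isn't something the model directly sees.

```python
def tokenize(text, vocab):
    """Greedy longest-prefix matching: a crude stand-in for BPE-style
    subword tokenization. The model operates on the resulting pieces,
    not on individual characters."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible piece first, shrinking until a match.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # No vocabulary entry matched: fall back to a single character.
            tokens.append(text[i])
            i += 1
    return tokens

# Invented vocabulary for illustration only.
vocab = {"mirror", "mir", "ror", "test"}
print(tokenize("mirrortest", vocab))  # -> ['mirror', 'test']
```

The point is just that character-level edits to a word can land inside one opaque token, which makes character-level "mirror tests" harder for the model than they look.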

1

Shiningc t1_je235un wrote

Creativity is by definition something that is unpredictable. A new innovation is creativity. A new scientific discovery is creativity. A new avant-garde art or a new fashion style is creativity.

ChatGPT may be able to randomly recombine things, but how would it know that what it has created is "good" or "bad"? That would require a subjective experience.

Either way, if an AGI is capable of any kind of "computation", then it must be capable of any kind of programming, which must include sentience, because sentience is a kind of programming. It's also pretty doubtful that we could achieve human-level intelligence, which must include things like the ability to come up with morality or philosophy, without sentience or a subjective experience.

1

skztr t1_je3n60r wrote

I'm not sure what you mean regarding creativity. ChatGPT only generates outputs that it considers "good outputs", by the nature of how it was trained: each word is chosen because it is estimated to have the highest probability of triggering the reward function, which is the definition of "good" in this context.
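A minimal sketch of that selection step (the vocabulary and scores are made up, and real models sample rather than always taking the top token): the model assigns every candidate token a score, converts scores to probabilities, and emits a "good" token by that internal measure.

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution over tokens."""
    m = max(logits.values())  # subtract max for numerical stability
    exps = {tok: math.exp(s - m) for tok, s in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

def greedy_next_token(logits):
    """Pick the token the model scores highest -- its notion of 'good'."""
    probs = softmax(logits)
    return max(probs, key=probs.get)

# Hypothetical scores for continuing "The cat sat on the".
logits = {"mat": 4.1, "dog": 1.3, "moon": 0.2}
print(greedy_next_token(logits))  # -> mat
```

So "good" here is purely "highly scored by the trained model", with no claim of subjective judgement.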

Your flat assertion that "sentience is a kind of programming" is going to need to be backed up by something. My understanding is that sentience refers to possessing the capacity for subjective experience, which is entirely separate from intelligence (e.g., the "Mary's room" argument).

1

Shiningc t1_je4crsl wrote

Sentience is about analyzing things that are happening around you, or perhaps within you, which must be a sort of intelligence, even though it happens unconsciously.

1