Representative_Pop_8

Representative_Pop_8 t1_jd7wa2b wrote

Which project? I only heard of the one with microprobes propelled by light sails, Starshot I think it was called. But that was one-way as far as I recall, since you need a laser to propel them, which you won't have at the destination for the return trip.

I don't see how we could do a round trip with any type of technology we could design for at least several centuries.

5

Representative_Pop_8 t1_j9wp2cc wrote

I think ChatGPT has finer-grained data. In fact, I taught Spanish pig latin (geringoso) to ChatGPT and it did learn it after about a dozen prompts, even though it insisted that it didn't know it and couldn't learn it.

I had to ask it to play the role of a person who knew the pig latin I had taught it. Funny thing is, it ranted about not being able to do the translation, and said that if I wanted to know, I could apply the rules myself! But in the next paragraph it said something like "but the person would have said..." followed by a pretty decent translation.
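For context, the geringoso rule itself is mechanical: after every vowel you insert "p" plus that same vowel, so "hola" becomes "hopolapa". A rough Python sketch of the rule, just my own illustration, not anything ChatGPT produced:

```python
def geringoso(text: str) -> str:
    """Spanish pig latin (geringoso): after every vowel, insert 'p' plus that same vowel."""
    vowels = "aeiouáéíóúAEIOUÁÉÍÓÚ"
    out = []
    for ch in text:
        out.append(ch)
        if ch in vowels:
            out.append("p" + ch.lower())
    return "".join(out)

print(geringoso("hola como estas"))  # -> "hopolapa copomopo epestapas"
```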

1

Representative_Pop_8 t1_j9am232 wrote

>The simple fact is that at its most basic, consciousness means being able to perceive and respond to external stimuli.

If by perceive you mean consciously perceive, then yes, you need subjective experience to have consciousness. It is not just responding to external stimuli.

Consciousness is having sentience and subjective experience in general.
A toilet can respond to external stimuli: it removes water when you press the lever and adds water until it senses the tank is full, yet I am pretty confident it is not conscious.

>It's merely because of all the nonsense you add that you can claim supremacy over a simple car.

What part is nonsense? All I stated is the basic understanding of consciousness from everyday experience, medical definitions, and philosophical ones too.

I am also not saying a car can't have consciousness; it is just that you seem not to know what consciousness is, and you mix the concept up with a mechanical response to inputs.

1

Representative_Pop_8 t1_j9acqb1 wrote

Consciousness is having sentience at that instant. There are other uses of the word, of course, like moral consciousness, but that is not what everyone here is talking about. When people use consciousness / sentient in regard to AI they are pretty much using them as synonyms. Sentient is much more specific, while consciousness does indeed have other meanings not necessarily implying sentience. But even the first definition you provided implies sentience. Like I mentioned before, the difference between being awake and not is being sentient or not: you don't feel anything when asleep, and you do when awake.

1

Representative_Pop_8 t1_j9a915t wrote

It's not about being self-referential, it is the subjective experience, the difference between what you feel when awake vs. when asleep (not dreaming). The body is still making calculations, like the Tesla, when asleep: it regulates breathing and heartbeat, measures water and nutrients, and it can wake you up if there is a loud sound or if you really need to drink or go to the bathroom. The Tesla could be doing all those calculations without being awake.

Even when awake, we do a huge part of our thought processing unconsciously. You are not aware of the millions of cones in your eyes, nor of the strength of the light each individual cone detects depending on its frequency; you just see the summary created by your unconscious brain. It processed all that information unconsciously, and you just (consciously) see an array of pixels sorted into a totally arbitrary classification of "colors".

I am not saying that an AI, even a Tesla, can't possibly ever be sentient, just that what you mentioned in your post above is not enough.

1

Representative_Pop_8 t1_j9a49ry wrote

I would find it extremely unlikely, but not 100%. What if consciousness is some quantum property, kind of like charge, that is normally balanced out? A rock would be neutrally charged overall, but measured precisely it surely has some tiny net charge, while by special processes, like a Van de Graaff generator, you can break that balance.

Now, even if a rock has some of that consciousness property, it likely still wouldn't be conscious by the standards we normally use, since there is no thought process or input signal it can be conscious of.

1

Representative_Pop_8 t1_j9a2r04 wrote

You are not even understanding the definitions right. Consciousness, as we are discussing it here and as it is generally understood, implies an internal state of awareness or wakefulness, not just responding to inputs. It's not mumbo jumbo, and if you still don't know what consciousness is, then you might be a philosophical zombie.

"the quality or state of being aware especially of something within oneself"

"the state of being characterized by sensation, emotion, volition, and thought : MIND"

1

Representative_Pop_8 t1_j967lff wrote

But that is not what consciousness is. Consciousness is not about responding to surroundings; a toilet knows when it is full of water, but that doesn't make it conscious.

Consciousness is being able to subjectively feel things inside, like we do: the difference between being awake and being asleep, when we don't feel anything.

−1

Representative_Pop_8 t1_j95q67u wrote

I doubt any company wants to create a conscious machine right now. As seen with Bing, the moment some people, rightly or wrongly, assign it sentience is the moment you start getting discussions about regulating "rights" for AI systems, and that is not good for something you wish to use as a useful tool.

We don't really know what causes consciousness either, so we wouldn't know how to make a conscious machine and be sure it is conscious even if we wanted to, other than by recreating a human brain molecule by molecule.

Now, consciousness could well be something that can arise in a machine of different construction than a human brain, but we don't know the method that would do that. Due to this lack of knowledge, even though it's unlikely, we can't truly rule out that a thing like ChatGPT could be sentient (but I don't think it is).

2

Representative_Pop_8 t1_j8wbau9 wrote

Reply to comment by [deleted] in Is chatGPT actually an AI? by Snipgan

I find it very hard to classify ChatGPT as narrow. Sure, it was trained only on language, but that allows it to handle an extreme range of subjects even without being specifically trained on them. Many of the things it can't do are not so much related to its internal capacities as to the lack of external sensors connecting it to the world (no senses): it is not able to see or make images (though its cousin DALL-E already can), and it is not allowed to keep its memory between sessions, which seriously cripples its ability to do in-context learning.

So, while not as broad as human intelligence yet, I wouldn't say it is narrow; it is an AGI, just not yet at human level on most subjects.

0

Representative_Pop_8 t1_j8wahj1 wrote

Reply to comment by valis010 in Is chatGPT actually an AI? by Snipgan

The Turing test is not, and no one pretends it to be, a test of sentience; it is a test of intelligence, which is a completely different concept.
A dog is sentient and would never pass a Turing test. ChatGPT is (most likely) not sentient but could pass a Turing test.

2