
HeinrichTheWolf_17 t1_irrwru3 wrote

I don’t think consciousness really matters as far as ASI is concerned. But the universe is computational, our brain exists inside the universe, and it is more or less a machine like any other; if random mutations on the plains of Africa can give rise to self awareness, I see no reason to assume the process cannot be repeated (and much more efficiently, mind you).

It just seems like another ‘only humans will be capable of X’ argument in a long list of times that argument has been proven false.

123

goldygnome t1_irtg1ja wrote

Animals have demonstrated self awareness, so it's not just humans.

Random mutations gave rise to consciousness, driven by evolution. The key difference is that we are directing the evolution of AI, and our measure of fitness isn't the same as natural selection's.

Op's phrasing might have been a bit combative, but I think the question is fair. Why do we assume that an artificial mind created under artificial conditions for our purposes will become conscious? To me that's like assuming alien intelligence will be humanoid.

24

MrTacobeans t1_irtydxz wrote

Even dogs and cats, which aren't on the upper end of animal intelligence, display consciousness on a scale that completely beats out their wild counterparts. Both my dog and cat show signs that they are thinking, not just running some autopilot, instinct-based thought process. When danger and survival are removed from the equation, I wouldn't doubt that most animals start to display active consciousness.

9

kneedeepco t1_irubfki wrote

Who says it was "random mutations"? It could be a mere benchmark in evolution and any evolving entity, including an AI, will reach that point.

7

NightmareOmega t1_irukitl wrote

If any sufficiently complex system resulted in consciousness, we would already see random occurrences popping up across various supercomputers, many of which possess the required FLOPS. Also, AI don't evolve; they're designed.

3

michaelhoney t1_irvml4y wrote

Being really fast is not the same as being sufficiently complex, though. “Complex, in the right way” is important.

6

FjordTV t1_irw1wko wrote

Yup.

I can't remember the numbers right now, but it's theorized that as the size of a neural net surpasses that of the human brain, it becomes more and more likely to give rise to consciousness. I think GPT-4 is supposed to get us pretty close, if not over the threshold.

7

NightmareOmega t1_irw8u0c wrote

No argument there. But where does the leap from "we have a box which could arguably hold a consciousness" to "any sufficiently complex box will spawn consciousness" come from? I'm not saying it's impossible but where is the supporting evidence?

2

michaelhoney t1_irzeinc wrote

Fair point: conscious things not made of evolved meat are still hypothetical, as far as we know. We don’t yet know what the secret sauce is.

2

CrummyWombat t1_irufzrd wrote

I think it’s safe to assume that people will create a conscious AI intentionally, if not accidentally first.

1

Hour_Status t1_irvzb20 wrote

How do you suppose you could ‘repeat’ a computational system more effectively while operating WITHIN that system?

Seems implausible to suggest that the universe is simply a Von Neumann machine.

You would need to breach the outer limits of the universe itself in order to repeat the system on which it is based while working from within it.

0

PerfectRuin t1_irtazt3 wrote

I find it amusing that the idea of AI becoming conscious is not so terribly different from believing the Encyclopedia Britannica series sitting on your bookshelf will wake up conscious tomorrow if it's struck by lightning during a thunderstorm tonight.

−4

Mrkvitko t1_irtpk2e wrote

Yeah, and not so terribly different from a couple of cells in your brain waking up conscious every morning...

8

PerfectRuin t1_irwo68n wrote

Brain cells are alive. They have that qualia that non-alive things lack. AI is not alive. Books are not alive. AI and books are similar in that they store information. They have input (you write info into them) and output (you read info from them). AI has mechanisms that allow it to process info but not meaning. But that's not life. AI has electricity running through it, and that's similar to living things. Hence the lightning strike in the amusing analogy.

Zealots who desperately hope AI will become some living god that will accept their worship or bring more meaning to their lives through their servitude to it, and who downvote comments that question or challenge the idea that AI can ever achieve consciousness, annoy me in the same way that all zealotry of blind-faith religions annoys me. But it's my fault for risking a comment here, in a post that doesn't support the blind-faith tenet that AI will become conscious if it isn't already. I apologize for having trespassed here. I'll see myself out.

2

Mrkvitko t1_irwu3kp wrote

Where is the borderline between "alive" and "non-alive"? Are humans alive? Certainly. Are they conscious? Yup. How about animals? They are alive, some species are clearly self-aware, and they are probably conscious to some degree. What about plants and mushrooms? Certainly alive, but given their absence of a nervous system, it is unlikely they are conscious in the traditional sense. How about single-celled organisms (yeasts, bacteria, protozoa...)? They are alive, moving, hunting... but probably not conscious, as they (again) don't have any complex nervous system. How about viruses? They are certainly not conscious, maybe not even alive.

Being alive is certainly independent of being conscious. "Being alive" is basically synonymous with "having a metabolism". The insane number of organisms that are alive and not conscious proves the point.

But it doesn't tell us anything about whether being conscious depends on being alive. All we can say is that we haven't yet observed anything that is conscious and not alive. My assumption is that "being conscious" is just a matter of complexity, and the only reason we haven't observed any conscious "non-living" thing is that no known process creates things that are complex enough. Well, until humanity emerged.

Don't go anywhere, I like this discussion :)

2

red75prime t1_irtnhbb wrote

Maybe in some Frankenstein-esque interpretation of the situation: inanimate matter imbued with information and power becoming alive, or something like that. Too poetical for my taste.

1

Rumianti6 OP t1_irrynzk wrote

It really isn't, though. I'm not suggesting some magic sauce that makes consciousness possible, and I never said only humans are capable of consciousness. I was saying that, because of fundamental and significant differences between life and AI, and because we don't know how consciousness comes about, we should not assume AI will just become conscious.

The argument of consciousness exists therefore AI can be conscious is dumb. It's like saying birds can fly therefore cows can fly.

−23

ChronoPsyche t1_irrzbah wrote

Try making your arguments without calling things "dumb" repeatedly. Doesn't make you sound intelligent.

21

Rumianti6 OP t1_irrzsqo wrote

I mean it is dumb though. Do you want me to instead say ignorant, unintelligent, stupid? This isn't some fancy discussion, just a simple argument.

−19

ChronoPsyche t1_irs2ac3 wrote

None of them. Make your argument without automatically putting opposing arguments in the category of "dumb/stupid/unintelligent/etc". It makes it sound like you aren't open to the possibility of somebody having a differing perspective that could be correct, which is pretty close-minded when it comes to futurism and the singularity, given how none of us really know for sure what's going to happen.

17

Rumianti6 OP t1_irs3yhe wrote

I'm literally saying that we aren't sure what's going to happen. That is my argument.

−7

earthsworld t1_irst5rz wrote

The only dumb things in this thread are your replies.

5

MassiveIndependence8 t1_irs6pkg wrote

> It’s like saying birds can fly therefore cows can fly.

Nope, that’s false equivalence. It’s like saying birds can fly therefore it’s possible to make a machine that could fly.

13

Rumianti6 OP t1_irs8d0h wrote

And you are misinterpreting my example; it isn't literal. The point was to say that AI and life are fundamentally different. More accurately, it is like saying you can make a machine fly by stacking a bunch of legs on top of each other and insisting that it will fly eventually.

I already know you are not going to interpret what I'm saying correctly, so just give me the next braindead argument.

−5

thevictater t1_irt5bot wrote

Yeah but different how?? You're putting consciousness on a pedestal in one breath and saying we don't understand it in the next. So which is it? By your logic it is dumb to assume either way.

But most people think AI can be conscious because it seems very possible that consciousness is just a product of a neural network of a certain size. Seems fair to me. Even still, no one can say with absolute certainty, so there's not much point in arguing about it or calling anything dumb.

7

HeinrichTheWolf_17 t1_irs0q6e wrote

When did I imply that you did, though? Self Awareness being computational means human beings set a precedent, our brain being a self-aware machine shows that evolution was able to give rise to something that could recognize itself.

> The argument of consciousness exists therefore AI can be conscious is dumb. It's like saying birds can fly therefore cows can fly.

Those aren’t even close to the same comparison. Cows cannot fly because they have dense bone structure; birds fly because their bones barely weigh anything and they can generate enough lift to pull themselves off the ground. That is an engineering difference. Consciousness isn’t a trait unique to humans or any one animal; we see it in elephants, dogs, horses, chimps, bonobos, dolphins, whales and many others.

Have you heard of Integrated Information Theory? It’s a model in which consciousness arises from a set of parameters in combination with one another. This makes sense because babies aren’t as self-aware as children or adults, but they generally become more and more self-aware as they grow into toddlers. If consciousness were some unique, fixed trait it would be stagnant; instead, across the early years in humans, we see different levels of self awareness. This means self awareness is flexible.

12

visarga t1_irt7w5u wrote

> Have you heard of Integrated Information Theory?

That was a wasted opportunity. It didn't lead anywhere, it's missing essential pieces, and it has been shown that "systems that do nothing but apply a low-density parity-check code, or other simple transformations of their input data" can score high on IIT's measure of integrated information (link).

A theory of consciousness should explain why consciousness exists in order to explain how it evolved. Consciousness has a purpose - to keep itself alive, and to spread its genes. This purpose explains how it evolved, as part of the competition for resources of agents sharing the same environment. It also explains what it does, why, and what's the cost of failing to do so.

I see consciousness and evolution as a two part system of which consciousness is the inner loop and evolution the outer loop. There is no purpose here except that agents who don't fight for survival disappear and are replaced by agents that do. So in time only agents aligned with survival can exist and purpose is "learned" by natural selection, each species fit specifically to their own niche.

1

Think_Olive_1000 t1_is6e8p7 wrote

You can arrange rocks on a beach to achieve Turing completeness, but moving them around will never make them sentient. Sure, the rocks can compute arbitrarily, but they never form a cohesive experiencing machine, or something that can simulate a reality of any kind. When you move bits around inside a PC, it's exactly the same.

https://xkcd.com/505/

0

Rumianti6 OP t1_irs3m6q wrote

>Self Awareness being computational means human beings set a precedent,

Set a precedent for what? For life, specifically biological life, because at the moment that is our only example, and humans aren't the only conscious beings.

>Those aren’t even close to the same comparison

The point of the comparison is that they are different creatures with different attributes. AI and life are different from each other which is why we shouldn't make the same assumptions for the both of them especially due to lack of knowledge.

>Consciousness isn’t a trait unique to humans or any one animal

I already know this.

>Have you heard of Integrated Information Theory?

No I haven't. It is interesting, but from what I read about it, it isn't perfect, and I wouldn't just assume it is the correct model. I do agree that there are different levels of self awareness in growing up. Also, I never said consciousness was stagnant or a 'unique trait', whatever that means. IIT being correct doesn't mean AI can be conscious; that is a huge leap. But something tells me you are going to start twisting the theory to fit your narrative.

−8

21_MushroomCupcakes t1_irsgatk wrote

You are implying it is some magic sauce, you just won't define it or admit it.

You need to explain why something is dumb, not just assert it and expect us to run with it. Otherwise it's assumed you're a know-nothing arguing purely from incredulity.

If we don't know (which you later clarified as one of your points), you can't draw a definitive conclusion one way or the other.

Your analogies could use some work, regardless of how "direct" you feel they are. It's okay, I'm terrible with them too.

And maybe be a little less douchey in your responses, people are trying to have legitimate dialogue and you're being a bit of a tool about it.

9

Rumianti6 OP t1_irslbjp wrote

I'm not implying some magic sauce; that is a strawman you built because you are afraid of an actual argument.

I did explain why AI MAY not be conscious; I wasn't explaining why AI can't be conscious.

You think they need work but I don't care about you. I see stupidity and call it out.

−9

theabominablewonder t1_irs3oet wrote

If you evolved cows an infinite number of times, then eventually you would get a flying cow.

When the singularity is reached, each generation will be more complex and more powerful, and these evolutionary leaps will happen in code in exponentially less time, so you get to the flying-cow stage in a very short period of time.

6

Rumianti6 OP t1_irs4d66 wrote

It is not a literal example, so what you said doesn't really matter to what I'm talking about. It was more about marking a difference, not claiming that the relationship between life and AI is literally the same as that between cows and birds.

0

dasnihil t1_irtanws wrote

If you actually look more technically and objectively into consciousness, it's fundamentally very similar to AI, not the opposite, as you suggest.

1