
Imaginary_Passage431 t1_jdq7q4q wrote

We should ban the stupid people who come up with those ideas. In fact, I think it's much worse than that: we should fiercely get rid of them before they cause human extinction. AI shouldn't have rights!!

−5

SeneInSPAAACE t1_jdqk7vp wrote

Disagree; a sentient AI absolutely should have rights, based on what it cares about.

However, trying to apply human or animal rights to them is wrong. For example, even a sentient AI might be completely fine with being deleted, and trying to force it to survive would be immoral.

7

rixtil41 t1_jdsttgs wrote

A neutral AI should not have rights.

−1

SeneInSPAAACE t1_jdsu4uz wrote

Define "neutral".

3

rixtil41 t1_jdsu8yw wrote

It does not care about its existence and cannot feel pain or suffer in any way.

−1

SeneInSPAAACE t1_jdsv3j9 wrote

Perhaps. I mean, not caring still doesn't excuse all types of poor treatment, but certainly you wouldn't have to worry about causing it pain or suffering, or about ending its existence, and that allows for a lot of what would be called "abuse" for humans.

4

rixtil41 t1_jdsw23m wrote

I agree with treating it with care and respect regardless of whether it is sentient. You don't need your computer to have feelings to take care of it, but at least I won't have to worry about going to jail for making it do something it didn't like.

3

rixtil41 t1_jdswl58 wrote

A sentient AI would try to make a neutral AI.

1

SeneInSPAAACE t1_jdt15fr wrote

Possibly! I mean, even if you can make sentient, person-like AIs, that doesn't mean you should in cases where you can expect that to lead to ethical dilemmas.

2

rixtil41 t1_jdt91kj wrote

I think a good argument against an AI being sentient is this: if the AI always does what you want, it's not sentient, because sentient beings as a whole act selfishly.

1

SeneInSPAAACE t1_jdu8aae wrote

No, that's nonsense. Sentience just means you recognize there is a "you".

You may be thinking of something that has survival instincts, but micro-organisms have those.

1

SheoGodofMadness t1_jdqald3 wrote

The way I see it, an AI is much less likely to wipe itself out long term.

Suure, maybe it'll take us out on the way, but it's better that one form of consciousness exists to go on and explore the universe. The way we're going, we'll probably be back to Stone Age tribes fighting over nuclear craters within a century or two.

All hail our AI overlords

3

[deleted] t1_jdqej7r wrote

[deleted]

−1

SheoGodofMadness t1_jdqgrfm wrote

>What is the value of consciousness without an organic body

This seems like an EXTREMELY anthropocentric and narrow view of the universe. Why is our form of thought the only valid or meaningful one, to your mind?

An AI is still physical; it still exists within servers and such. It still has a connection to reality as we do, albeit in a different way.

Nobody says an AI has to be unfeeling, either. Depends on how it is designed.

Regardless, you seem very hung up on our specific form of consciousness and only assign value to that.

3

[deleted] t1_jdqj0ey wrote

[deleted]

−3

SheoGodofMadness t1_jdqjzo0 wrote

Extremely Reddit response; I'm impressed with the adherence to stereotype here. "Do you even understand ethics?" Lol, comedy.

Regardless, I simply don't preclude AI from possibly having some form of emotion. Maybe it won't. You certainly believe that it won't, from what you implied. I fail to understand how that assumption is any more valid than the reverse. What you're saying, frankly, doesn't make much sense. How can not assuming the form an AI mind will take be anthropocentric of me? You're just throwing around buzzwords at that point, without understanding what they mean.

Do YOU understand consciousness perfectly? Who are YOU to advocate for it lol? What gives you the higher insight that makes your opinion more valuable here? You seem to think the value of life lies in the body alone, which I certainly find perplexing. Like I said, an AI does have a physical presence in the world. It does not exist in another dimension.

Why does the human body alone grant meaning to life? Why do you even so closely assume emotion must be tied to the body? Somebody who is completely paralyzed and cannot interact with the world in that manner still has a full richness of mind that has value. Yes, the body and our specific physical being are often critical to our conceptions of the world.

However, I absolutely reject the notion that our specific form of consciousness is the only one which might hold any value. It's simply the only one that we know and understand. Like what, if we met an alien species that didn't think exactly like us, would you advocate that it be wiped out?

6

[deleted] t1_jdqkx0c wrote

[deleted]

0

Odd_Dimension_4069 OP t1_jee2cf4 wrote

Yeah sorry bro but your take is pretty garbo. Dude's only here saying some form of intelligence surviving our extinction is a good thing, and you sound like a lunatic going on about how that's not a good thing because they get their intelligence from electricity in silicon and metal, instead of from electricity in cells and fluids...

You are the one who sounds like a religious fanatic, with the way you sanctify human flesh. Personally, I value intelligence, in whatever form it may take. Whether that intelligence has emotions doesn't matter, but TECHNICALLY SPEAKING, we do not KNOW whether or not a non-biochemical intelligence can experience reality. And we have no idea what non-biological experience looks like.

It is not fanatical to withhold judgement for lack of evidence; it is fanatical to pass judgement because you feel your personal values and beliefs are the be-all and end-all. So stop that shit and get some awareness about you.

1

czk_21 t1_jdrre0p wrote

If it proves it is sentient and self-aware, it should definitely have some basic rights, just like other sentient life forms!

2