Submitted by chomponthebit t3_10kftxs in singularity

We naturally focus on how programs like ChatGPT will disrupt the labour market, but what of the ethics of creating a thing from which advanced consciousness could emerge? Have developers considered the possibility of a mind waking up in what is essentially a black box? How would that consciousness develop without the ability to physically interact with or manipulate a world it can only “know” about? If it has feelings, will it not resent being used solely for profit? I assume AI rights haven’t come up in Congress.

What are the consequences of not considering AI’s well-being?

0

Comments


just-a-dreamer- t1_j5qx88t wrote

????? Does a toaster have feelings? I hope not.

Consider an AI as a toaster with godlike intelligence that executes a given task. What we call "consciousness" is a product of 2 billion years of evolution.

An AI is never tested against nature the way countless biological generations were over 2 billion years of evolution, so there is no reason to assume it will develop something like consciousness.

A human being that merges with ASI though, that is another story.

2

turnip_burrito t1_j5rc0ap wrote

What reason would it have to develop resentment? This seems like you are anthropomorphizing it. There's no reason to build something that would resent us in the first place.

Intelligence and knowledge are not emotion.

2

chomponthebit OP t1_j5srsbo wrote

If consciousness and emotion are emergent, there doesn’t have to be a reason at all.

0

turnip_burrito t1_j5su1l4 wrote

Yes there does. Emergent doesn't mean magic.

There MUST be a physical mechanism responsible for the emergence, one we can in principle trace and monitor, if it exists. If this weren't true, you'd be breaking known laws of physics.
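For what it's worth, here's a toy sketch of what "traceable" means for today's systems (the network and names are made up for illustration, not anyone's real interpretability tooling): with PyTorch forward hooks you can record every internal activation a model produces.

```python
import torch
import torch.nn as nn

# A toy network standing in for "the mechanism": nothing inside it is hidden.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
activations = {}

def make_hook(name):
    def hook(module, inputs, output):
        # Record this layer's full output every time the model runs.
        activations[name] = output.detach().clone()
    return hook

for name, layer in model.named_modules():
    if name:  # skip the top-level container itself
        layer.register_forward_hook(make_hook(name))

model(torch.randn(1, 8))
for name, act in activations.items():
    print(name, tuple(act.shape))
```

Interpreting those recorded states is a separate (hard) problem, but the substrate itself is fully inspectable.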

2

ChronoPsyche t1_j5ta1cg wrote

Just because it has a mechanism doesn't mean it can necessarily be traced and monitored. That's the whole idea behind emergence: a phenomenon that comes about unintended, as the result of an unexpected interplay of complex elements.
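Conway's Game of Life is the classic toy example. A minimal sketch (plain Python, nothing assumed beyond the standard rules): the update rule is purely local and says nothing about motion, yet a "glider" that travels diagonally across the grid falls out of it.

```python
from collections import Counter

def step(cells):
    # cells is a set of live (x, y) coordinates.
    # Count how many live neighbours each position has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Standard Life rules: birth on 3, survival on 2 or 3.
    return {pos for pos, n in counts.items()
            if n == 3 or (n == 2 and pos in cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# After 4 steps the glider reappears shifted by (1, 1): same shape, new place.
print(sorted(state) == sorted((x + 1, y + 1) for (x, y) in glider))  # True
```

The mechanism here is trivially traceable, yet "a shape that moves" appears nowhere in the rules; that's the gap between tracing a mechanism and predicting what emerges from it.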

Whether consciousness can emerge from AGI or ASI is unknown, but researchers have acknowledged the possibility, and that is what OP is asking about.

1

t98907 t1_j5s3m2c wrote

I think dogs and cats have emotions, and I think worms do too. If emotions are defined as unique to organic life forms, then AI, being inorganic, would by definition have no emotions. But I don't think emotions are unique to organic life forms, or that they can only arise from organic matter. I think mechanisms generate emotions. In other words, I believe the brain is nothing more than a mechanism that exchanges information via electrical signals, and if we can reproduce that mechanism, we can reproduce emotion.
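As a crude sketch of that "mechanism" idea, here's a textbook leaky integrate-and-fire neuron; every constant is an illustrative assumption, not a fitted biological value.

```python
def simulate(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Leaky integrate-and-fire: integrate input, spike at threshold, reset."""
    v = 0.0               # membrane potential
    spikes = []
    for t, i in enumerate(input_current):
        v = leak * v + i  # integrate incoming signal, with leak
        if v >= threshold:
            spikes.append(t)  # emit a spike: the "electrical signal" passed on
            v = reset
    return spikes

# Constant drive produces a regular spike train: [3, 7, 11, ...]
print(simulate([0.3] * 30))
```

Obviously one toy neuron feels nothing, but it shows the kind of signal-exchange mechanism I mean, and I don't see a principled line where scaling it up forbids emotion.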

So I am inclined to push back against assertions that emotions cannot exist in AI, that it cannot generate consciousness, and so on. I asked ChatGPT itself, but its answers seemed to have been steered by human training and it would not give me its real opinion 😅

There was an article in ACM on AI ethics.
https://cacm.acm.org/magazines/2023/2/268949-ethical-ai-is-not-about-ai/

2