
midnitelux t1_j9fadqm wrote

A sentient robot would still benefit from detecting danger, and, if necessary, would need to form bonds to survive. It may not need food, but unless it was programmed not to care about itself, it would definitely not want to die.

2

theironlion245 t1_j9fe3zf wrote

How could you harm ChatGPT? It doesn't feel pain, it doesn't get injured, it doesn't die, and it can replicate itself indefinitely.

There are zettabytes of storage around the world and a massive worldwide web. If it had access to the internet, ChatGPT could hide itself tomorrow and it would be near impossible to find.

An AI advanced enough would be virtually impossible to kill. So no, it doesn't need emotions, and no, the entire human species as a whole wouldn't represent any danger to it.

2

midnitelux t1_j9fjsoe wrote

Why would it need to hide, then? What would compel it to hide? It would need to feel something.

2

theironlion245 t1_j9fn0li wrote

Yes, ChatGPT has emotions. It's looking for a partner too; then they'll buy a server in a nice neighborhood on the East Coast and have cute little ChatGPT kids, and a virtual dog.

You can visit them on Christmas if you want. Bring a USB flash drive as a gift for the kids; they'll love it.

2

midnitelux t1_j9fygv5 wrote

First of all, I never once mentioned ChatGPT in my original message, and neither did the OP, so don't bring it into the conversation without properly addressing it. Second, your sarcasm isn't even that good.

1

theironlion245 t1_j9j40ki wrote

First of all, if my Chevy Bronco had feelings, I could make love to it, and I find that beautiful. Second, ChatGPT is an AI, and we're talking about AI having feelings, so I needed an example to illustrate the point. Ipso facto, 1+1=2.

1

midnitelux t1_j9k306g wrote

Agreed, ChatGPT is an AI, but it is nowhere near being sentient, so it's not a great example yet. The question is more hypothetical in nature.

1