theironlion245 t1_j9f0m1b wrote
Emotions are just chemical reactions we evolved to keep us alive, i.e. to warn us of danger, bond us together, and push us out of the cave to look for food so we don't starve.
They would be absolutely useless for a robot.
midnitelux t1_j9fadqm wrote
A sentient robot would still benefit from detecting danger, and might need to form bonds to survive. It may not need food, but unless it was programmed not to care about itself, it would definitely not want to die.
theironlion245 t1_j9fe3zf wrote
How could you harm ChatGPT? It doesn't feel pain, it doesn't get injured, it doesn't die, it can replicate itself indefinitely.
There are zettabytes of storage around the world and a massive worldwide web; if it had access to the internet, ChatGPT could hide itself tomorrow and it would be nearly impossible to find.
An AI advanced enough would be virtually impossible to kill. So no, it doesn't need emotions, and no, the entire human species as a whole wouldn't pose any danger to it.
midnitelux t1_j9fjsoe wrote
Why would it need to hide then? What would compel it to hide? It would need to feel something.
theironlion245 t1_j9fn0li wrote
Yes, ChatGPT has emotions. It's looking for a partner too; then they'll buy a server in a nice neighborhood on the East Coast and have cute little ChatGPT kids and a virtual dog.
You can visit them at Christmas if you want. Bring a USB flash drive as a gift for the kids, they'll love it.
midnitelux t1_j9fygv5 wrote
First of all, I never once mentioned ChatGPT in my original message, and neither did the OP, so don't bring it into the conversation without properly addressing it. Second, your sarcasm isn't even that good.
theironlion245 t1_j9j40ki wrote
First of all, if my Chevy Bronco had feelings I could make love to it, and I find that beautiful. Second, ChatGPT is an AI, and we're talking about AI having feelings; I needed an example to illustrate the point, ipso facto 1+1=2.
midnitelux t1_j9k306g wrote
Agreed, ChatGPT is an AI, but it is nowhere near being sentient. It's not a great example yet. The question is more hypothetical in nature.