skraddleboop t1_j9wbupc wrote

I don't get it. Why would I care if something is going around looking/acting like me when I'm gone? Why would I care to interact with a digital imposter of a loved one? Genuinely curious about the other perspective.

8

RaccoonProcedureCall t1_j9xmk3o wrote

I find it difficult to identify precisely what I dislike about the idea of a digital simulacrum of me in some way taking my place after my death, so I can’t offer much help with that if you don’t see any reasons why it could be objectionable. Nevertheless, I think most people agree that certain wishes of a deceased person ought to be respected even if the deceased person is no longer around to care (e.g., whether one wants to be buried, cremated, etc.), and I would hope that could extend to this issue.

As far as why one might want to interact with the simulation—I think that’s much easier to see, though specifics would depend on how far the technology goes. On the simpler end, a basic chatbot that simulates the deceased’s voice might at least be comforting to someone grieving. I know people who say they would like to use similar technology to have one last chance to talk to someone they loved, even if they knew it was fake. On the more sophisticated (and much more hypothetical) end, I suppose such a simulation could allow some bereaved to function almost as though their loved one never died. Hopefully it’s easy to see why someone might want to live their life as though their dead friends or family were still living.

1

stefanica t1_j9xxv19 wrote

I'd love to hear some of my grandparents' stories and my grandmother's recipes again. I'd really love for my kids to. They passed not long after my younger children were born.

1

skraddleboop t1_j9y55tb wrote

But that would be an argument for making sound recordings of people while they are alive, not so much for trying to create an AI version of them complete with deepfake voice capabilities.

4