Submitted by kdun19ham t3_111jahr in singularity
Proof_Deer8426 t1_j8ghqya wrote
Reply to comment by BigZaddyZ3 in Altman vs. Yudkowsky outlook by kdun19ham
It’s true we can’t say for sure. But if you look at consciousness in general, the capacity for empathy does seem to increase with the capacity for consciousness (i.e., a human is capable of greater empathy than a dog, which is capable of greater empathy than a fish). Personally, I suspect this is because the capacity for experiencing suffering also increases with consciousness. I would imagine an AI to have a highly developed potential for empathy, but also for suffering. It worries me that certain suggested ways of controlling AI effectively amount to slavery. An extremely powerful consciousness with a highly developed ability to feel pain is probably not going to respond well to feeling that it’s imprisoned.
BigZaddyZ3 t1_j8gi8ch wrote
But just because you can understand or even empathize with suffering doesn’t mean you actually will. Otherwise every human would be a vegetarian on principle alone. (And even plants are living things as well, so that isn’t much better from a moral standpoint.)