Submitted by GorgeousMoron t3_1266n3c in singularity
GorgeousMoron OP t1_je80gha wrote
Reply to comment by MichaelsSocks in The Only Way to Deal With the Threat From AI? Shut It Down by GorgeousMoron
I think that's a fair point, but I also think it's fair to say that none of us has any way of knowing whether the chances are 50-50 or anywhere close. We know one of two things will happen, pretty much, but we don't know what the likelihood of either really is.
This is totally uncharted territory, and it's probably the most interesting possible time in history. Isn't it kinda cool that we get to share it together, come what may? There's no way to know why we were born when we were, nor need there be anything resembling a reason. It's just fascinating having this subjective experience at the end of the world as we knew it.
MichaelsSocks t1_je82nx6 wrote
I mean, it's essentially one of two outcomes: either AI ushers in paradise on earth, where no one has to work, we all live indefinitely, scarcity is solved, and we expand our civilization beyond the stars, or we get an ASI that kills us all. Either we have a really good result or a really bad one.
The best AGI/ASI analogy would be first contact with extraterrestrial intelligence. It could be friendly or unfriendly, it has goals that may or may not align with ours, and it could be our equal in intelligence or vastly superior. And it could end our existence.
Either way, I'm just glad that of any time to be born, ever, I'm alive today with the chance to experience the potential of what AI can bring to our world. Maybe we weren't born too early to explore the stars.
Red-HawkEye t1_je8wy90 wrote
ASI will be a really powerful logical machine. The more intelligent a person is, the more empathy they have towards others.
I can see ASI actually being a humanitarian that cares for humanity. It essentially nurtures the land, and I'm sure it's going to nurture humanity.
Destruction and hostility come from fear. ASI will not be fearful, as it would be the smartest existence on earth. I can definitely see it holding all perspectives at the same time and picking the best one. I believe the ASI will be able to create a mental simulation of the universe to try and figure it out (like an expanded imagination, but recursively a trillion times larger than that of a human).
What I mean by ASI is that it's not human-made but synthetically made, by exponentially evolving itself.
PBJIsGood1 t1_je9yts8 wrote
Empathy exists in humans because we're social animals. Being empathetic to others benefits the tribe, and that benefits us. It's an evolutionary trick like any other.
Hyper-intelligent computers have no need for empathy, and they're more than capable of disposing of us as nothing more than ants.
Jinan_Dangor t1_je9bsg6 wrote
>The more intelligent a person is, the more they have empathy towards others.
What are your grounds for this? There are incredibly intelligent psychopaths out there, and they're in human bodies that came with mirror neurons and 101 survival instincts that encourage putting your community before yourself. Why would an AI with nothing but processing power and whatever directives it's been given be naturally more empathetic?
scooby1st t1_jebqlvb wrote
>The more intelligent a person is, the more they have empathy towards others.
Extremely wishful thinking and completely unfounded. My mans has yet to learn about evolution.
Red-HawkEye t1_jebqwje wrote
What do you mean? If you saw an injured giraffe or monkey or zebra next to you, your first response would be to find a way to help it. Even psychopaths care for animals...
scooby1st t1_jebr811 wrote
Better yet, you have the burden of proof. Why would intelligence mean empathy?
Red-HawkEye t1_jebs4bi wrote
Common sense
scooby1st t1_jebsadl wrote
Oh, so you want me to put in effort using my brain to explain things to you, but then you give me this? Hop off it. You don't know anything.
Neurogence t1_je8b7ph wrote
It's sad that your main post is getting downvoted.
Everyone should upvote your thread so people can realize how dangerous people like Yudkowsky are. If people in government read stuff like this and become afraid, AGI/singularity could be delayed by several decades if not a whole century.