Submitted by TheOGCrackSniffer t3_10nacgd in singularity
ftc1234 t1_j691weh wrote
Reply to comment by Rogue_Moon_Boy in I don't see why AGI would help us by TheOGCrackSniffer
>But it’s a machine without feelings…
What are human feelings? They're early signals that tell a human that they have encountered, or may encounter, something beneficial or harmful to them. There is an evolving school of thought that consciousness is simply a survival mechanism, a neurological phenomenon.
I think OP has a valid point. Why would a self-aware system that is conditioned to survive (e.g., a robot that is trained not to fall off a cliff) prioritize some other human unless it is hardcoded to do so?
Rogue_Moon_Boy t1_j6cenjm wrote
Avoiding falling off a cliff is not the same as having survival instincts. It would just mean the system knows the rules of physics, looks at a cliff and the ground below, calculates the impact velocity, and sees that jumping down would harm it. It would be a specifically trained feature.
That's not the same as being self-aware or having "instincts". It's just one input to a neural net that carries a greater weight than everything else and says "don't do it, it's bad".
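To make that concrete, here's a minimal sketch of what such a heavily weighted "don't fall" signal could look like as reward shaping in a toy RL setup; the function, names, and magnitudes are invented for illustration, not taken from any real system:

```python
# Hypothetical reward shaping for a "don't fall off the cliff" behavior.
# Names and magnitudes are illustrative only.

def step_reward(progress: float, fell_off_cliff: bool) -> float:
    """Reward for one timestep of a toy navigation task."""
    FALL_PENALTY = -1000.0   # dominates every other term
    PROGRESS_WEIGHT = 1.0    # small reward for moving toward the goal

    reward = PROGRESS_WEIGHT * progress
    if fell_off_cliff:
        reward += FALL_PENALTY
    return reward

print(step_reward(progress=0.5, fell_off_cliff=False))  # 0.5
print(step_reward(progress=0.5, fell_off_cliff=True))   # -999.5
```

A policy trained against a signal like this "avoids cliffs" simply because the penalty term outweighs everything else, no self-awareness required.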
Instincts in a human are mostly guesstimates driven by irrational feelings, and we are actually really bad and inaccurate at them, e.g. stage fright, fear of rejection, the need to show off as a breeding ritual, and many other instincts that would be totally useless for a machine.
A machine like an AGI is the opposite of irrational; it's all about cold calculations and statistics. You'd have to deliberately train or code "instincts" into an AGI for it to be able to simulate them.
Sci-fi literature always tries to humanize AGI for dramatic purposes, and tries to portray it as that one thing that out of nowhere boooom -> is self-aware/conscious. In reality, it will be a very lengthy and deliberate process to reach that point, if we want to reach it in the first place. We have full control over what it learns or doesn't learn, and we can check, prevent, or clamp unwanted outputs of a neural net.
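As a rough illustration of what "check/prevent/clamp unwanted outputs" could mean in practice, here's a hypothetical output guard; the names, limits, and veto rule are all made up for the sake of the example:

```python
import numpy as np

# Hypothetical output guard: the network proposes an action, but it is
# clamped to actuator limits and can be vetoed by a hard rule before
# anything is executed. All names and limits are illustrative.

ACTION_LOW, ACTION_HIGH = -1.0, 1.0  # assumed actuator limits

def safe_action(raw_output: np.ndarray, near_cliff: bool) -> np.ndarray:
    action = np.clip(raw_output, ACTION_LOW, ACTION_HIGH)  # clamp the net's output
    if near_cliff:
        action[:] = 0.0  # hard veto: stop instead of trusting the net
    return action

print(safe_action(np.array([2.3, -0.4]), near_cliff=False))  # [ 1.  -0.4]
print(safe_action(np.array([2.3, -0.4]), near_cliff=True))   # [0. 0.]
```

The point is that the guard sits outside the learned model, so the model never gets to act on an output we've decided is unwanted.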
ftc1234 t1_j6dt7f5 wrote
Instincts aren’t irrational. They are temporal latent variables that are indicative of, or a premonition of, one possible future. Instincts are derived from past experiences that have trained your model. Current neural nets aren’t temporal, nor do they do online learning. But that will change.
You say instincts are irrational. Many people trust their instincts because those instincts are pretty accurate for them. If an instinct is irrational, that’s likely because the (human) neural model behind it is poorly trained.