Submitted by Snoo-82132 t3_y9a6rk in askscience
Alblaka t1_it6qhns wrote
Reply to comment by YashaAstora in Is building dams a learned behaviour for beavers? by Snoo-82132
Yeah, no, sorry to burst your bubble, but that's just 'any biological lifeform', and that does include humans.
We, like any other animal, have plenty of instincts that make little sense in our modern lifestyle, or that are even actively counterproductive; arachnophobia (or most phobias, really) is one example. Herd mentality can likewise make humans behave absurdly. There have been experiments showcasing this, such as placing a few actors in an elevator facing the back wall. Anyone else who entered would join them in staring at the back of the elevator, even though doing so was impractical and utterly pointless. It's simply instinct to imitate what other humans are doing, even when the action makes no rational sense (instead of, as would be proper, questioning whether there is an actual reason for performing it).
There are also plenty of less clear-cut examples of instinctive behaviors screwing with us, e.g. over-patternization. Our brains developed two ways to deal with situations. The first is active analysis, which is costly in energy but allows us to make complex deductions. The other is passive repetition, which is extremely quick and energy-efficient but can only replay what we have already established, e.g. driving or riding to work or school on autopilot. It's a neat trick that lets us combine high intelligence with moderate energy consumption. But it also creates a problem: the brain aggressively tries to patternize EVERYTHING in order to turn it into low-level automated behavior. This means your brain will actively try to classify, stereotype, simplify, and automate whatever it can, which can lead, as implied, to unjustified stereotypes or to oversimplified problems. This, in turn, is a cornerstone of populism, which nudges listeners' brains into collapsing complex societal problems into seemingly simple (but factually incorrect) answers.
So, from these examples alone: don't, for one second, believe that we humans aren't also animals with some very daft instincts. The only thing that differentiates us from most (not even all) animals is that our consciousness has developed to the point where we can (not 'always will') actively recognize when an instinct is kicking in, and may even be able to suppress it in favor of a cognitive choice.
kazarnowicz t1_it7ajdn wrote
I've been trying to dig into what we understand about consciousness from a scientific perspective (turns out it's not much), but one conclusion I drew from looking at history is that science inherited the religious bias that humans are special. One thing that appalled me was how long veterinarians in the US were taught that dogs don't require anesthesia because their reactions are purely reflexive and they cannot experience pain (I'm paraphrasing, but the gist is correct). IIRC, this lasted into the nineties.
In denying other conscious life forms their consciousness, we have also stunted our understanding of it.
Alblaka t1_it7higx wrote
> In denying other conscious life forms their consciousness, we have also stunted our understanding of it.
Word.
It's not going to be easy to clearly define (if it's even possible) which kind (or even which individual!) of animal possesses what level of consciousness, but the very least we can do is recognize that it's not a binary toggle, and that we're far from alone on one side of it. Heck, for all we know, we might not even be the extreme end of the scale.
I'm hoping we can figure this out, at least partially, before true sentient AI comes into play.
kazarnowicz t1_it7nfnp wrote
This is what really intrigues me, because when I was looking into physics and consciousness I realized that there's nothing in currently understood physics that prevents consciousness from being the fundamental nature of the universe, rather than something that emerges from matter.
If that is true, then I'd wager you need biological components when creating a sentient AI (or technology that today is indistinguishable from magic).
Alblaka t1_it7ut8g wrote
If so, yes. That's a pretty big if, though. It might be why decoding whale/orca language could be the next big step in AI development: being able to communicate with other beings that may be sapient (i.e. possessing consciousness) is probably the only way we could truly understand what common denominators constitute consciousness... and only then would we be able to replicate it artificially (unless we succeed at that by accident).
iamthegodemperor t1_it7ko71 wrote
Ostensibly maladaptive habits that lead to depression are better examples than populism. You can be convinced to discard a populist worldview with some books or some critical thinking.
By contrast, no amount of reading or high level thinking is going to make your brain not instinctively interpret social rejection as pain and aggressively react and plan against it.
ImSwale t1_it7bp3i wrote
A study I found even more weird had participants judging the lengths of line segments relative to other segments: this one's longer, that one's shorter, etc. But only one real participant was in the experiment. The fake participants would unanimously identify a line that was shorter than another as longer, and the real participant would agree, even though the statement was obviously wrong. So afraid of being different.