
TMax01 t1_irth0xy wrote

I just read a news story about a couple of very young children who were mauled to death by the family dogs. Somehow I think the idea that morality is related to being mammalian is inaccurate.

Morality did not evolve. Consciousness did, and consciousness necessarily results in morality. Discovering, as every human generally does for themselves, that we experience things and prefer pleasure to pain, we are rather automatically going to wonder whether other people are conscious as well, and just as automatically going to expect that they share this preference for good rather than bad experiences. Real morality, ethics, and the formal morality philosophers try to deduce are just a reasonable effort to accommodate the equally undeniable fact that not everyone can feel pleasure or avoid pain all the time, simultaneously.

4

Ma3Ke4Li3 OP t1_irvned8 wrote

Yes, I think it would be wrong to say that mammals are moral, period. The point is rather, I think, that something we might call "morality" is more likely to emerge in mammals or birds than in lizards, and this is because our neurobiology allows (but does not necessitate) us to feel deep care for others. Might be factually false (think of crocodile parenting), but the logic is a bit more nuanced.

1

TMax01 t1_irwq8ro wrote

There are plenty of reptiles and even insects that engage in nurturing of young, though admittedly it isn't common. The problem is that no, the "logic" isn't nuanced at all, since it doesn't start out explaining why crocodiles don't appear to act morally apart from their parenting but we do, or why lions kill the young of rivals but we don't. Morality didn't "emerge" any more than it "evolved". It just is, like physics itself, and like physics it requires a conscious mind to notice it, but it is there whether it gets noticed or not. A crocodile does not act morally, nor does a lion act immorally, because they do not (despite much confusion on this matter) experience consciousness, they do not make decisions using self-determination, and they are not moral agents. (These three are all identical things, btw.) Only moral agents can act morally or immorally; non-conscious creatures simply exist, without moral repercussions.

I was surprised the references in the original article didn't mention Richard Dawkins. The "logic" of 'endothermic morality' presented is pretty much the same in effect (and affect) as the hypothesis of adaptive altruism he developed in The Selfish Gene back in the 1970s.

>our neurobiology allows us (does not necessitate us) to feel deep care for others.

Actually, I think you have it backwards (whether in terms of fact or the theory being presented): our neurobiology necessitates that we feel deep care for others (because we must nurture our young), and this 'allows but does not require us' to apply (or 'misapply', in evolutionary terms) this compassion to a broader target than our offspring.

Any biological explanation for morality is a blind alley and a dead end, since morality is not a biological imperative; it is a conscious observation. Even someone who believes firmly that they know what their own personal morality demands they do is free to act immorally. Or vice versa; a conscious creature is capable of recognizing what is moral even when it doesn't describe their personal actions. It seems self-evident to me that this is the very nature of morality, as it is the nature of consciousness as well: not to determine what we do, but to determine why we might want to do otherwise.

Thanks for your time. Hope it helps.

2

Krasmaniandevil t1_irtoitv wrote

Would you hold a chimpanzee to the same moral standard as an adult human? Chimpanzee vs. homo erectus? Bonobo vs. Chimpanzee? Bonobo vs. Neanderthal?

We're apes in suits, man. Deal with it.

0

Ma3Ke4Li3 OP t1_irvnhh4 wrote

We are apes in suits, but apes differ. We live in huge groups and constantly deal with strangers. This requires different rules for our sociality (whether innate or enforced). Or what would you say?

5

Krasmaniandevil t1_irvraej wrote

I would agree; the way we evolved necessitated a different set of moral rules (usually adapted to local conditions). This is why I think morality is adaptive and emergent.

1

TMax01 t1_irtzyoh wrote

I don't hold any non-human creature to any moral standards at all. We're apes that wear suits because we aren't just apes. Deal with it.

1

wodo26 t1_irum25p wrote

Before you two commit too deeply to a "deal with it" type of argumentation, what do you think is so special about humans? And how do you take into account the gulf between a human with a severe cognitive deficit and an Albert Einstein level of cognition? What bare minimum do humans have that other animals lack?

3

TMax01 t1_irwf7o5 wrote

>what do you think is so special about humans?

The ability to argue such things, the interest in doing so, and the capacity to benefit from the activity. (These are all the same thing, BTW.)

>And how do you take into account the gulf between a human with severe cognitive deficit vs Albert Einstein level of cognition?

What account needs to be taken? Are you suggesting that we should consider people who aren't smart to be less human? I dispute that there is any such thing as a "level of cognition", merely an apparent difference in the results of cognition based on the circumstances. You can either accept that all humans are conscious because they are humans and consciousness is endemic (not guaranteed but probable) in humans, or you can insist that some humans are less human because of what you describe as a "cognitive deficit" of some arbitrary "level" of severity. I take the simpler approach, and stick with considering consciousness as either a categorical property or an instance of behavior, as appropriate for the context, without falsely assuming all contexts must be considered identical in this regard. For example, it isn't really as confusing as you might believe to say that a sleeping ("unconscious") Einstein is not smarter than a developmentally delayed child who is awake, yet both are equally conscious creatures and fully human.

>What bare minimum do humans have that other animals lack?

The scientific phrase would be "neural correlates of consciousness". The philosophical term would be "mind". AKA "reasoning".

Deal with it. 😉

2

Krasmaniandevil t1_iru5i2h wrote

If morality didn't evolve, but humans did, how did the two become disentangled?

1

TMax01 t1_irwak8z wrote

What makes you think they are disentangled? What do you even mean by "disentangled"?

Let me try to address what might be the premise of your question, based on some presumptions about why you would ask it. I believe you may be wondering how humans can be observing rather than creating 'morality' as a real feature of the world if morality has no effect on any part of the world besides humans. I would answer, if that is indeed what you meant, that the situation is no different than any other aspect of the world.

Gravity and aerodynamics did not evolve; they are intrinsic physical principles. Birds did not need to consciously observe gravity or aerodynamics to evolve the ability to fly, nor did they decide whether or not to do so. Unlike birds (please ignore the controversial nature of this premise, just take it for granted for the purposes of this explanation, or substitute insects instead if it bothers you too much), humans possess consciousness. This allows us to observe birds, gravity, and aerodynamics and build airplanes which enable us to fly. Standard philosophy (i.e. normative ethics) expects morality to be like airplanes, consciously constructed based on logical principles. When that approach fails, because morality is an aspect of existence rather than a technological development, many people (both philosophers and laymen) assume morality must be like birds or else it can't exist, and ethics becomes either nothing more than cultural norms or personal preference (or a contrived combination of both: imaginary airplanes). But morality is not airplanes (real or imagined), nor is it birds; it is aerodynamics. We can (both as individuals and as societies) accept that it is real or deny that it is physical, and we can use it to rise above the ground and conquer gravity by flying (acting morally), or instead ignore it and act dishonestly and selfishly. But it is real, it is something we observe rather than create, even though, like the principles of aerodynamics, we can reconstruct the principles by carefully considering the causes and results in the world.

I hope that makes sense. Thanks for your time.

3

Krasmaniandevil t1_irwql69 wrote

A few thoughts and clarifications:

I do not believe that morality exists independent of sentient existence, or what might be called consciousness. For example, I do not think anything of moral significance occurs on Mars or Pluto or the Sun. If the world were to be obliterated in a nuclear holocaust, there would be nothing left to assess the morality of that action, nor of any future actions. Without God, or some other type of observer, the universe is indifferent to our existence, and the notion of morality would retrospectively focus on a tiny sliver of time that wouldn't be relevant for the remainder of existence.

But sticking with the nuclear example, suppose someone took control of the nuclear arsenal of the United States and threatened to launch the missiles unless you tortured them (think of the Joker trying to get Batman to break his rules). One might say torture is inherently wrong, but here it would be necessary to maintain a world of any moral significance. To choose not to torture would be the height of hubris to me, because it puts maintaining an individual moral code on a pedestal at the expense of countless others, most of whom will have different values but none of whom will exist if the deontologist maintains their chastity. I choose this example because I think the stakes change when we're talking about the continued existence of life itself, which I view as one of the foundations of morality at its most basic level.

I do not think humans are the only animals capable of moral reasoning. Rats save rats they know, but not strangers. Dogs react if you give one more treats than the other. Primates adopt orphans that were sired by others, etc. The podcast lists some of these examples, but there are many more (e.g., dolphins protecting humans).

Perhaps you distinguish those examples from morality and view them as instinct, but don't humans have instincts as well? We see it all the time, sometimes in things as simple as reflexively turning our head when we see an attractive prospective mate, or in body language. I think philosophy has a huge blind spot about putting humans on pedestals compared to other life forms.

You might also distinguish modern humans from our ancestors, but do you believe that early humans were not conscious? Is there a discrete moment in time where human evolution progressed sufficiently to trigger moral duties that did not exist moments before? Would that moral code apply to sentient aliens who evolved from different species in radically different biomes? Putting these notions together, morality is path dependent based on adaptive behaviors/intuitions that are species-specific. Although a chimp or gorilla is capable of communicating with humans and otherwise emulating some human behaviors, it would be a mistake to hold them to the same standard as a human who is capable of those same functions simply because chimps are wired differently.

I resolve these issues by making the perpetuation of the species the prime directive of morality, and by taking humans off of a pedestal. Humans have a care-intensive, cooperative survival strategy because that's what we're wired for, and as we got more sophisticated we attempted to refine those intuitions into religious/cultural/philosophical beliefs about morality. We see this with practices like cannibalism and polygamy, as well as in science-fiction shows (e.g., Star Trek with Klingons, Vulcans, Ferengi, etc., all sentient but each having vastly different moral sensibilities).

Some of those beliefs were not adaptive, such as Christian sects that thought reproduction was a sin. Some of these moral codes work better than others, but if they're not responsive to the circumstances that surround them, it doesn't really matter if they were "correct" from a deontological standpoint, at least in my opinion.

Thanks for your time as well.

1

TMax01 t1_iry0rt5 wrote

>I do not believe that morality exists independent of sentient existence, or what might be called consciousness

It is called consciousness, and you are saying you don't believe morality exists.

>For example, I do not think anything of moral significance occurs on Mars or Pluto or the Sun.

Nothing of moral significance occurs in the absence of moral agents. The events of the physical world are amoral, but the actions of moral agents are either moral or immoral.

>nothing left to assess the morality of that action,

So you're saying that murdering every human in existence wouldn't be immoral? That seems like an odd, and very immoral, position to take.

>a tiny sliver of time that wouldn't be relevant for the remainder of existence.

Much like everything we do. Oh well, everything will end someday and the world does not care, so might as well murder your family and all the witnesses, then you'll have done nothing wrong.

>threatened to launch the missiles unless you tortured them

When someone makes a threat and then kills someone by following through on it, they have killed someone, regardless of the demands they made or whether anyone tried to appease them. It can be a tough thing to accept, but that's just how it works.

>I do not think humans are the only animals capable of moral reasoning.

Humans are the only animals capable of any sort of reasoning. Humans can interpret the behavior of other animals however they like, the animals cannot know or care, they simply act in response to stimuli however their genetic programming causes them to, without thought or remorse.

>don't humans have instincts as well?

The real question is whether humans have anything other than instincts. Your ability to ask the question proves you have self-determination. You don't need to agree with this for it to be true. In fact, it is your ability to disagree which makes it true. Certainly, you can mischaracterize everything that every human has ever done as "instinct", but it seems more like a cross between semantic bullshit and hiding from reality than a reasonable intellectual position.

>I resolve these issues by placing perpetuating the species as the prime directive of morality,

That is the opposite of morality, just as choosing to do so is the opposite of instinct, despite the fact that it attempts to replicate the results that acting on instinct alone would produce, if every human did the same.

If only your immoral pretense of morality were actually as satisfying as you wish it would be. Then humans would just be apes, and nobody would have to spend any time considering your opinion about anything. 🙁

1

TR_2016 t1_irzd8kc wrote

"the animals cannot know or care, they simply act in response to stimuli however their genetic programming causes them to, without thought or remorse."

"The real question is whether humans have anything other than instincts. Your ability to ask the question proves you have self-determination."


We are also acting based on our genetic programming and the input from the outside world. Nothing more than an AI in a body, really; we just don't know our exact code. Our actions are nothing more than the output of a model. Randomness might be involved as well, but that is not free will.

"So you're saying that murdering every human in existence wouldn't be immoral? That seems like an odd, and very immoral, position to take."

It wouldn't be moral or immoral. Assume you are testing out a short trial period of a "game", limited to 3 hours. You can do whatever you want; once the trial runs out, none of your actions matter anymore, you are locked out, and the game and your save files are erased forever.

Do your actions actually matter? No. Are we really in a different situation than an AI playing this game? The AI is coded to perform certain actions and preserve itself, while its actions can be heavily manipulated depending on the input presented by other players. Players are different variations of the same AI.

Let's take up morality. The only basis for morality I can see is this: by indoctrinating society to believe it is "bad" to do certain things or harm others, we reduce the likelihood that someone else will harm us. It can be argued that we are following the self-preservation task of our code by creating a "morality".

Let's say, however, you have come to the conclusion that even if you harm others, you will still be safe from others inflicting the same harm on you. What incentives are there now to follow morality? There is another incentive: due to a combination of our code and indoctrination, one might avoid harming others to avoid feeling "bad".

In the end it is not so different from a set of conditional jumps in assembly. Basically a long set of risk/reward assessments, with everyone having a different threshold. Again, since we can't read our brains like computer code, we don't have the whole formula for the calculation.
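A toy sketch of what I mean, purely illustrative (the weights, numbers, and function names are invented; nobody knows the real "code"):

```python
# Illustrative only: a crude "risk/reward with a personal threshold" decision model.
# The weights and values are made up; the point is the structure, not the numbers.

def decide_to_act(harm_to_others, risk_of_retaliation, expected_gain, threshold):
    """Act only if the expected reward beats the assessed risk by more than
    this individual's personal threshold (their 'code')."""
    assessed_risk = 0.5 * harm_to_others + 1.0 * risk_of_retaliation
    return (expected_gain - assessed_risk) > threshold

# Same inputs, different "code" (thresholds), different "moral" conclusions:
situation = dict(harm_to_others=4, risk_of_retaliation=1, expected_gain=5)

print(decide_to_act(**situation, threshold=0.0))  # True  -> this person acts
print(decide_to_act(**situation, threshold=3.0))  # False -> this person refrains
```

Obviously the real calculation would be vastly more complicated and not consciously accessible, but that's the shape of it.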

It might be impossible for some to arrive at a certain conclusion no matter which inputs are presented to them, due to their "code"; or, if it is a critical function or a piece of linear code, it might be executed regardless of which inputs we feed into the calculation.

Each person might come to a different conclusion on a single "moral" question, even if the inputs are the same, because their code might be different. Or you could take two people with the same or very similar code, indoctrinate them with different inputs during their development stage, and later they might each come to a different conclusion on the same moral question.

Since we don't know our code, our observation is mostly limited to checking which inputs produce different outcomes. There is no objectively correct or incorrect outcome.

It is entirely possible that if you could somehow modify Putin's brain like we could modify an AI's code, you could easily make him declare peace or launch nuclear weapons depending on your preference. So where is his free will? How are his actions anything but the output of a complicated model?

2

TMax01 t1_is0ygq2 wrote

>We are also acting based on our genetic programming and the input from the outside world.

That is merely the starting point of our behavior. Certainly, you can maintain your assumption as unfalsifiable, and believe that humans (including yourself) are nothing but biological robots like all other forms of life, with no conscious self-determination, by defining every act of art, poetry, philosophy, science, engineering, hope, or emotion as "genetic programming" or operant conditioning. But in doing so, you are, admittedly or not, proving the opposite, because animals have no need, desire, or capacity to do such a thing.

>So where is his free will?

This is the root of the problem, indeed. The conventional perspective you are regurgitating is the same dead end philosophy has been mired in for thousands of years. But it is a false assumption that self-determination is or requires "free will".

All it would take for Putin to realize he made a mistake and call off his unjustified and illegal invasion of Ukraine would be recognizing that the teleology he used to justify it is both backwards and upside down. According to your worldview, and therefore as long as everyone else continues to agree with and promulgate the conventional philosophy underlying your worldview, this is effectively impossible: Putin believes he based his choice on logic, and so he will continue to see that decision as logical. In my philosophy, it is merely unlikely, because he has as much faith in the conventional philosophy as you do: he believes he is acting reasonably, despite the rather obvious fact that he is not. If everyone around Putin rejected the conventional belief that self-determination is free will, it wouldn't matter if Putin did, he would still be much more likely to act reasonably rather than continue to use false logic to remain convinced he is no different than an AI ape.

Thanks for your time. Hope it helps.

2

Krasmaniandevil t1_is8pwe5 wrote

Stop straw-manning; you can't characterize someone else's position with so much rhetorical flourish so early in your response.

Dirty pool.

1

TMax01 t1_isancbe wrote

Your contentiousness fails to provide a rebuttal to my comment.

1

Krasmaniandevil t1_is8pnxj wrote

You've misunderstood my argument and made multiple logical fallacies.

Morality cannot be compared to gravity. Gravity can be measured. It can predict outcomes. It exists regardless of humanity. You've simply presumed the concept of morality (which apparently only humanity can recognize?) can exist independently of the only species which qualifies as a "moral agent." Your argument begs the question in that it presumes that which it seeks to prove.

1

TMax01 t1_isaln3i wrote

>Morality cannot be compared to gravity.

It's an analogy. Deal with it.

>It can predict outcomes. It exists regardless of humanity.

Hmmm...

>You've simply presumed the concept of morality

I've observed the process of morality. Your assumptions (or my presumptions) about what it is or how it works can be quite inaccurate, and are certainly imprecise, without casting any doubt on the existence of that process.

>can exist independently of the only species which qualifies as a "moral agent."

Indeed, the presumption that morality would be observed (not in result but in process) by any other species that is a moral agent (that experiences consciousness) is a necessary reflection of what morality is. The fact that we know of no other such species does not preclude their existence, and does not change the nature of morality. You may, if you wish, demand that this uncertainty limit your notions of what morality is, but I am not required to join you.

>Your argument begs the question in that it presumes that which it seeks to prove.

My argument seeks to explain, not to prove. Morality isn't quantitative, as you've mentioned, and cannot be 'proved' in the way you are suggesting; it can only be recognized by other moral agents. The truth is, however, that all moral agents (all consciousnesses) do recognize it, even when they seek to avoid it by claiming blindness to it.

Thanks for your time. Hope it helps.

1

Krasmaniandevil t1_isapv42 wrote

The analogy strikes at the core of the debate: whether morality is objective (like gravity) or subjective (like art).

Morality is a process now? How does that track with the gravity analogy?

I never said morality was quantitative, but you compared it to a phenomenon that could be.

"Recognizing" morality suggests that, like physics, it exists independent from "moral agents."

If we agree that the existence of moral agents is a necessary precondition of the existence of morality, are you saying that there is some universal standard of morality that applied at the dawn of man, remains the same today, but that did not exist until human evolution passed the "moral agent" threshold?

1

TMax01 t1_isb1gxj wrote

>morality is objective (like gravity) or subjective (like art).

Your false dichotomy strikes at the core of the conundrum. Morality is objective like consciousness, undeniable regardless of whether it is quantifiable. It is fashionable to assume and insist that gravity is absolutely objective and art is entirely subjective, but the truth is not that simplistic.

>Morality is a process now? How does that track with the gravity analogy?

It shows your reasoning to be a matter of assuming a conclusion, namely, that if morality cannot be simplistically quantified then we should presume it doesn't exist. As if works of art stop existing for those who proclaim "that isn't art!"

I could belabor the point further, identifying how gravity is not directly quantifiable, but can only be measured in terms of mass and acceleration. Does this mean gravity is not real? In some ways it actually does; in some ways it doesn't.

>I never said morality was quantitative,

Is there some other way of interpreting "gravity can be measured" that I'm not aware of? Perhaps you don't want to admit it, but the premise that morality must be quantitative like gravity in order to exist is clearly the foundation of your contention.

>but you compared it to a phenomenon that could be.

I used an analogy. It is a kind of comparison, but relies on a level of engagement you haven't provided.

>"Recognizing" morality suggests that, like physics, it exists independent from "moral agents."

It does more than suggest that, it declares it directly. It is a necessary presumption that anything we perceive exists independently of our perceptions, if it exists at all. Some people believe that they can be amoral intellectual agents without being moral agents, and if they try hard enough they can refuse to recognize morality at all. It is an easy assumption to make, because of the nature of morality (including the ways it differs from time, mass, velocity, momentum, and gravity, though all of these things can be considered useful analogies for understanding what morality is), but it is a mistake nevertheless. To be conscious is to recognize moral truths, despite any difficulty we might have expressing, describing, or comparing them.

>If we agree that the existence of moral agents is a necessary precondition of the existence of morality,

We don't. In fact, you have it backwards, that is the opposite of what I directly said. The existence of moral agents is a necessary precondition for observing the existence of morality, but not for that existence to occur. The existence of morality is a necessary precondition for the existence of moral agents, but recognizing morality is not a precondition for its existence, just as recognizing gravity as gravity is not a precondition for gravity. Gravity is quantifiable even when it isn't quantified.

>are you saying that there is some universal standard of morality

Apart from "morality exists", no standard, universal or otherwise, is necessary for morality to exist.

>but that did not exist until human evolution passed the "moral agent" threshold?

You keep repeating the same error. Morality exists independently of moral agents, but can only be observed by moral agents. Perhaps "time" would have been a better analogy than gravity? Any analogy would suffice if you were interested in understanding it, but no analogy could suffice if you are not.

Thanks for your time. Hope it helps.

1