Moral Judgment and Intuition


How most of us reach moral judgment is a matter of some debate.

There are two main views:

  1. Rationalist perspective (Kohlberg and followers): People (even children) arrive at moral judgments by engaging in conscious moral reasoning on the basis of rules they have learned through social interaction with their peers.  Intuitions and emotions may provide input and otherwise affect the outcome, but conscious moral reasoning is at the center of the process.  This view was popular from the 1970s to the 1990s.
  2. Intuitionist perspective (Haidt, Bargh, etc.): People (not only children) arrive at moral judgments on the basis of intuitions, which are often associated with emotions and whose automatic mechanisms are inaccessible to consciousness.  Conscious moral reasoning is causally ineffectual most of the time, as in many (most?) mental processes leading to evaluations, emotions, moods, and goals.

 NOTE: By and large, moral reasoning does not move us to action in the absence of moral emotions.  For example, psychopaths show a deficit in affective emotions (sympathy, love, grief) and moral emotions (guilt, shame) but display normal intelligence and no irrational or delusional thinking.  However, in real situations their moral performance is lacking, and they are often unable to separate immoral from merely convention-breaking behavior, a feat that three-year-old children can already achieve.  People with damage to the area of the brain behind the bridge of the nose display similar symptoms (acquired sociopathy).  So it seems that emotions are necessary to morality.



Although (1) seems natural, even obviously true, there is strong evidence for (2):


The intuitionist perspective, which has recently become the center of much discussion, can, but need not, be married with the view that the underlying machinery for moral intuition is innate, much like that for language.  On this view, sections of the brain are prewired to acquire morality automatically, much as they are prewired to acquire individual languages.  However, innatists disagree on how much moral content is culturally determined.




Moral reasonings are often post hoc constructions, produced after moral judgments based on intuition have already occurred; they are more like a lawyer defending a client than a judge seeking the truth.


This might explain the intractability of some moral disagreements (often those dealing with life-and-death issues), even at the philosophical level and even among members of the same culture.

However, intuitions do not compel us: sometimes we change our views on the basis of reasoning.  This seems to occur in three ways:

  1. Application of some general rule, e.g., a moral imperative or a cost-benefit analysis, to the case.
  2. Redescription of the case, which triggers a new intuition.
  3. Presentation of new evidence, typically by others.

Neurologically, this seems to happen by the activation of the anterior cingulate cortex, a part of the brain dealing with internal conflict, and of the dorsolateral prefrontal cortex, which deals with discursive thought. 



There are, then, two parallel processes in moral judgment: an intuitive one and a reasoning one.  Although moral intuitions, like all automated responses, may seriously misfire in novel situations, we could not function if we constantly engaged in conscious thought about our behavior, since conscious reasoning is slow, resource intensive, and resource depleting (the ego-depletion phenomenon).  We should instead train ourselves to use it to review our intuitions and to explain our judgments to ourselves and to other people.