Ethics and Decision Making

The last chapter introduced you to several schools of ethical thought. These time-tested systems provide powerful frameworks for analyzing big problems, but they seemingly miss a crucial component of ethics: the people making the decision. What factors influence how we make ethical decisions? Are the psychological underpinnings of decision making definable? While myriad elements make up the psychology of decision making, this chapter will focus on some of the major psychological elements that influence how people make, or fail to make, ethical decisions.

Awareness

Before a person can make any sort of ethical decision, they must first be aware that they are dealing with an ethical problem. David Foster Wallace told the following story as part of Kenyon College’s commencement in 2005.

“There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says ‘Morning, boys. How’s the water?’ And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes ‘What the hell is water?’”

Far too often people exhibit the young fishes’ lack of awareness when it comes to ethical issues. We are surrounded by countless ethical decisions that need to be made every day, yet we often fail to recognize them for what they are. Because we do not perceive the ethical elements in our decisions, we often fail to bring ethical decision making to bear. Understanding why we do not always act in what we or others regard as an ethical fashion requires an understanding of the decision-making process itself.

Ethics and Decision Making

Every day we face countless choices and decisions. From choosing what to eat for breakfast, to what to wear, to what we read online, to the route we take to work, each of us is bombarded with constant information and countless choices that must be dealt with on a daily basis. To reduce the information processing and the number of mental calculations we must perform on any given day, we rely on cognitive biases and shortcuts (heuristics) that make our decision making more efficient. Think of them as part of our brain’s self-defense mechanism, keeping us from becoming overwhelmed by all the minutiae of everyday life.

While these shortcuts can be helpful, they can also be problematic, especially if they blind us to errors in our ethical decision making or limit our awareness of ethical dilemmas or situations themselves. They are part of the explanation for why seemingly “good” people do bad things.

Scripts

For example, one reason people often fail to recognize the ethical elements of their actions is that most people live their days following behavioral scripts, almost as if on autopilot. By this, we mean that they are responding automatically to situations, without really thinking about what they are doing. According to behavioral psychology, a script is a series of behaviors, actions, and consequences that become routine or expected in a particular situation or environment.

After driving for a year or two, how many of you think about whether or not you should stop at a stop sign? Most of us stop without really thinking about it, even when there is no one around. What about the path you take home? Do you consciously think about the best way to get home, or do you just drive the same path you always do? These are examples of scripts that we all create and follow. They are stored in our memory and are evoked when we encounter routine or repetitive situations. Scripts simplify things and free up our brain’s decision-making resources to think about and decide issues that require more mental energy.

But there is a downside to scripts. While scripts are normally helpful, they may reduce our awareness of ethical dilemmas or blind us to potentially dangerous moral situations. If we are on autopilot, we are not fully engaged with the environment around us, which reduces our awareness of important changes in the situation that might pose ethical problems. It may also make us less empathetic to the needs or feelings of others. Scripts may even lead us to avoid taking responsibility for our actions, because we do not feel we were consciously engaged when making scripted decisions.

When our brain relies upon scripts or heuristics (which are essentially mental rules of thumb), our thinking and decision making are fast, intuitive, and automatic. Nobel Prize winner Daniel Kahneman calls this System 1 thinking. System 2, on the other hand, is our brain in its slower, deliberative, and thoughtful mode, which requires more mental energy. Both are key to decision making and help us understand why our choices and behavior may not always seem “rational” or may deviate from what we see as our ethical self.

To help you become more familiar with these heuristics and psychological constructs related to ethical decision making and behavior, we have included video links within each chapter of this online book that provide examples of how these work in real life.

Moral Intuition

While most psychologists argue that awareness of the errors and biases in our decision-making processes can increase our ability to make more reasoned and ethical choices, others have raised doubts. Using insights from Kahneman, psychologist Jonathan Haidt and colleagues have developed a moral foundations theory. According to the theory, most moral judgments, decisions, and actions are quick, intuitive, and tied to emotions. Moral reasoning and justification, in contrast, are more deliberative and conscious. Haidt uses the metaphor of an elephant and a rider to illustrate this process. The elephant represents our emotional, automatic, and instinctive self, which does not always behave or act rationally. Conversely, the rider is our more controlled, analytical, and rational self. In terms of their relationship, “elephants rule although they are sometimes open to persuasion by the riders.”

Haidt’s work attempts to understand how morality, and the ethical frameworks that often accompany moral foundations, differ across and within cultures. His theory argues that individuals innately vary in their intuitive, emotional responses to situations (referred to as System 1). They then justify and rationalize these moral judgments and responses after the fact (using their System 2 analysis), based on ideological or religious predispositions that are shaped and reinforced by their social and cultural environment. In other words, cultures reinforce and inform intuitive moral judgments, and these real differences explain the pluralistic approaches to moral and ethical issues across and within societies.

Ethics and Personality

While Haidt’s work in evolutionary psychology is relatively new and not without its critics, the research linking personality characteristics and our ethical choices and behavior is fairly deep.

Locus of control

A key personality concept that explains how we view our relationship towards our environment is locus of control. Famed psychologist Philip Zimbardo defines locus of control as "… a belief about whether the outcomes of our actions are contingent on what we do (internal control orientation) or on events outside our personal control (external control orientation)."

This means people with an internal locus of control believe that their actions and decisions impact the world and others around them, while those with an external locus of control often feel as though the world and others impact them. Think about people in your own life. Think of the person who gets a scholarship, makes the team, does well on an exam, or gets a promotion at work. A person with an internal locus of control would attribute each of these achievements to their hard work and perseverance. Someone with an external locus of control would claim they were just lucky or received those good grades and promotions because of the help of others or because the system was set up unfairly to benefit them.

We are not always consistent in our locus of control. You will often hear somebody say, “Professor X gave me an F,” but then also say, “I earned an A in Professor Y’s class.” The reality is that locus of control is not a binary designation but more of a continuum. It is common for people to believe that they are responsible for the good things in life while fate forces bad things on them.

Knowing your locus of control can be difficult. Most people like to think they have an internal locus of control. Over decades of teaching, the authors have noticed that most students (easily 80% or more) self-report a highly internal locus of control. This is clearly a trait many people want to see in themselves, as it indicates some level of self-empowerment. But the reality is that many people have more of an external locus of control. It is important to note that locus of control is part of an individual’s personality and is unlikely to change drastically or quickly. When people do change in this area, it usually takes extended time and effort.

Individual Cognitive Moral Development

Central to any ethical decision-making process is the developmental stage of the person making the decisions. A five-year-old child has a far different mental capability for making ethical decisions than a forty-year-old adult (at least in most cases).

Kohlberg’s Theory of Moral Development

In the mid-1950s, Lawrence Kohlberg began research to develop a theory of moral development. He wanted to see how people developed their moral understanding and motivations. He began his research with a group of American boys aged 10 to 16 and studied their moral development over a twenty-year period. From this research he created a hierarchy of moral reasoning that people potentially develop throughout their lives. This original research has been exhaustively reviewed and is a central part of the human resources literature businesses use in assessing the ethical potential of employees.

Kohlberg’s hierarchy identifies three levels of moral development, each containing two stages. These levels represent the progression of moral development that people can go through in life. Progression through these levels is triggered in a number of ways, though moral education and cognitive dissonance seem to be the two most effective means of development.

Moral education means a person focuses on moral issues and thinks through moral problems. This kind of mental exercise sensitizes a person to moral issues and creates greater empathy with the various parties involved in moral conundrums. This can happen in a number of ways, though research demonstrates that structured moral training, especially in higher education, helps students become more aware of ethical problems and better prepared to respond to them when they occur.

Cognitive dissonance occurs when a person is confronted with something that does not fit into or coincide with previously held understandings or worldviews. This is usually due to exposure to new experiences or learning new things. The tension between the new experience and old understandings is often uncomfortable but can lead to personal growth. While age often coincides with these new experiences and the development that follows, some people have a variety of challenging experiences early in life that cause them to progress when they are much younger. Others may live a sheltered life where they are seldom exposed to new ideas or experiences, which can limit their development.

Kohlberg’s three levels are referred to as pre-conventional, conventional, and post-conventional. Below are the levels and the characteristics of each.

Pre-Conventional Level

The pre-conventional level is the most basic level of moral reasoning. It is most often associated with children; at this level a person tends to focus on ethics only to the extent that it impacts them.

At stage one of the pre-conventional level, a person makes ethical choices solely to avoid punishment. A person operating at this level will avoid stealing something, not because they know it is wrong or will hurt someone else, but because they know they will be punished if they do. Right and wrong are determined almost solely by what will get a person punished.

Stage two moral reasoning focuses not only on the punishment that will result but also on the rewards a person receives from engaging in moral behavior. Right or wrong now becomes dependent on whether the person is praised or receives some other reward for the activity.

Stages one and two seem rather Pavlovian, ultimately relying on operant conditioning to make sure people behave in ways others want them to. These stages are most often seen in children (usually nine or ten and younger), but sadly some adults never develop beyond them.

Conventional Level

This is the level most adolescents and adults achieve. At this level a person internalizes moral issues and can think about ethics more abstractly.

At stage three a person looks to their immediate family and friend group for acceptance and validation. This community creates the norms that an individual wants to abide by in order to be seen as a good person.

People progress to stage four when they see beyond their immediate community to the broader rules of society and recognize that these rules are beneficial. At this stage people will follow the laws and rules of society in an effort to be good citizens.

Post-Conventional Level

The post-conventional level is the highest level of moral development. It is characterized by those who do not just follow the laws in place but in some cases supersede the law, often creating new paradigms of ethical decision making.

Stage five represents those people who understand that laws play a role in society but are willing to break a law they see as morally lacking or wrong. The motivation here is not to break rules or violate laws out of caprice or for selfish motives, but out of a sense of higher morality.

Finally, stage six is the theoretical highest level of the hierarchy. It is considered theoretical because it has not been studied or observed in a scientific manner. At this level, a person would no longer be concerned with current systems of rules but would instead seek to change the paradigms of ethical understanding to create whole new types of universal ethics for all people. Some theorize this would apply to people like Jesus, Gandhi, or the Buddha.

Ethic of justice or ethic of care?

Critics of Kohlberg’s hierarchy argue that his methods may not have been inclusive of diverse identity elements. For example, Kohlberg used a masculine approach in his research and theory. Central to Kohlberg’s system is the idea that ethics is most frequently a measure or aspect of justice. But others, most notably psychologist Carol Gilligan, argue that ethics is not solely a function of justice. Instead, she contends the narrative should be about relationships and community. Gilligan came to this conclusion after working with Kohlberg and helping to develop his theories. According to Gilligan, “… research suggests that men and women may speak different languages that they assume are the same, using similar words to encode disparate experiences of self and social relationships.” Because of this, she argues that Kohlberg’s system may not adequately represent females and their development.

The core of Gilligan’s argument surrounds the idea that men tend to focus on individuality and logical justice issues when making moral decisions, while women are more communally focused. Since women tend to be more relationship oriented, they determine morality based on caring for others and protecting relationships. Men tend to focus more on individual notions of justice. This can cause confusion when people speak of morals and ethics yet have quite different meanings for those terms.

Whether this criticism of Kohlberg is ultimately persuasive or not, managers and anyone else who has to deal with a diverse group of people need to recognize that ethics may look different for men and women, as well as for other diverse groups. An ethic of justice may not value and focus on the same issues as an ethic of care does.