A Leader’s Guide to Why People Think the Way They Do
There is a scene in Clint Eastwood’s movie “In the Line of Fire” where his character, Secret Service agent Frank Horrigan, is sitting on the steps of the Lincoln Memorial trying to impress fellow agent Lilly Raines, played by Rene Russo. Frank makes a prediction about which of two pigeons will fly off first. Lilly, clearly dubious, says, “How do you know?”
Frank grins and replies, “I know things about pigeons, Lilly.”
Frank uses variations of this line in other places in the movie, telling other characters that he “knows things about people,” and the line is meant to capture something fundamental about Frank: Despite his sometimes-abrasive personality, he is very good at his job because he understands human nature, and that understanding ultimately helps him stop an attempted assassination of the president.
Leaders, too, would do well to “know things about pigeons.”
One of the primary responsibilities of leaders is to make good decisions and then influence others so they can see the wisdom of those decisions. Leaders are also responsible for deploying their subordinates—assigning them goals and tasks. Leaders will be better at these responsibilities if they understand how our minds work and the obstacles to clear thinking we put in our own way.
The pyramid at the right shows a number of factors that affect our ability to think clearly, starting with embedded mechanisms in the brain at the bottom (the “bugs and features”), moving up through personality-related filters, poor logic, lack of information, and the embrace of misinformation.
This pyramid is the foundation of Awareness to Action International’s “Critical Thinking for Leaders” program, and this article takes a brief look at each level of the pyramid. Sources for more information on each topic can be found at the end of the article.
The System’s Features and Bugs
Our minds are a product of biological evolution and there are sophisticated and broad fields of evolutionary and cognitive psychology that go far beyond pop psychology and the notions that many leaders took in their Psych 101 course so many years ago. Here are some of the important concepts leaders should understand.
Argumentative Theory of Reason
A theory gaining popularity among cognitive theorists today is that the brain did not evolve as a tool for accurately understanding our world; it evolved to equip us to survive more effectively, and we can often thrive when we are fooling ourselves and others. One realization to stem from this view is “the argumentative theory of reason,” developed by Dan Sperber and Hugo Mercier, which says that our ability to reason evolved not to help us solve problems but to argue in favor of our intuitive, emotion-based desires and assumptions. In other words, our decisions are not always as logic-based as we assume they are, and we are very good at arguing for our perspective whether it is the right one or not. Further, we are prone to fooling ourselves, and many of the cognitive biases that have long been thought to be “bugs” in our cognitive systems are actually “features” of the system, selected through hundreds of thousands of years of evolutionary pressure to help us get our survival needs met.
This theory fits well with other theories about how the mind works—primarily Robert Trivers’s work on self-deception; Leon Festinger’s, Elliot Aronson’s, and Carol Tavris’s work on cognitive dissonance; and the work of Daniel Kahneman and others on cognitive biases—all of which are useful for leaders to have at least a passing knowledge of. We’ll touch on these shortly.
The important thing to realize here is that we can’t always trust our own reasoning, and we need objective, external tools to help us uncover the ways we may be deceiving ourselves. The best example of this is science, which sets up objective tests that are repeatable by others in order to evaluate a hypothesis. External and objective tools for evaluating arguments are the group’s way of protecting itself from forceful arguments by influential individuals.
Leadership Lesson: Before we start to argue for our point of view we should step back, ask ourselves, “How do I know this to be true?” and argue against our own view. It is important for leaders to establish effective group mechanisms for evaluating arguments in open and objective ways. Simply deferring to what the boss says or following the most charismatic person in the room can be a recipe for disaster. What the great physicist Richard Feynman said about science applies to business as well: “it is important that you don’t fool yourself, and you are the easiest person to fool.”
Self-Deception

As social creatures, it is in our interests to collaborate with others. As individuals, we sometimes find ourselves competing with others as well. Thus, we have evolved tendencies to both collaborate and compete, frequently at the same time. One of the ways we compete more effectively is to deceive other people. Evolutionary biologist Robert Trivers has written an excellent book called “The Folly of Fools” that shows how the best way to deceive others is to deceive ourselves. This deception is not deliberate; it is one of the features of our cognitive system, and we do it without realizing it.
Leadership Lesson: Since we are wired to deceive ourselves, we shouldn’t demonize self-deception in ourselves, or in others when it leads them to inadvertently deceive us. Whenever possible, take the judgment out of the evaluation of points of view, and remember that when the stakes are higher people are more likely to deceive themselves. (Deliberate deception, of course, is unacceptable in the workplace.)
Cognitive Dissonance

Cognitive dissonance is the tension caused by contradictory ideas battling for space in the psyche.
We all want to think well of ourselves. Few of us see ourselves as bad people, even if we sometimes make mistakes. When we do something that contradicts our perception of ourselves, we experience the discomfort of cognitive dissonance. We then fall victim to a variety of cognitive biases, such as the confirmation bias, which is the brain’s way of dispelling that tension and protecting our self-esteem.
For example, if we think we are doing a good job in our role and we get negative feedback in a 360 assessment, it is tempting to rationalize the feedback by attributing ignorance or malice to its source, or to assume that it is based on insufficient data. If we design a product we truly believe in but it is not well received by the market, it is tempting to blame the users or believe that our product is “ahead of its time” rather than consider that there might be something wrong with the product.
Leadership Lesson: Learn to look for cognitive dissonance in yourself—the internal stress when feedback or observations rub you the wrong way for reasons you can’t identify. Avoid the temptation to flatly reject ideas that don’t fit your worldview. Avoid either/or thinking—the giver of feedback may be biased AND you may still have the flaws they identified; the product may be ahead of its time AND it may still have some flaws, etc. Develop what F. Scott Fitzgerald called the mark of a first-rate intelligence, “the ability to hold two opposed ideas in mind at the same time and still retain the ability to function.”
Cognitive Biases

We tend to assume that attractive people are also smart; we assume that when we make a mistake it is due to circumstances but when others make a mistake it is because they have some character defect; we recognize evidence that supports our beliefs far more easily than evidence that contradicts them; we tend to inflate our role and minimize the role of others when describing an event.
These are all examples of cognitive biases—distortions of perception and analysis—that we unconsciously use to argue for our intuitive point of view or system of beliefs.
Understanding cognitive biases and being able to recognize when we or others fall victim to them is a fundamental skill in the critical thinker’s tool kit. Charlie Munger, Warren Buffett’s lesser-known partner in Berkshire Hathaway, places great value on understanding cognitive biases and how they affect business and financial decisions. His writings on them are a central part of “Poor Charlie’s Almanack,” a compendium of his articles and speeches.
Leadership Lesson: Learn about cognitive biases and teach your team about them. Reward people for recognizing their own cognitive biases rather than shaming people for committing them.
Personality-Driven Focus of Attention
Beyond the bugs and features of our cognitive system, our personality style influences how we think and view the world. People with different personality styles pay attention to different things and place value on different things.
If our personality styles are different, you will focus on some things and I will focus on others. You will value some ways of thinking, feeling, and behaving; I will value others. These differences are not inherently bad; in fact, we can’t all focus on everything so personality differences are one way of raising a group’s level of competence. I can be good at this and you can be good at that, and if we work together we are good at two things instead of being mediocre at everything.
People fall into patterns because the brain likes to habituate behaviors that appear to work. Our brains use about 20% of our body’s energy and are always looking for ways to reduce that load. We have evolved as cognitive misers, and our brains tend to find shortcuts by creating habits. These habits allow us to run on autopilot much of the time. Our collection of habits of thinking, feeling, and acting is the root of our personality style.
My preferred model of personality styles is the Enneagram, which identifies three clusters of instinctual values and nine strategies for satisfying those values. You can read more about it in the sources below. I also think there is value in the Big Five model that explores a person’s level of extroversion, agreeableness, conscientiousness, neuroticism, and openness to experience.
A note about personality models: Personality models are “heuristics,” shorthand mental models that help us simplify complex creatures. They should be used as such—tools to understand and enlighten, not weapons to minimize and stereotype. In my nearly 20 years of working with leaders I have seen leaders of all personality styles who were very successful and leaders of all styles who failed. There is no ideal personality style for success in any role. Using a personality model to stereotype someone is to fall victim to the “correspondence bias,” a cognitive bias that causes us to make broad but short-sighted generalizations about others based on a few observed traits.
Leadership Lesson: The value of personality models is to help us see our own habitual patterns so we can minimize the damage they do and to understand that others are just as trapped in their habitual patterns as we are so that we can be more sympathetic to them. This sympathy allows us to interact with them in ways that will speak to their values rather than in ways that speak to ours. This dramatically increases the chances of successful communication and collaboration.
Logic and Logical Fallacies
Given the challenges already stated, it is easy to see that we are not innately logical creatures. In fact, our intuitions, cognitive biases, and emotions can often undermine logic and lead us into logical fallacies.
Logic is a skill, and it is very different from “common sense.” Einstein identified “common sense” as “the collection of prejudices we acquire by the age of 18.” Everyone thinks they have common sense and that most other people lack it, which should signal a problem with the concept right away. Logic requires training and education, and leaders would do well to acquire both.
There are many interesting ways to learn about logical fallacies online, and some sources are listed at the end of this article. It helps to know that logical fallacies come in two forms: formal and informal.
Formal fallacies are marked by a structural flaw in the argument: the conclusion does not follow from the premises. An example is: All dogs are mortal. Mittens is mortal. Therefore, Mittens is a dog. Even though both premises are true (Mittens is actually a cat), the conclusion does not follow from them; the argument’s form is invalid no matter what Mittens turns out to be.
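The structural nature of the flaw can be made explicit in logical notation. This is a minimal sketch; the predicate letters D (“is a dog”) and M (“is mortal”) and the individual “rex” are hypothetical names introduced here for illustration:

```latex
% Invalid form (the fallacy above): from "all dogs are mortal"
% and "Mittens is mortal" it does not follow that Mittens is a dog.
\forall x\,\bigl(D(x) \rightarrow M(x)\bigr),\; M(\mathit{mittens})
  \;\not\vdash\; D(\mathit{mittens})

% Valid counterpart: from "all dogs are mortal" and "Rex is a dog"
% it does follow that Rex is mortal.
\forall x\,\bigl(D(x) \rightarrow M(x)\bigr),\; D(\mathit{rex})
  \;\vdash\; M(\mathit{rex})
```

The point of writing it this way is that validity depends only on the form: swapping in any predicates for D and M leaves the first pattern invalid and the second valid.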
Informal fallacies are often more subtle and nuanced, and are frequently the result of conflating our thinking with our naïve intuitions and cognitive biases. Thus, such arguments often appeal to our emotions and seem intuitively correct even when they don’t withstand scrutiny.
An informal fallacy does not necessarily invalidate a conclusion, but it calls the conclusion into question and points to the need for additional analysis.
A good example of this is the “post hoc, ergo propter hoc” fallacy. This is Latin for “after the thing, therefore because of the thing.” I recently overheard someone saying that he would never get the flu shot again because: “I never had the flu or got a flu shot before last year. But then I got the flu shot and I got the flu shortly afterward. So the flu shot caused the flu.”
The flaw is thinking that just because he got the flu after he got the shot, the shot caused the flu. Since it is well established scientifically that the flu shot does not cause the flu, it is far more likely that he contracted a strain of flu for which the vaccine was not effective. It is also possible that, since he got the flu shot at the beginning of flu season, the timing was a coincidence.
Awareness of the post hoc fallacy helps us separate true causes of events from the noise. Understanding other fallacies—argument from authority, argument from antiquity, tu quoque, the sharpshooter fallacy, etc.—can be equally useful.
Leadership Lesson: Expert intuition, emotion, and gut feeling have their value, but nothing beats cold, hard logic applied at the right time. A rule of thumb: when it comes to linear problems, logic rules. When it comes to complex problems, do all your factual due diligence; apply logic; correct for logical fallacies, cognitive biases, and personality biases; then trust your belly.
Lack of Knowledge
A person’s knowledge and experience shape their worldview, and their worldview shapes how they think about the world around them. A person who has a broader range of knowledge and life experiences will generally have a broader perspective. Someone with a narrower range of knowledge and experiences will have a narrower perspective. There is always a trade-off between specialization and breadth, but what separates the best thinkers from the rest is curiosity.
Unfortunately, most people aren’t that curious. As we have already seen, the brain likes to create habits and stick within its comfort zone of habitual thoughts whenever possible. Exposing ourselves to new information and other perspectives can seem threatening to that comfort. As we’ve also seen, the brain likes to filter out information that doesn’t fit our existing mental models.
Many people therefore end up not pursuing new knowledge and filtering out alternative perspectives without realizing it. The result: we don’t know as much as we think we do.
There is even a name for this—the Dunning-Kruger Effect. Psychologists David Dunning and Justin Kruger studied expertise and found that people with less expertise in a topic tend to overestimate their knowledge while people with more expertise tend to underestimate their knowledge. Since we tend to overestimate our knowledge of a given topic, it is very easy to stop working to learn more.
The real-world result of this is that we miss opportunities of all sorts. People with more knowledge and experience are able to see nuance, context, and connection much better than those with less. Ignorance can make us miss business opportunities, keep us from improving our products, and keep us from improving ourselves.
Leadership Lesson: Embed a culture of continual learning in your organization. Be sure to reward curiosity and experimentation. Create a “liberal arts” mindset where people are encouraged to learn outside of their scope of responsibility.
The Embrace of Misinformation

Perhaps even more dangerous than ignorance—not knowing something that is true—is being misinformed and believing things that are not true. The challenge of living in the Information Age is that not all information is created equal—it often seems we are exposed to as much bunk as we are to actual facts. For example, it is easier to find information telling us that vaccines or GMOs are dangerous (they are not) than it is to find solid science. (Fortunately, Google is working to address this and other topics prone to misinformation by adjusting its algorithms.)
Mark Twain famously said that “a lie can get halfway around the world before the truth gets its boots on.” And while he “famously” said it, Twain didn’t actually say it—this quote is itself a great example of misinformation. No one knows who said it first, but it predates Twain by at least a century and it is not included in any of his writings (or so I’ve read…hmmm…). This is just one simple example of the flood of misinformation we are deluged with when we step onto the information superhighway.
It is difficult to protect ourselves. It is easy to accept the quote as original because it feels right. “Genetically modified organisms” sound like dangerous things, so it is easy to accept the misinformation that they are. Few of us have the time to debunk quotes or research genetic engineering or any of the other topics we are faced with but expected to have an opinion on. So we embrace what feels true.
There are two cognitive biases that reinforce our embrace of misinformation: biased assimilation and belief polarization.
Biased assimilation is the aspect of the confirmation bias that inclines us to embrace information that conforms to our existing beliefs. Again, the brain likes habits and this applies to ideas that fit an existing groove in our gray matter.
Belief polarization is what happens when people are confronted with information that contradicts their beliefs—no matter how good the argument or how clear the facts, people are prone to end up even more convinced of their existing beliefs when they are confronted with contradictory evidence. In other words, the more you try to convince someone of the errors of their beliefs, the more entrenched in those beliefs they will become.
Finally, we tend to live in echo chambers—surrounded by people who think like we do and are all too happy to tell us how correct we are. We attend conferences filled with people from our business or academic fields. We watch news channels or read editorial pages that reinforce our political views. We join clubs made up of people with the same socio-economic background. All of these things become part of our intrinsic identity and stepping outside of them can feel very threatening, so we tend not to do it. This leaves us vulnerable to not only missing the opportunity to learn new things, but vulnerable to accepting misinformation.
Leadership Lesson: Practice skepticism. Employ the Devil’s Advocate. Become an amateur epistemologist.
- To be skeptical is to be open-minded but also to weigh a claim against the evidence. The grander the claim, the more robust the evidence must be. The Devil’s Advocate is the person whose role it is to argue the other side. It is helpful to assign one in each meeting to avoid groupthink.
- Epistemology is the branch of philosophy that examines how we know what we know. Understanding how to tell accurate information from inaccurate information and solid evidence from poor evidence is a critical skill for leaders.
Sources

Argumentative Theory of Reasoning:
- “The Argumentative Theory—A Conversation with Hugo Mercier,” http://edge.org/conversation/hugo_mercier-the-argumentative-theory
- Sperber, Dan and Mercier, Hugo, “Why Do Humans Reason? Arguments for an Argumentative Theory,” https://hal.archives-ouvertes.fr/hal-00904097/document.
Self-Deception:
- Trivers, Robert, “The Folly of Fools”
- Galinsky, Adam and Schweitzer, Maurice, “Friend & Foe: When to Cooperate, When to Compete, and How to Succeed at Both.”
Cognitive Dissonance:
- Tavris, Carol and Aronson, Elliot, “Mistakes Were Made (but not by me)”
Cognitive Biases:
- Kahneman, Daniel, “Thinking, Fast and Slow”
- Wood, Jennifer, “20 Cognitive Biases That Affect Your Decisions,” http://mentalfloss.com/us/go/68705
- Wilke, A and Mata, R, “Cognitive Bias,” The Encyclopedia of Human Behavior, http://people.clarkson.edu/~awilke/Research_files/EoHB_Wilke_12.pdf
Personality:
- Tallon, Robert and Sikora, Mario, “Awareness to Action: The Enneagram, Emotional Intelligence, and Change.”
- Sikora, Mario, “What is the Enneagram? (And How Can It Help You?)” http://www.awarenesstoaction.com/ataconsulting-blog/what-is-the-enneagram-and-how-can-it-help-you/?lang=en.
- Sikora, Mario, “A Summary of the Three Instinctual Biases,” https://www.linkedin.com/pulse/summary-three-instinctual-biases-mario-sikora?trk=hp-feed-article-title-publish.
- Barondes, Samuel, “Making Sense of People: Decoding the Mysteries of Personality”
Logic and Logical Fallacies:
- DiCarlo, Christopher, “How to Become a Really Good Pain in the A**”
- “Your Logical Fallacy Is” website, yourlogicalfallacyis.com.
- “Master List of Logical Fallacies,” http://utminers.utep.edu/omwilliamson/ENGL1311/fallacies.htm.
Lack of Knowledge:
- Sikora, Mario, “What Leaders Read; or Management as a Liberal Art.” http://www.awarenesstoaction.com/ataconsulting-blog/what-leaders-read-or-management-as-a-liberal-art/?lang=en.
- Grazer, Brian and Fishman, Charles, “A Curious Mind”
- Steves, Rick, “Travel as a Political Act”
- Bazerman, Max, “The Power of Noticing.”
- Trompenaars, Fons, and Hampden-Turner, Charles, “Riding the Waves of Culture.”
- Ellenberg, Jordan, “How Not to Be Wrong: The Power of Mathematical Thinking.”
Misinformation:
- Pigliucci, Massimo, “Nonsense on Stilts: How to Tell Science from Bunk.”
- Mitroff, Ian I., and Bennis, Warren, “The Unreality Industry: The Deliberate Manufacturing of Falsehood and What it is Doing to our Lives.”
- Thompson, Damian, “Counterknowledge: How we surrendered to conspiracy theories, quack medicine, bogus science, and fake history.”
- Huff, Darrell, “How to Lie With Statistics.”
- Gardner, Martin, “Science: Good, Bad, and Bogus”