So, which is it? Do emotions support excellent decision-making, or do they interfere with it? As with so many of my articles, you’ll see that my answer is “it depends.”
But before I get into that, I wanted to pause for a moment. I now find myself at the halfway mark in this one-year article challenge that I have set for myself. I was recently very moved when, after presenting at a workshop, several audience members told me they had been positively impacted by my series, especially the sharing of my inner critic voice, which often mirrored their own. If you’re reading this article, you know who you are, and thank you. Your feedback meant a great deal.
Emotions can have complicated impacts upon our decision-making process. This played out for me as I faced a big decision in the last year—whether or not to write this series! Sixteen months ago, in an hour of spontaneity on the subway, I wrote what would become my first article in this series. I did nothing with it at the time, but this idea of writing and finding my voice kept coming back to me over and over again. I felt both nervous and excited.
I felt nervous to put my voice out there (a challenge for me since before my dissertation), and about taking risks with self-disclosure (despite the many brave students and professionals already doing so online and elsewhere). I came to realize that my nervousness stemmed from fear (primary maladaptive) which summoned the question: what did this fear need? My response: more information to assess the risks. So I sought out supportive colleagues to explore questions of stigma in the workplace—what were the possible merits and drawbacks of acknowledging that I, too, have experienced symptoms of depression and anxiety? Of sharing my moments of inner criticism? What might be the implications personally and professionally?
I also reflected upon my excitement. I have been very surprised over the last five years to find myself falling in love with supervision, teaching, and training. It’s not an area I thought I would enjoy (my critic wasn’t a big fan of the idea), but I love the moment when a student or a supervisee “gets” something new. That moment of excitement when their eyes light up, a big grin appears; when a student takes a risk in applying new learning that lies just outside of their comfort zone—and it works! It’s this amazing moment of witnessing joy, excitement, and pride alongside the slower building of confidence in newly developed competence. In approaching this series, I was excited to find a way to put theory and practice into words in a way that makes the complicated information counsellors spend years learning accessible to supervisees, to clients, and to readers who work with emotion outside the realm of mental health care.
The months went on, and this feeling of excitement persisted. It was a signal that some aspect of putting my voice out there was important to me and aligned with my underlying values. The question of mental health stigma among mental health professionals inside and outside of academia is one that bothers me. As a therapist, it is important for me to make peace with areas of my life where I have felt pain and shame so that I can be most effective in helping to bring clients to and through their own painful emotions, which are not uncommonly accompanied by shame. I believe we need to humanize the process of the profound vulnerability inherent in meaningful psychotherapy; having been in the client chair allows a deep empathy for what it is we are really asking clients to do. I am also aware that much of what we do, as therapists, is by nature done privately behind closed doors. By writing this series, I hoped that I could communicate transparently about the theory and practice that underlies some approaches to psychotherapy (primarily emotion-focused and experiential therapies), and help raise the profile of emotion in our work as student affairs professionals.
Having taken responsibility for sorting out patterns of old, stuck, maladaptive fear; having attended to “here and now fears” and gathered more information about the possible impact of mental health stigma on me personally and professionally; having clarified my underlying values; and having sought support in a few trusted readers, eventually, I decided to take a risk and sent my article draft to Lucas Gobert, the RyersonSA editor, who was intrigued by what I had to say (take that, inner critic!). [Editor’s Note: Got your back, Jack!] He also taught me how to approach a sustained writing project by creating a comprehensive series outline at the outset (without which I would have become lost and mired in the weeds), and he has provided humour, support, constructive criticism, and gracious feedback over time! Hmm… That almost sounds like the basis for secure attachment—but more on that next month!
What I’m saying here is that feelings were important in my decision to write this series. They inspired something new, pulled gently yet persistently at my attention, and motivated a direction. But they were also a barrier. Join me now as we jump into a (non-exhaustive) examination of the role of emotions and feelings on decision-making.
To be able to make decisions, we first need to access our emotions. However, as we learned two articles ago, not all emotions or emotion processes are helpful; some can derail our decision-making efforts. Let’s tackle each of these areas in turn.
“Fundamentally, decision-making does require access to emotions. The somatic marker hypothesis proposes that individuals make judgements not only by assessing the severity of outcomes and their probability of occurrence, but also and primarily in terms of their emotional quality…The fundamental notion of the somatic marker hypothesis is that bioregulatory signals, including those that constitute feeling and emotion, provide the principal guide for decisions…” – Bechara, Damasio, & Damasio
Making decisions involves complex operations in our brains, relying on numerous interconnected neural structures and processes—including parts of the brain essential to using emotion cues. Through a series of patient studies, neuroscientist Antonio Damasio discovered that individuals with damage to the ventromedial prefrontal cortex (an area of your brain roughly behind your eyes) experienced severe impairments in their abilities to make decisions in daily life. While they continued to be able to engage in learning, memory, language, and attention functions, if you asked them to make a simple, everyday decision (the red shirt or the blue shirt? Product A or product B?) they could be stymied for hours.
What’s so unique about the ventromedial prefrontal cortex? Individuals with damage in their ventromedial cortex lose their capacity to anticipate how a future situation will feel; they may know if they feel good or bad now, but can’t imagine how they might feel differently in the future based on the outcome of a decision. According to Damasio, each time we take a risk, our bodies react and give us information about how it may feel should the risk pay off or not. These somatic or “body” states give us valuable information that informs whether we will choose a riskier or safer option. In individuals with damage to the ventromedial cortex, this signalling process no longer functions. As a result, when available facts don’t give a clear expectation of success or failure, these individuals have no compass to guide their decisions: will I feel better if I go out with Julie, or stay in and read a book? Do I wear the blue shirt or red shirt? Order cheese pizza for the corporate banquet or hire a professional caterer?
When we remember something, Damasio argues, our brains reconstruct a previously learned “factual-emotional set”. That is, our brain builds associations between a fact and how we felt when we learned it. (Cool, right?!) For example, my brain might remember that to stay upright on my bicycle, I need to first push one pedal down and then the other…and that I felt exhilarated when I first was able to do so! Or that, when my teacher got angry at me for saying a poem too fast in grade one, my whole body tensed up and I felt ashamed. Some of these factual-emotional sets we are consciously aware of (I know I feel delight when I kick my leg over my bike to start a ride), while others we are not (I don’t realize that I tense up and feel fear when I see an angry facial expression flicker across the face of a person with whom I’m speaking).
When we are not aware of our own factual-emotional sets, these hard-wired reactions can become what Damasio calls “biasing signals”—signals which subtly direct our attention and guide implicit memories and emotions to influence our thinking without us realizing that this is occurring. For example, following a car accident, a driver may become quite anxious each time they enter the driver’s seat of a car and may be obviously and consciously hypervigilant for signs of danger while driving. Or, the driver may be irritable and on edge while driving, unaware of their heightened blood pressure and heart rate, and unaware that they are speaking sharply to passengers in the car when they were typically calm and pleasant while driving before the accident.
Biasing signals are in fact a natural part of our everyday lives—decision-making becomes far more efficient when we rapidly “feel” something is right or wrong for us. “I’ll take TTC; I just don’t feel like driving today.” Such patterns, however, can also create room for implicit bias that can negatively impact our judgment and decision-making. For example, consider a scenario where an interviewer decides to hire Candidate A. Candidate A and Candidate B were similarly qualified, but Candidate A somehow just “felt right”. Suppose the interviewer learned through prior life experience that what is familiar is safe while what is novel is risky. If Candidate A is similar to the interviewer in beliefs, in ethnocultural identity, or in communication style, while Candidate B is equally qualified but less familiar in some way, Candidate A may be selected because familiar = safe in the mind of the interviewer. This is an example of implicit bias based on an old factual-emotional set connecting familiarity with safety. It’s also an example of feelings interfering with effective and fair decision-making.
Without being aware of physiological arousal, including signals stemming from our emotion systems, we’re not going to be able to make good decisions. But while our brains need an emotional push in one direction or another, a few processes, like the biasing signals mentioned above, can trick our brains into making bad decisions. To that end, here are my top five tips to help you avoid emotion-based pitfalls to effective decision-making.
Tip #1: When making a decision based on intuition, or what feels right, stop and check for implicit bias.
Evidence suggests that implicit bias is incredibly hard to overcome—in large part because we are not aware of when we are impacted by it. Research led by social psychologist Patricia Devine, however, suggests that prejudice—a form of learned implicit bias—is a habit that we can (work hard to) break. In their 2012 study, Devine, Forscher, Austin, and Cox evaluated a four-week intervention that resulted in what they defined as long-term reductions in implicit race bias (positive results were maintained eight weeks after the intervention). Devine’s team’s intervention was based on previous research into factors that successfully reduced implicit bias and included the following steps:
- Making individuals aware of their biases.
- Teaching individuals about the negative impact of biases (in this case links between bias, discrimination, and negative health, employment, and interpersonal outcomes associated with discrimination) to create motivation for change.
- Teaching individuals to predict when implicit bias may emerge.
- Helping individuals to identify, select, and use explicit strategies to replace biased responses with responses in alignment with their explicit values and overarching goals.
If you are interested in increasing your ability to break your own habits of implicit bias, I recommend the following places to begin:
- Project Implicit can help you to explore your own patterns of implicit bias through their online Implicit Association Test.
- To learn more about the consequences of bias, consider contacting your institution’s Equity, Diversity, and Inclusion office or Human Resources Employee Development group for workshops and learning opportunities. If you do not have such an office available, make a concerted effort to learn about the costs and consequences of implicit (and explicit) bias. Starting points may include taking a college or university course related to understanding the experience of underrepresented and marginalized groups, readings such as documents produced by the Truth and Reconciliation Commission of Canada, or curating your own reading list. Books I have found to be incredibly powerful include In Search of April Raintree by Beatrice Mosionier, Gender Outlaw by Kate Bornstein, and Zami: A New Spelling of My Name by Audre Lorde; just to name a few.
- Read about intervention strategies for reducing bias. In her research article, Devine summarizes and utilizes five:
- Stereotype replacement—identifying and labeling when a response is based on a stereotype; asking yourself why the response occurred; identifying strategies for avoiding that particular response in the future, and replacing the biased response with a balanced, non-biased response.
- Counter-stereotypic imaging—imagining in detail individuals (people you know personally, public figures you are aware of via media, or imagined members of the group toward which you may hold implicit bias) who manifest traits counter to that bias. For example, if you were raised in a community in which you learned negative associations towards people from LGBTQ communities, you might think of the characteristics of a valued LGBTQ friend, political leader, or pop icon.
- Individuation—seeking to learn specific details about the unique individual you are engaging with when that person is a member of an underrepresented or marginalized group. This allows you to be more aware of unique personal details, rather than stereotyped generalizations.
- Perspective taking—using empathy and your imagination to put yourself in the shoes of a member of a stereotyped group. This involves imagining what daily life might be like on the receiving end of microaggressions and systemic explicit and implicit bias, all while going about one’s daily tasks like standing in line for coffee, selecting a washroom, shopping for clothes, driving home, managing a child’s temper tantrums, and attending job interviews.
- Increasing opportunities for contact—actively seeking opportunities to get to know individuals from marginalized groups with which you have little experience. This allows for new learning opportunities and for the rich data of real relationships to replace (at least some of) the biases that develop over years of social learning within our societies that privilege some groups over others.
A caveat: the larger scale impact of implicit bias on explicit bias and behaviour remains unclear. Specific strategies targeting explicit bias reduction and promoting behaviour founded on equity, diversity, and inclusion appear to be important independently of those targeting implicit bias.
Tip #2: If your emotions are very strong—stop. Intense emotions don’t tend to lead to good decisions.
While there are many factors that make emotion-charged decision-making unwise, we’ll limit our discussion here to something known as confirmation bias.
Confirmation bias is “the tendency to search for, interpret, favour, and recall information in a way that confirms one’s preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities.” Of note to us is that confirmation bias tends to be stronger for emotionally charged topics.
In her New Yorker article, Pulitzer Prize winner Elizabeth Kolbert reviews recent literature on confirmation bias as she explores why “reason” sometimes fails so miserably. She considers this specifically within the context of the rising acceptance and tolerance of “alternative facts” and the polarization of opinions leading up to and following the recent U.S. presidential election.
“The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.” – Elizabeth Kolbert
In her review of Mercier and Sperber’s book, The Enigma of Reason, she summarizes their argument that “reason is an evolved trait” that “emerged on the savannas of Africa, and has to be understood in that context.” Briefly, they argue that reason developed not in order to solve logic-based problems (no Rubik’s Cubes in sight for our ancestors) but to increase human capacity to solve problems that arise when living within larger groups based on mutual collaboration. Within this context, the most useful outcome of reasoning is identifying strategies for winning arguments in social groups so that we are individually risking as little as possible while gaining maximum benefit from collective life. For example, it may assist my individual survival if I can convince others of the merits of expending their energy risking their lives as exalted hunters for the group, while highlighting the merits of my own planning capabilities to prepare for the hunters’ victorious return. In this light, the ability to quickly identify gaps in others’ logic without seeing gaps in my own, or while continuing to push my point, may be adaptive.
Kolbert goes on to focus on Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, who argue an alternative explanation for confirmation bias, citing research demonstrating that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs—even when their beliefs are wrong. The Gormans have noted that providing individuals with accurate information, say about the scientific evidence supporting the efficacy and safety of vaccines, is not always effective in changing beliefs; people with strongly held convictions tend to discount facts.
Inner Critic: You don’t know enough about this topic. You’re not up to the challenge—the literature is too broad. You should skip this topic and move on.
Self: Listen critic, you’re depressing. It hurts when you say things like that. I feel exhausted and down when I hear this over and over again. Enough already! Why on earth do you work so hard to go after me this way? Just stop it!
Inner Critic: …I don’t want you to get hurt. If you fail, it will hurt too much. It’s better to just not try. Then, you won’t be disappointed.
Confirmation bias tends to occur when feelings are very strong. Conversely, having a deep understanding of an issue, including proactively seeking balanced facts, tends to reduce strong feelings and limit confirmation bias.
In a previous article I highlighted that whether you’re a dandelion- or an orchid-type feeler, alignment between your thoughts, your feelings, and your values may indicate that the course of action you are contemplating is a good one. Now we can clarify that alignment between your thoughts about balanced facts, your low to moderately aroused emotions, your conscious feelings, and your values may reflect an optimal decision. We can also add to this that having no feelings or very strong feelings may both pose serious threats to the integrity of your decision-making process.
If you feel very strongly about a decision or an issue, write out in detail how the underlying principle, policy, or procedure you are deciding upon actually works (or will work). Research the facts until you understand and can articulate them. From a balanced knowledge base, you are likely to feel less intense emotion, and will be less susceptible to confirmation bias. When your mild to moderate feelings and your understanding of the facts align and push you in one direction over another, it’s more likely to be the right decision for you (or your team).
If you feel very strongly about a decision based on areas where values, rather than facts, will predominate, clearly articulate your rationale for your preference—preferably in writing. Now write out an equally robust rationale arguing for the opposite position—if you can’t think of the rationale, go and talk to someone with the opposing view and really listen to understand. Taking into account facts and values underlying both sides of a position will help you to make a better decision.
Tip #3: Trust feelings that are fresh, new, and stick around for a while…
As we learned in my previous article “SA Has the Feels 201 (Part 1): A Primer on Managing Your Emotions and Feelings”, we experience different categories of feelings. Some are old, stuck patterns that are more reflective of what we needed years ago, typically in an overwhelming situation, than they are reflective of what we need now. Old, stuck, and repetitive maladaptive feelings are not a good guide for decision making. Adaptive feelings that are fresh and new and emerge as you understand and synthesize new information and possibilities are likely to be more trustworthy. At the same time, it is wise to look for some consistency in your feelings across situations, rather than making a big decision on a whim, or in response to a single experience of feeling excited or hopeful (more on that below when we discuss misattribution of arousal).
Tip #4: Identify your baseline feelings as you head into a decision-making process or environment.
Emotion schemes, once activated, can colour our perceptions and decision-making. Many of us have had the experience of feeling more critical of others’ ideas when angry about something unrelated that happened earlier in the day, or feeling especially patient and willing to collaborate when feeling seen, heard, and valued ourselves. Francesca Gino, a professor at Harvard Business School, documents several pieces of research identifying how strong feelings can colour our perceptions of new ideas and our evaluations of others. Her advice: choose when you are most capable of calmly and objectively deliberating on important decisions. This will involve, in part, being able to identify what and how strongly you are feeling in any given moment (and, as we saw above, learning to identify when you are caught in implicit bias).
While knowing how you feel as you head into a decision-making process may help somewhat in being able to subtract out the impact of your pre-existing feelings on the decision at hand, proceed with caution. Even when you know how you are feeling, research suggests that you will not always be able to subtract out pre-existing feelings from your current context.
Misattribution of arousal is a phenomenon that can lead us astray: it can cause us to make incorrect assumptions about the triggers for our feelings, and it can influence our decision-making even when we know we are in a specific mood from the outset. In classic studies of this phenomenon, participants engaged in tasks such as intense exercise, or even received an injection of epinephrine (yes, that particular experiment was conducted in the 1960s), in order to raise heart rate and blood pressure. Researchers across various studies then observed how participants behaved when exposed to scenarios in which they would make an implicit value judgement or an active decision. Overall, the more robust findings over time include:
- heightened physiological arousal impacts how we perceive new situations.
- when people are unaware of why they are feeling more aroused, they are more likely to assume that a recent thought or immediately preceding experience caused their arousal, whether or not that is true.
- emotional states can be exaggerated or intensified by heightened physiological arousal—like turning up the volume on whatever feeling one has already labelled oneself as having.
The overall recommendation here: Tip #2 still stands.
Tip #5: For big decisions, especially decisions within a position of leadership, step back and look at the bigger picture. Find the greater good and beware your empathy.
In the weeks before writing this article, I happened to hear a brief CBC interview and review of the book Against Empathy: The Case for Rational Compassion by Dr. Paul Bloom. Obviously, the title caught my attention.
“The case of empathy, the sort of empathy I am against—and actually Obama nicely described it—is putting yourself in other people’s shoes, feeling their suffering, feeling their pain. This is very appealing to us. It feels great. But given the nature of our minds, it leads us to biased decisions, to innumerate decisions and often to cruel decisions. I think we’re much better off when we use reason plus a more distant compassion.” – Dr. Paul Bloom
But how can empathy lead to cruelty? Dr. Bloom makes a compelling argument, within the context of larger-scale decision-making. He makes the case that, in evolutionary terms, our emotions—especially what he calls our “moral feelings”—have been shaped to bias us towards the groups we most strongly identify with (remember our discussion on implicit bias). This bias may have made particular sense when various tribal groups were vying for scarce resources and when bonds to “our group”, whatever that group was, were likely to promote survival. But consider this particular juncture in history when resources are plentiful on a planetary scale. At this point in time, empathy may pose a significant risk to global well-being at the very time that we have the resources at our disposal to support access to basic food, water, shelter, and health on a global level—if only we (as in, the collective human species) could distribute resources more evenly and effectively. By focusing empathically, Bloom argues, we prioritize some few over the many when it comes to resource distribution.
Bloom cites the following drawbacks of using empathy in large-scale decision-making:
- You’re more likely to experience burnout or exhaustion as a helper when frequently feeling deep empathy.
- When empathically attuned, you risk focusing on the short term over the long term.
- Empathy is subject to significant bias effects, including racial bias and bias based on perceived attractiveness. Research has shown that we are more likely to feel empathy for those who have a similar skin colour to our own, and for people we perceive as attractive and/or safe. For example, we are likely to feel highly motivated to help a specific child living in poverty, but typically less so for the child’s homeless father.
- Empathy is “innumerate”—we are wired to care more strongly about a single poignant story, especially one with a face, than about hundreds or millions of people simultaneously.
- Feeling empathy for one individual’s pain can increase desire to lash out at those perceived as perpetrators—without taking into account future consequences, or the full complexities of the situation.
- Empathy is strongly elicited by anecdotes, leaving us susceptible to being unduly swayed by individuals using anecdotes unscrupulously to sway public opinion, without always having the data to back up the narrative.
“It’s because of empathy that people care more about a little girl stuck in a well than they do about climate change.” – Dr. Paul Bloom
Rather than relying on empathy (again, largely for big decisions, as opposed to daily interactions with others), Bloom admonishes us to rely on more “distant compassion” and on solid data. He argues that compassion stems from different neural pathways than empathy, and that when you are grounded in compassion, you are both better able to help than when you are feeling another’s pain, and will enjoy helping more, making it more sustainable over time.
And how do we avoid empathy traps in decision-making? Dr. Bloom (much like Dr. John Austin) admonishes decision-makers to rely on data first, and then go to feelings. He also asks decision-makers to work harder to recognize that “what feels right and what is right” are different, and even to consider regular engagement with mindfulness meditation which has been shown to lower empathy (!) and increase kindness and compassion.
On a personal note, when I first stepped from the role of psychologist into the role of Clinical Coordinator of Ryerson’s Centre for Student Development and Counselling, I was used to relying upon empathy in my daily practice when working therapeutically with students. In my first year in a leadership role, every student’s story pulled significantly at my heartstrings. It took a year or more of practice to be able to step back and ask myself: how do we manage our system to do the greatest good for the greatest number of students? One of these ways was to set more stringent criteria for accessing our same-day crisis appointments, meaning that more people not in imminent danger, but in significant pain, had to wait. This felt like an incredibly difficult decision at the time but was the only way I could see, with existing resources and options at that time, to ensure that we would see those whose safety was at risk on the day they needed help.
That didn’t mean that we stopped planning and strategizing to increase resources and alter the structure of our system, as we did in September of 2016, allowing us to begin seeing 60% of first time callers on a same-day basis while still having flexibility to offer a range of short-term and longer-term treatment options to students based on an individualized care plan. Making successive changes to benefit the greatest number of students, while still retaining quality care for those already connected to our services, did require a shift from empathy for each individual story, to data alongside compassion for the greater student community. Doing so also helped me sleep at night, focusing on the greater good at times when much individual suffering (alongside tremendous strength and resilience) continues. It truly is helpful to step back and look at the bigger picture. Does your strongly held conviction help one person with a truly heart-wrenching story, or is it in support of the greater good?
In researching for this article, I found a blog called The Intentional Workplace by Louise Altman. I read a few posts that I really enjoyed. One quote in particular stood out in summarizing neatly what we know about emotions and rationality in our decision-making processes:
“To optimize your decision-making process, you have to build capacity in both your brains—the rational and the emotional. They’re brilliantly interwoven to maximize understanding the world around you and the vast world within you. Jonah Lehrer offers a good analogy here, “The human brain (the “rational” brain) is like a computer operating system rushed to market with only 200,000 years of field testing…it has lots of design flaws and bugs. The emotional brain, however, has been exquisitely refined by evolution over the last several hundred million years. Its software code has been subjected to endless tests, so it can make fast decisions based on very little information.” – from How Emotion Shapes Decision Making
We just need to remember that mind and body are indeed not separate, but are one integrated system of feedback loops and co-influencing mechanisms. And we need to remember that our emotional brain must be tempered by our rational brain—including learning about our hard-wired (rapid decision-informing) blindspots. In the end, it’s never just thoughts or feelings, body or mind. It is time to move from Descartes’ Error to Damasio’s (and others’!) evidence.
Join me next month for an article I’ve been looking forward to writing since the beginning of this series: Attachment and All That Monkey Business. We’ll tackle more of our “exquisitely refined software code” as we play with Bowlby’s concept of attachment in understanding emotion.