
Social Immune Systems

The Hidden Psychology of High-Stakes Organizational Decisions

 

 


          Some of the very best critical thinking research in the world has been done with flight crews and medical teams.  There is an almost endless number of fascinating cases to explore, but there is one I’ve become more obsessed with than the others.  It happened on Thursday, January 25th, 1990.  At 9:34 pm, after a long flight and multiple holding patterns, Avianca flight 52 was cleared to land on runway 22L at JFK.  Wind shear was terrible that night and visibility was limited.  The crew found themselves descending quickly, only 200 feet off the ground and more than a mile from the airport.  On the cockpit voice recorder, you hear the captain abruptly ask the first officer where the runway is, and the first officer answer, “I don’t see it.”  The ground proximity warning system gave them voice alerts to pull up, and they executed a missed approach.  Their go-around ended in disaster when they crashed 20 miles northeast of the airport in Cove Neck, killing 73 of the 158 people on board.  The crumpled fuselage, which looked much like a crushed soda can, came to rest only 20 feet from the back of the house belonging to John and Katy McEnroe, the parents of tennis legend John McEnroe.  What’s so frustrating about the Avianca accident is that the warning signs seem like they should have shone like neon beacons.  The fact is, though, they didn’t; they only became visible in hindsight.  I recognize these symptoms better now, and after you hear the rest of the story, I bet you will too.

          I speak for expert audiences that make high-stakes decisions as a routine part of their jobs.  One day I showed an audience crash scene photos of flight 52’s crash site from three different perspectives and asked them what clues they saw as to the cause of the accident.  The air in the room thickened.  There was a fear of sharing ideas that might be less than perfect.  But after a little nudging, some thoughts started to leak out.  The first person to speak pointed out that the weather had been bad.  Someone else said the plane was close to a house, while another observed that the cockpit wasn’t even visible in the pictures.  If you take a second to think about these statements, you realize they are safe and superficial.  People were only acknowledging what everyone else could plainly see.  The hesitancy to share their personal thoughts created a veil over evidence hidden in plain sight.  I worked a little more to try to remove the threat, and finally someone saw it and asked, “Why isn’t there a fire?”  This response is so different because it isn’t just an observation of the obvious; it’s a personal insight.  That kind of response doesn’t come out until someone feels safe enough to risk being curious instead of trying to avoid criticism.

          The audience on this day was full of forensic experts, but not in aviation; they were financial auditors.  People ask why I would have auditors or analysts examine things outside of their specialty.  First, I’ve found the benefits of cross-disciplinary training to be remarkable.  The reason someone in aviation might say “get your head out of the cockpit” is the same reason all specialists should step outside the rigid structures of their industry: it helps them see counterintuitive patterns and discover new problem-solving strategies.  When I speak for aviation experts, I start with anything except aviation before we look at how the concepts apply to their specialty.  The second reason is that understanding why flight 52 ended in disaster has very little to do with the technical aspects of aviation.

The problem with silence

          The case of Avianca flight 52 deserves a more detailed review.  The flight originated in Bogotá, Colombia and was headed up the east coast of the United States when air traffic control directed the crew into a holding pattern just off the coast of Virginia.  This would be the first of three holding patterns that would keep the aircraft circling for a total of 89 minutes.  As their fuel dwindled to almost nothing, the crew never declared a fuel emergency.  The first officer had passively disclosed their concern, but as they proceeded through subsequent holding patterns, they were handed off to different air traffic controllers who were unaware of the impending crisis.  Passengers knew they were descending but saw no airport in sight.  No warning ever came from the cockpit.  The engines flamed out and went silent one by one.  The emergency lights flashed on over the exits shortly before the crash.  A survivor reported that the pressure from the impact left people packed like sardines.  Early the next morning, dozens of white body bags were scattered across the McEnroes’ lawn.

          On the surface, it’s easy to assume bad weather explained the crash, but the official cause was fuel exhaustion, which is why there was no fire.  The real cause, though, was a failure of communication.  Flight 52 drew attention from social scientists who observed that these disasters don’t usually occur because of a single catastrophic event, but rather through the accumulation of smaller factors that lead to stress and poor communication.  Dozens of articles examined the fact pattern leading up to the crash, and as usual, early reports were of dubious accuracy.  The Avianca cockpit voice recorder transcript offered clear evidence of a communication breakdown.  Some researchers suggested a controversial idea: that the cause of the crash was really a cultural problem.  Malcolm Gladwell devoted part of a chapter in his book Outliers to this idea.  He pointed out that in such cases, the weather may not be terrible, but it is bad enough to add some stress.  The flight may be a little behind schedule, creating pressure to hurry, which frequently leads to decision-making errors.  People may say things in ways others don’t fully understand, and the crew may not have flown together before and may not know how to work together during a challenge.

          A defining feature of Avianca’s culture was that it was very hierarchical.  The organization was high in what social scientists call “power distance,” a measure of how power is distributed in an organization and how readily that distribution is accepted.  High power-distance cultures have centralized command structures and tend to over-rely on fixed plans, guidelines, and decision-making aids such as checklists.  The result is that people become so locked into a program that it goes unquestioned, and when a situation demands adaptation, procedures can’t be changed fast enough to deal with the problem.  Flight 52 could have requested to divert to Boston long before running out of fuel.  A chilling symptom of such a culture can be heard on the cockpit voice recorder from the doomed flight.  People are afraid to speak up, so the crew doesn’t directly tell air traffic control they are running out of fuel until the end of the third holding pattern, after it’s too late to make other plans.  In such a culture, the hierarchy is trusted to such a degree that anything challenging the centralized command structure is considered a threat.  Organizational members never challenge up the chain of command, and the crew of flight 52 never speaks up to air traffic controllers as they are ordered into one holding pattern after another.

A different philosophy of power

          Almost 19 years to the day after the Avianca disaster, and with less than ten miles separating a section of their flight paths, another crew would capture the attention not just of social scientists but of the entire world.  On January 15, 2009, US Airways flight 1549, piloted by Chesley Sullenberger and Jeffrey Skiles, struck a large flock of Canada geese at 3:27 pm, 2,818 feet above the Bronx, and lost power in both engines, leading to the most successful controlled ditching of an aircraft in aviation history.

          The passenger in seat 26C was Brent Cimino, who I had the chance to talk to about what it was like to be aboard the famous flight.  “I was towards the back of the plane,” he said, “which was a pretty harrowing experience itself.”  Passengers heard loud bangs when the birds struck.  Some saw flames coming from the engines, and then they smelled smoke and the odor of jet fuel.  Without the sound of the engines, the cabin was eerily silent, broken only by the sound of a few babies crying.  Cimino described the awful feeling of realizing such young lives might be about to be lost.  “I was pretty shell-shocked…I didn’t have much thought besides, this was probably going to be it,” he said.

          Cimino described the moment they actually hit the water as similar to being hit from behind by an SUV while sitting at a stoplight.  It was “parallel to getting rear-ended,” he said.  He also offered some thoughts on what might have been going through Sullenberger’s head.  “In that moment, even though I think, even in his head might have known that there’d be some scrutiny for putting that plane in the Hudson, his skillset and his decision to do what was right is what saved us all.”  I wondered what he observed about the coordination of the crew.  “We didn’t hear much from him (Sullenberger) of course,” Cimino said, “other than the famous line ‘brace for impact,’ but his crew, even post-event, was such a tight-knit unit that you could tell they followed his lead and had the same resolve.”

You can’t be transparent about problems you don’t know you have

          For a team to make good decisions in high-stakes situations, it has to break down any barriers in its lines of communication so people can adapt under pressure.  Sullenberger starts by making sure crew members know each other by name.  Simply knowing each other’s names means people are more likely to bring up potential problems and talk about small issues that might not seem particularly important at the time but allow for insightful associations when they are needed most.  Johns Hopkins adopted a checklist that explicitly spells out that everyone should introduce themselves by name and role before they begin working together.  Sullenberger has pointed to the importance of setting a tone of psychological safety, where there are no stupid questions and there is a responsibility to speak up if you notice something others haven’t.  Some leaders want to be followed without question, Sullenberger says, but a good culture is one where the important thing is not who’s right, but what’s right.

          Research shows many managers react to a crisis by consolidating decision-making authority.  This is usually the wrong answer, though, and it can result in an even greater disaster.  FEMA’s handling of emergency operations in the wake of Hurricane Katrina is a good example.  The traditional system of command and control became immediately overwhelmed after the storm.  In his book The Checklist Manifesto, Atul Gawande remarked on how those in authority at FEMA insisted on following standard procedures and protocol, requiring decisions to be pushed up the traditional chain of command.  There were far too many decisions to be made, though, and upper levels of management didn’t have enough information on exactly what type of help was necessary or where it was needed.  Under these circumstances, communication becomes too slow, and the system is too cumbersome.  Leaders are so far removed from the contextual details of the problem that they can’t possibly make good decisions.  During the Katrina crisis, trucks carrying desperately needed supplies were turned away from checkpoints because they weren’t part of the plan, and people died as bureaucrats argued about who had the power to make decisions.

          People also need to understand the type of problem they face so they can match their approach to it.  Much of this comes down to the predictability of the outcome.  Complicated problems, those that have a lot of parts but follow a highly predictable course, can in many cases be handled best using traditional tools like checklists built from experience.  Complex problems, however, those with many moving parts that interact in ways that are difficult to predict, require a different approach.  The fields of aviation and medicine have taught us that routine problems can be handled efficiently using standardized decision aids, but solving complex problems requires an ability to adapt using localized knowledge.  What management at FEMA failed to understand, Gawande pointed out, is that when you face complex problems, decision-making power should be pushed as far away from the center as you can, and placed squarely in the hands of the people on the front lines.  People need some space to use their expertise to read a situation.  They also need the ability to share information fluidly back and forth with other team members.
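
For readers who like to see an idea in concrete terms, here is a minimal sketch in Python of the matching logic described above.  It is purely illustrative; the field names and recommendations are my own shorthand, not a tool drawn from the aviation, medical, or FEMA examples.

```python
from dataclasses import dataclass


@dataclass
class Problem:
    description: str
    predictable: bool  # does the problem follow a well-understood, repeatable course?


def choose_approach(problem: Problem) -> str:
    """Illustrative only: match the decision-making approach to the type of problem."""
    if problem.predictable:
        # Complicated: many parts, but the course is known.
        # Standardized aids built from experience (checklists, protocols) work well.
        return "Run the checklist; follow standard procedure."
    # Complex: the parts interact in ways that are hard to predict.
    # Push authority to the front lines and let local knowledge adapt.
    return "Delegate to the people closest to the situation and keep information flowing."


print(choose_approach(Problem("routine pre-departure configuration", predictable=True)))
print(choose_approach(Problem("post-hurricane relief logistics", predictable=False)))
```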

The social immune system

          The scientific literature has increasingly relied on metaphors to examine group decision making.  It has become common to refer to organizations as brain networks or to speak of organizational “DNA.”  My son Caleb interpreted just about everything in relation to natural systems, and he said a good way to understand group decision making was to think of groups as an adaptive immune system, or “social immune system.”  A sort of superorganism.  Caleb was on the autism spectrum and had an extraordinary gift for patterns.  He enjoyed contradictions, looking for similarities between things that seemed different and differences between things that seemed similar.  This made him open to being wrong, which helped make him an extremely fast learner.  The social immune system view is valuable, he said, because it’s not just a metaphor.  If you look at the patterns of interaction in nature’s complex, adaptive systems, whether ant colonies or the human immune system, you can better understand how groups share information and make decisions.

          Researchers at MIT’s Human Dynamics Laboratory examined team decision-making related to this concept by recruiting teams of professionals for a large study.  The research, led by Sandy Pentland, outfitted participants with electronic sensors that collected data on their social behaviors: their body language, who they communicated with, and how much.  The data clearly showed that the best predictor of team success was patterns of communication.  Patterns of communication were a better indicator of a team’s success than individual skills, intelligence, and personality combined.  The most successful teams, said the researchers, had members who talked, listened, and interrupted a lot, and in roughly equal amounts.  No one, including managers, monopolized conversations, and barriers to communication along the front lines were eliminated.
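
As a rough illustration of what “roughly equal amounts” can look like in data, here is a small Python sketch of one way to score how evenly speaking turns are spread across a team.  It is my own toy example, not the sensor-badge analysis the MIT team actually used; a score near 1 means participation is balanced, while a score near 0 means one voice dominates.

```python
from collections import Counter
from math import log


def participation_balance(turns: list[str]) -> float:
    """Normalized Shannon entropy of speaking turns.

    Returns 0.0 when a single person does all the talking and 1.0 when
    turns are spread perfectly evenly.  Illustrative only.
    """
    counts = Counter(turns)
    if len(counts) < 2:
        return 0.0
    total = sum(counts.values())
    entropy = -sum((c / total) * log(c / total) for c in counts.values())
    return entropy / log(len(counts))


# Hypothetical meeting transcripts: who held the floor on each turn.
balanced_team = ["ana", "raj", "mei", "ana", "mei", "raj", "ana", "raj", "mei"]
dominated_team = ["boss", "boss", "boss", "boss", "ana", "boss", "boss", "boss", "mei"]

print(round(participation_balance(balanced_team), 2))   # 1.0  -- everyone contributes equally
print(round(participation_balance(dominated_team), 2))  # ~0.6 -- one voice monopolizes the room
```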

Ideas that inoculate

          The social immune system theory turns out to be a fantastic way to understand how cultures can support learning.  Let’s start with threat recognition.  Teams need to distinguish members from non-members and to assess whether new information represents a threat or a benefit.  Inaccurate threat screening puts the system at risk because it redirects finite resources to the wrong places, increasing vulnerability to actual threats.  Like an inoculation, members must be able to spread their knowledge of new information efficiently to others in the system.  This allows the system to develop a memory and helps it recognize that information accurately and efficiently in the future.

          If people are afraid to explore beyond their standardized decision tools, they abandon their unique recognition skills, which destroys the system’s ability to learn.  Leaders are sterilizing the team of individual thought, and fear of sharing means critical information will be quarantined.  Teams with this problem have acquired what is akin to an immunodeficiency disorder.  Ultimately the members become emotionally disengaged, further suppressing (in the literal sense) the neural networks of the individuals required for localized threat interpretation.  Finally, when cultures are averse to different ideas, threat recognition becomes confused because any idea that doesn’t match organizational traditions is interpreted as a threat, even if it comes from within the group.  This is an autoimmune disease.

          Cultures inoculate their members with varying levels of fear of making errors and spur different levels of discomfort with novel ideas.  In a 2011 study published in Psychological Science, a research team led by Jennifer Mueller found that people reject creative ideas even when they say creativity is their goal.  Using implicit association tests, the researchers found that participants associated creativity with uncertainty, and when people were motivated to reduce uncertainty, they gave lower scores to creative ideas, calling them impractical or unreliable.  The authors say the results point to the hidden barriers people face when they attempt to introduce new ideas in certain cultures.

          One way organizations can support good decision-making is to routinely inoculate themselves with contradictory views that challenge their blindly held assumptions.  Like an actual inoculation, think of it as a small, non-lethal dose of an antigen (in this case, threatening information) that stimulates the production of antibodies (thinking approaches that can combat the threat).  Since mistakes are a powerful way for people to learn, this approach allows us to prototype unexpected outcomes before they occur in the real world.  Exposure to the unexpected can spur creative solutions to protracted problems.  Caleb enjoyed absurdist thought experiments for their ability to jolt someone out of habituated thinking.  This approach was investigated in a 2010 study led by Travis Proulx and published in the Personality and Social Psychology Bulletin.  Participants were shown an absurd Monty Python parody.  The people who had been led to expect a traditional story saw the content as a threat to their model of thinking, and those individuals were more likely to score higher on an assessment called the Personal Need for Structure (PNS), which measures a person’s need for certainty.

          People who score higher on the PNS are more likely to force information to fit their pre-existing beliefs and fall prey to confirmation bias.  This helps explain why the participants who expected a traditional story compensated by forcing the details of the absurd story into a familiar framework so they could make more sense of it.  I recently conducted a scientific study with professional auditors and found that their average PNS scores tended to be higher than scores observed in the general population.  In the study, the participants who scored highest on the PNS were much more likely to fall prey to confirmation bias when solving a routine audit problem than those who scored lowest.

The learning culture

          Another way to improve decision-making culture is to focus on learning instead of liability.  Sullenberger has said that when the NTSB transformed its formal lessons-learned process from a blame-based system to a learning-based system, the benefits were invaluable.  Consider for a moment that the smaller danger signs of an accident usually show up before the accident actually occurs.  Many times, the only thing standing between a near miss and a disaster is luck.  A culture of psychological safety supports the reporting of small errors that could culminate in a disaster, so contributing factors and probable causes can be identified.  This gives us better foresight to detect emerging issues and avoid accidents.

          To use the audit profession as an example, it is well known that regulators are more likely to punish a bad outcome, even one that resulted from the best of intentions, than a good outcome that happened to result from a poor decision-making process.  Back to the social immune system idea: a question that should be examined is whether, given the organizational culture, a type 1 error (false positive) or a type 2 error (false negative) poses the greater perceived threat to the decision maker.  If an auditor raises suspicions of fraud without knowing in advance whether it can be proven, they could lose a client and ultimately their job.  However, if they follow traditional practices and that model fails to detect signs of a fraud that occurs, the auditor is likely to escape criticism.  This type of industry landscape means it’s critical for organizational members to speak up to leaders, and for leaders to have the courage to speak up to policymakers.
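
To put the incentive asymmetry in concrete terms, here is a simple, hypothetical expected-cost comparison in Python.  The probability and penalty numbers are invented for illustration; the point is only that when a false positive is personally far more costly than a false negative, silence starts to look like the rational choice.

```python
def expected_personal_cost(p_fraud: float, cost_false_positive: float, cost_false_negative: float) -> dict:
    """Compare the decision maker's expected personal cost of speaking up vs. staying silent.

    Illustrative only.  If the probability of fraud is p, then raising a concern
    risks a false positive with probability (1 - p), while staying silent risks
    a false negative with probability p.
    """
    return {
        "raise_concern": (1 - p_fraud) * cost_false_positive,
        "stay_silent": p_fraud * cost_false_negative,
    }


# Hypothetical culture: a wrong accusation costs the auditor dearly (lost client, lost job),
# while a missed fraud is largely absorbed by the system.
print(expected_personal_cost(p_fraud=0.25, cost_false_positive=100, cost_false_negative=20))
# {'raise_concern': 75.0, 'stay_silent': 5.0}  -> to the individual, silence looks far "cheaper"
```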

          A few years ago, I was fortunate enough to spend over a year on the road with Caleb, traveling around the country to speak at events and to perform research inspired by what I had learned from him.  The concentrated time we had together helped me understand him better, including the debilitating symptoms he battled.  We talked about a lot of things: his struggle to connect with others, his sense of patterns, and what made him laugh.  We talked about Avianca flight 52, US Airways flight 1549, and a lot about decision making and the social immune system.  Our discussion eventually came around to research being conducted at NASA.  Caleb was fascinated with experiments then being planned using Scott and Mark Kelly, identical twins who were also both astronauts.  One of the things researchers wanted to know was whether there would be a difference in how their immune systems functioned if one twin spent a year on the space station while his brother stayed on Earth.

          Caleb dealt with a variety of sensory issues, not uncommon for people on the spectrum, so I wasn’t surprised that he was preoccupied with wondering what it would feel like to be launched in the space shuttle.  “I imagine,” he said, “that it would feel like the most intense pushing feeling you could ever have.”  He wanted me to ask an astronaut about it if I ever had the chance, an opportunity I doubted I would ever have.  Despite Caleb’s deep understanding of social systems, he had difficulty engaging in them, and tragically we lost him to suicide shortly after we got home from one of our longest trips.

Toby Groves and Mark Kelly

          In September 2016, I was speaking at an event in Phoenix, AZ.  I walked backstage and immediately came face to face with a man wearing a blue flight jacket covered with NASA emblems.  You can imagine my surprise when I realized it was shuttle commander Mark Kelly.  We had a little time to talk about culture, decision making, and Caleb.  I wondered what Kelly’s opinions were on centralized decision making.  “We love our checklists,” Kelly said, but he also explained a popular saying they used at NASA: “none of us is dumber than all of us.”  The phrase is a reference to the social disease of groupthink, which can infect even the best teams.  If you want to stop groupthink before it takes hold, watch for the emergence of biased sampling, which means giving attention to evidence that fits the traditional model while ignoring evidence that contradicts it.  Kelly told the audience that after his wife, Congresswoman Gabby Giffords, had been shot in the head in an assassination attempt, he wanted to talk with a newer member of the medical team, such as a resident, before making decisions about the course of her care.  He wanted advice not biased by standardized thinking.

          The final research related to the twins’ immune function has yet to be published, but Susan Bailey of Colorado State University, who led the study, recently said that the year in space appears to have resulted in permanent changes to some of Scott Kelly’s gene expression.  Before I left, I needed to know the answer to Caleb’s question about what liftoff felt like.  “It doesn’t feel like you’re being pushed,” Kelly said, “it feels like you’re being pulled.  Like God himself may have reached down and plucked you off of the earth.”

 

© Toby Groves 2018 All Rights Reserved


The information in this article was partially adapted from a talk Toby gives with the same title.  The photograph with Mark Kelly was taken in March 2018, at a different event than that mentioned in the article.

For further reading, some of the studies referenced in this article include:
-Mueller, J. S., Melwani, S., & Goncalo, J. A. (2011). The bias against creativity: Why people desire but reject creative ideas. Retrieved from Cornell University, ILR School site: http://digitalcommons.ilr.cornell.edu/articles/450/
-Pentland, A. (2012). The new science of building great teams. Harvard Business Review, 90(4), 60-69.
-Groves, T. (2018). The effects of dynamic metacognitive prompts on expert auditor reasoning efficacy. Dissertation available through ProQuest.