Updated: Aug 16, 2019
I am a huge fan of Facebook—I love all the pictures of tasty meals, exotic vacations, and long-lost friends’ children. I have hundreds of Facebook friends who are former students, adults I haven’t seen since we were in third grade together, and superstars I shamelessly fangirl over (hi, Issa Rae!). I look forward to losing myself in the depths of Monday morning status updates, when everyone gets to work, logs in to social media, and rehashes their weekend. I can’t wait to see what you are doing right now, even if I’ve never met you in real life.
But one dark side of social media is how often it is used to wage battles between diametrically opposed points of view. Sometimes I have to sit on my hands to keep from chiming in on a status update full of misinformation, or responding to a comment on something I have posted that comes from a completely opposite perspective. When Moby blasted the “working class” on Instagram for electing Trump, I lost 20 minutes pulling numbers to respond concisely in a comment about how he was wrong about who elected the president, and how his characterization of the “working class” was undercutting his own agenda to advocate for change. My own family members have trolled me with vitriolic responses to videos or links I have posted, and I have sniped back with comments meant to shut them down. I know why we respond to these “attacks” this way, and I also know that this type of dialogue changes nothing. What the research shows us (and what is obvious as we watch these comments play out in a status update) is that it takes more than new information to change a deeply held belief, especially one that is tied up in our identity.
Our social constructs, which have evolved along with us, are as neurologically important to us as shelter. Think about this. It’s why cyberbullying can make someone suicidal. It’s why we have to declare a break from social media with the same formality as quitting drinking or going vegan. In fact, researchers have found that the brain perceives the safety of a social network in many of the same ways as physical safety. Your fight-or-flight sympathetic nervous system response (the one that makes you irrational and unable to think clearly, increases your heart and breathing rates, spikes your adrenaline, and compromises your executive function) fires in the same way when someone calls you out on Facebook, or even posts a status update with which you vehemently disagree, as it would if you were in physical danger. This is a fact. An attack on your status update is perceived by your brain today the way a stalking cheetah may have been perceived several iterations of Homo erectus ago. So when our social standing is attacked, or a deeply held belief is under fire, we go into sympathetic nervous system mode and don’t think rationally. This physical response to a challenge of something we believe is the first barrier to changing our minds. And the more deeply held the belief, the more exaggerated the response. If your identity is wrapped up in that belief, because it’s what the people you identify with or respect believe, you feel that much more attacked, and the physical response is that much more exaggerated.
For people who engage in these social media debates, there is a kind of high. The more likes you get, the more endorphins and oxytocin are released. And then a snarky troll responds from a different perspective, and you get the adrenaline rush: your executive function is impeded and your fingers are flying as you formulate a witty response that will SHUT. THEM. DOWN (in 140 characters or fewer). It’s a full dose of adrenal stimulation, and it is the reason many otherwise normal people debate politics on their social media accounts. The primary, physical effect of debating on social media is your own adrenal stimulation; it is our adrenal system that compels us to call someone a racist or classist or communist or whatever when they post an opposing opinion online. When I basically told Moby he was a classist, rich white kid, I was getting high.
But what about the potential to change people’s minds with new information? What if you know something they obviously don’t? This is where what we know about the psychology and neuroscience of behavior change comes in. Cognitive biases are an evolutionary tool that creates short-cuts in our brains. If every time you heard a noise you had to stop, evaluate it, determine its identity, and then weigh its potential for danger, you’d be dead. Your brain creates short-cuts that let you skip all those steps so you can move on to more challenging problem solving. Cognitive biases are short-cuts for your brain, but they aren’t necessarily beneficial; they sometimes cause us to default to assumptions or actions that aren’t good. There are hundreds of cognitive biases, but some you probably know are stereotyping, which saves you from having to figure someone out; confirmation bias, which is when you look for all the positives in the idea you have already committed to (and which is to blame for all those bad relationships in your adolescence); and post-purchase rationalization, which is when you decide something you’re invested in is ultimately good for you. These may or may not work in your favor, but they happen automatically in the brain so you can focus on other things.
The two cognitive biases that do the most to prevent these social media debates from changing anything are Ingroup Bias and Disconfirmation Bias. Ingroup Bias is simple. Basically, when you identify with a group of people, whether through a subculture, race, religion, or other identity to which you subscribe, you are more likely to believe what they say. In fact, neuroscientists have found that when people are told someone is from their own cultural group, they are more likely to believe that person, presume positive intentions, and dismiss errant behavior than when they are told someone is from a different group. Identity in a group, no matter how real the group actually is, wholly alters the way your brain receives information from its members (peer pressure is a great example of this). The converse is that when you hear something from someone in a group with which you don’t identify, you are less likely to believe what they say, even if it’s more plausible or has better evidence. So when I come out swinging with my MAGA hat, or my Planned Parenthood t-shirt, or my hijab, whoever sees that and perceives themselves as part of my group is automatically positioned to put more faith in what I say, and whoever isn’t automatically questions new information from me. This is why building an ingroup is so important for political movements. If I create a group to which you subscribe, your cognitive bias will keep you from spending too much time analyzing what I tell you. So when someone on Facebook from an ingroup to which you don’t belong gives you new information, your brain tells you to dismiss it. And if someone says something you disagree with, you automatically look for evidence that they are not from your cultural group, rather than engaging with the information itself. I have errantly reposted wrong information from people I trusted, and I have dismissed true information from people I didn’t. This is Ingroup Bias.
It is helpful in that once you have an established group to trust, you don’t have to spend so much time analyzing all the information. It is an evolutionary short-cut. But it is a critical barrier to making us open to information from people we see as fundamentally different from us, be it due to political affiliation, race, religion, musical taste, or style of dress.
The second bias in effect is Disconfirmation Bias. (The Backfire Effect is a popularized version of this: http://theoatmeal.com/comics/believe.) Disconfirmation Bias is a short-cut our brains have evolved to reject new information that challenges deeply held beliefs. When you hold a fundamental belief and someone challenges it with new information, your brain automatically rejects the information and seeks contrary evidence, even if it is less compelling, to reaffirm the belief. Why does this happen? Our brains perceive our beliefs and our identification with social groups as emotional safety. The same fight-or-flight response triggered in our adrenal system when we perceive physical danger is tripped, in a smaller way, when our beliefs are challenged. Our brain thinks this is unsafe. So affirmation of a choice, or critical feedback from a trusted third party that informs a revision, helps shore up our decision and makes us feel, quite literally, safer. In this way, new information, even from a trusted source, can be hard for us to accept. We have to push through a feeling of being unsafe, and the physical triggers that come with it, in order to consider new information that is challenging. Together, Ingroup Bias and Disconfirmation Bias cause us to discount anything contrary, or anything from someone with whom we do not share a group identity. As a result, few disagreements on social media result in fundamental change. Moby was unchanged by the stats I spent a solid afternoon sourcing into a well-thought-out comment on his Instagram post. If he even bothered to read it, he would assign me to a group to which he doesn’t subscribe in order to dismiss my data, or he would seek out stats of his own to affirm his belief. But at least I got that adrenaline high from my sympathetic nervous system response, right?
Knowing our brains work this way means, first, that a Facebook debate will not change someone’s deeply held beliefs. So we have to know that when we engage in that kind of back and forth, we are just getting high. We engage in those debates to reaffirm our position within our ingroup, to be affirmed by its likes, and to get the adrenaline rush that comes from trolling people outside it. A change in perspective on a political candidate, a stance on abortion, or a perception of immigration policy requires us to feel that our identity is aligned with the new information. This means the information has to come from a trusted person with whom I have a positive relationship, I have to be in a positive emotional state (so my parasympathetic nervous system is firing and enabling me to consider new information), and I have to have some challenge or need that this new information meets. Without those conditions, new information, no matter how compelling, is dismissed. So when I am sitting on Facebook watching a gazillion people I haven’t seen since elementary school pile on my status update, it further reinforces my contrary opinion, fires up my adrenal system, and further entrenches the divide. And when I want my wayward cousin to get on board with a belief he doesn’t share, I have to meet him for a drink, and talk, and be open to the possibilities. Change doesn’t happen when we call people commies, racists, unpatriotic, privileged, or dumb. It happens when we meet in the middle, befriend and love people different from us, and practice disagreeing in safe, positive spaces. In an era proving so divisive, the only way to actually make change is to step across the lines of disagreement and open our hearts to people who have fundamentally different beliefs. That ish is hard. But it is the way our brains are wired.