Let’s Break It Down | The Cost of Spreading Misinformation
When I was younger, I had a classmate who was one of those people who loved to share what I call “gotcha” information — little facts that sound too good to be true, but that nobody bothers to dispute. One day, they shared one such fact: “Did you know that human blood is blue? That’s right! It only turns red after oxygen hits it, but really, blood is blue.” All of us kids stood stunned, looked down at our arms to see bluish-tinged veins, and concluded that they were right.
Here’s the problem: human blood is never blue; it’s always red. And yet, my adult self can’t quite shake the idea that my classmate from days past was right. That, somehow, all the other information I encounter stating the contrary is wrong. My experience is a great example of a phenomenon called the continued influence effect, which describes just how hard it is to correct false beliefs. Basically, once you tell someone incorrect information, even if it gets corrected later, the incorrect information lingers.
This is why I was especially disturbed to see news that two social media sites will soon have an even broader audience for disseminating misinformation. After months of being blocked by Google, Truth Social, Trump’s alternative to Twitter, is now available for download in the Google Play Store. And just a few days ago, Ye (aka Kanye West) announced plans to purchase Parler, a social media site popular with Trump supporters. Both sites have a reputation for spreading vaccine misinformation, conspiracy theories, and general bigotry. This would be troubling at any time, but is especially so with midterm elections on the horizon in the United States.
When I work with companies, I encourage leaders to embrace workplace conversations about what might otherwise be considered “taboo” topics, like racism, religion, politics, and the like. However, enabling these conversations requires a new set of skills, and I often get the question: What do I do when I find myself in a conversation where someone is repeating misinformation? Let’s consider what the research says about how to proactively, and reactively, navigate this situation.
Proactive strategy: Just share the facts.
Combat the continued influence effect by keeping people from interacting with the misinformation to begin with. This is especially important when introducing information about new topics. I like to think of our brains as sponges. When we learn new information, our brains soak it up like water. If it turns out that the information is incorrect, we have to squeeze some of that “bad water” out to make space for the good stuff. That’s pretty hard to do; it’s much easier if we just get the correct information to begin with.
This insight would have been helpful for my elementary-aged self. As children are learning about the body, teachers could include statements like, “What color is blood? It’s red. It’s red as it flows through the human body, and it remains red after it comes out of the body and interacts with oxygen. The hue might change slightly from a bright red to darker red, but blood is always red.”
Most importantly, avoid speculation or opinion while sharing these facts. Saying, “blood is red, but some people think it looks blue” introduces misinformation. If you’re not sure of the facts on a new topic, consider holding off on sharing until you’ve done more research from a variety of reputable sources.
Reactive strategy: Go beyond retracting the statement.
We won’t always have the luxury of being the first to share information about a new topic. In that case, we may play a more reactive role of combating misinformation. We have to go beyond saying “that’s wrong” or simply stating the correct fact. Research refers to those types of statements as retractions — taking back the misinformation. Unfortunately, retractions are not enough to get rid of false beliefs. We have to go further to refute the misinformation.
Comprehensive refutation includes three components: introduce the misconception, state that it is incorrect, and then provide the correct information. The article I linked when talking about the blood story is a great example of what this looks like in practice. Don’t have a ton of time to write an 800-word essay? I’ve got you covered. Fill in the blanks here: “A lot of people incorrectly think that ____. However, that belief is wrong. The truth is, ______.” Unfortunately, refutation won’t completely eliminate the misinformation, but it’s the more effective option.
Bonus strategy: Take a break.
I am a big advocate of engaging in difficult conversations. In fact, allyship requires people from dominant and/or privileged identities and backgrounds to lean into such discussions. But after you’ve been talking with someone for quite a while, it may begin to feel like you’re not getting anywhere. You’ve run out of facts to include in your “The truth is, ____” sentence, and you’re both getting frustrated. When that happens, use this phrase: “It seems like we’re too far apart in our views to have a productive conversation about this topic right now.” If you’d like, leave the door open for more discussion in the future, or share another resource they might find interesting.
It’s especially important to have this proverbial escape hatch during workplace conversations. Inclusive cultures encourage open discussion about thorny topics. But in practice, people may find themselves in a conversation they feel ill-equipped to continue. In those cases, inclusive cultures must also be ones that teach people how to gracefully exit (and provide other resources for continued education).
These days, we are all more likely to find ourselves in conversations with people who believe things that are just wrong. Part of advancing inclusion is figuring out how to engage with misinformation in a way that will be most impactful. As we head into what is already a hotly contested election season, hopefully these tips will help you feel more empowered for those conversations.
October 20, 2022