It was a weekend morning in March 2020.
The census campaign director at The Leadership Conference on Civil and Human Rights, with whom Spitfire was working to encourage participation in the 2020 Census, pinged us with a text message. The COVID-19 pandemic was quickly causing chaos across the United States, and there was news of a pending economic relief package from Congress. The text confirmed we had an escalating misinformation situation on our hands: Tyrese Gibson, of "Fast and Furious" fame, had just posted false information to his more than 12 million Instagram followers claiming that people had to fill out the census to receive a pandemic relief stimulus check. In our work with the Census Counts coalition, we had seen this false rumor a few times in our monitoring of census-related social media conversations, but until now, we had all agreed not to respond. We didn't want to plant the idea that the Census Bureau shares an individual's information with other government agencies. Not only is the bureau legally prohibited from doing so, but we also knew trust in the government was already very low, especially among the marginalized communities historically undercounted in the census, the very people we were motivating to be counted. Left unaddressed, however, this rumor risked suppressing census participation.
But we had prepared for a scenario like this. When we first saw this false rumor surface, we drafted messaging in case it ever did warrant a response. And the campaign had a rapid response infrastructure in place. So, we alerted our partners working on the census to report the post to Instagram and Facebook as a violation of their census interference policy. And we reached out to the publicist of a prominent Black actress who was already engaged in promoting the census among our priority audiences to see if she would share a debunking message with her followers on social media. As it turned out, Instagram took down the harmful post after about an hour, so our partners didn't have to activate their networks, and we all went back to our weekends.
That example demonstrates a number of the lessons you will learn from Just Truth: A communicators' guide to combating disinformation in a hyperconnected world, including:
- Why did we decide this was a time to activate the network when we hadn’t in the past?
- How did we know what kind of messaging to deploy?
- What steps had we taken in advance so everyone knew how to report the misinformation to the social media platforms?
So what are you waiting for? Let’s get started. We recommend you start with the Misinformation vs Disinformation module.
Countering disinformation can be difficult, especially because attempts to refute false information often result in repeating the falsehood, which can ironically give the disinformation more life.
One effective practice to combat disinformation is to get ahead of it with inoculation messaging. Inoculation messaging is a strategy that primes audiences by delivering factual information before they encounter the falsehood, while simultaneously discrediting the future disinformation by exposing its motives and techniques.
Inoculation messages include a few essential components: priming the audience to recognize the falsehood by warning them about the impending disinformation, and discrediting the argument by explaining the techniques and/or motivations behind it. Inoculation messaging works like a vaccine against disinformation: by exposing us to a weakened form of the falsehood, it helps us build resistance to future exposure.
This ad run by Raphael Warnock’s 2020 campaign in Georgia for the U.S. Senate is an example of inoculation messaging at its best.
While running for the Senate in Georgia in 2020, Warnock was attacked repeatedly by ads from his opponent, Kelly Loeffler, that made many false claims about him. Rather than repeat the falsehoods, Warnock simply referenced the disinformation, saying that Loeffler's campaign was "taking things out of context from over 25 years ago." As you can see in the video, the ad also highlighted headlines indicating that the claims against him were false without referencing the lies themselves. Using humor also made the ad more memorable for audiences.
The Warnock campaign ad shows us that it is possible to counter disinformation without repeating it. By being as specific as possible without repeating the false claims themselves, we can discredit those spreading disinformation without giving their lies airtime, while simultaneously inoculating audiences against them.
Resources
To learn more about strategies to combat disinformation, please visit our sections on messaging under the Start Your Comms Strategy module.
Climate scientists agree - nearly unanimously, at 97 percent - that global temperatures are rising and that human activity is playing a role in fueling climate change.
As recently as August 2021, the Intergovernmental Panel on Climate Change (IPCC), the United Nations body for assessing the science related to climate change, found that human activity is changing our climate in rapid and sometimes irreversible ways.
Yet despite the overwhelming consensus on climate science, over the years a select few climate deniers have attempted to weaponize fake experts as a tactic to lower public faith in science, breed uncertainty and sow disinformation.
Fake experts are individuals presented as authorities on a field in which they don't actually have expertise. They may have expertise in a related field, or no expertise at all. But they are put forward by an opposing side to present false information disguised as an equally valid opposing view, often on indisputable topics. Climate change deniers, for example, may position a spokesperson with little to no scientific or climate background, let alone expertise, as an expert to baselessly dispute the scientific evidence that climate change is caused by humans and to falsely claim that some scientists don't believe climate change is real.
Members of the media, however well intentioned, have frequently been used as pawns in this climate denial scheme. When reporters give a platform to fake experts, they risk inadvertently spreading mis- and disinformation by creating a false equivalence between two vastly disparate arguments that are not equally factual.
Consider, for example, an evening news host who turns a climate change conversation into a debate: on one side, a seasoned climate scientist with decades of experience argues that humans are contributing to climate change; on the other, a political commentator with no scientific or climate background denies that climate change exists. Both guests are given equal platforms to make their case, and neither is fact-checked. A viewer watching this conversation unfold might conclude that the two arguments are equally valid, even though 97 percent of climate scientists agree that humans contribute to climate change.
Climate denialism fueled by mis- and disinformation has not only undermined public faith in science but has also measurably hampered our ability to address the impacts of climate change. While mainstream media has markedly improved its climate coverage over the last several years, we have still lost time we cannot get back in the fight against climate change.
This 2014 segment from HBO’s “Last Week Tonight with John Oliver” demonstrates - with a little humor - the impact of fake experts and false equivalency:
Resources
To learn more about fake experts as a disinformation tactic, please visit our module on Recognizing Techniques and Rhetorical Strategies Within the Communications Ecosystem.
To learn more about false equivalency and the role of media in the inadvertent spread of disinformation, please visit our module on The Role of the News Media.
Voter suppression tactics have been used against the Black community since Black people first won the right to vote.
While most people associate voter suppression with strict identification requirements, gerrymandering, physical intimidation tactics and racist laws, mis- and disinformation are also used as a form of digital voter suppression. Political ad targeting, bots and sockpuppet accounts are just some of the ways people sow mis- and disinformation online to suppress votes.
Sockpuppet accounts are fake social media profiles created to deceive other users. Sockpuppets can be real people or bots that pretend to be part of a particular community so they can build trust and later use it to spread disinformation. These tactics are exacerbated by tech companies that fail to adequately moderate and remove false information and fake accounts. A prominent example occurred during the 2016 presidential election cycle, when sockpuppet accounts on social media were used to build trust within Black communities and then leveraged to share disinformation. These accounts, with names suggesting they were run by Black activists, were actually operated by Russian operatives in a practice that Deen Freelon, a professor at the University of North Carolina at Chapel Hill, refers to as "digital blackface."
According to a British Channel 4 News report, Cambridge Analytica, the research firm co-founded by Stephen K. Bannon, who served as chief executive of Donald Trump’s 2016 presidential campaign, profiled 3.5 million Black Americans in 16 swing states and categorized them for “deterrence” to dissuade them from voting for Hillary Clinton—or from voting at all—by targeting them with disinformation campaigns.
Shireen Mitchell, who has been researching media manipulation and disinformation on social media for years, described several of these sockpuppet accounts—many of them run by Russian operatives during the 2016 election campaign—in her report, A Threat to American Democracy: Digital Voter Suppression.
The disinformation campaigns of the 2016 election cycle leveraged issues that are important to the Black community—criminal justice, maternal mortality, reparations and systemic racism—and used sockpuppet accounts to suppress the Black vote. This was done with the knowledge that these issues are key for Black women voters in particular. As Shireen Mitchell writes:
“This is why fake accounts pretending to be Black women matter. Not only in the disinformation campaigns but in every election. There has been a consistent number of fake accounts posing as Black women since 2013. These fake accounts, who pretend to be Black women, seem to be real people with real concerns. They connect with the American Black community online attempting to learn Black vernacular and key issue areas. Once the election ramps up they've gained enough following and trust that's when they begin to share disinformation. The goal is to make sure they have enough of a following before the shift to disseminate the disinformation.”
Digital voter suppression was not limited to the 2016 election. Disinformation was also used to suppress votes in the Georgia runoff election in 2020: an Avaaz analysis of Georgia-related election misinformation on Facebook found that 60 percent of detected mis- and disinformation posts reached thousands of voters without fact-check labels. One of these campaigns included 20 posts falsely claiming that the NAACP had issued a warning in Georgia that white supremacists would be targeting Black men for violent acts.
Resources
To learn more about other disinformation tactics, please visit our modules on Techniques and Rhetorical Strategies and Recognizing Our Role.
A wide variety of disinformation threats surfaced during the lead-up to the decennial Census in 2020.
One example in particular, which surfaced before and during the period when U.S. Census workers were going door to door to conduct routine survey collection, demonstrates how mis- and disinformation can be spread inadvertently, even by well-meaning people.
During 2020, a hoax spread on WhatsApp -- an international messaging app that connects millions of people worldwide -- among a group of Arab Americans. The hoax claimed that robbers were impersonating workers from the U.S. "Home Affairs" department and using the census to rob homes. Of course, the United States does not have a Home Affairs Department. The false story was based on a real event in South Africa in 2017 -- three years before the U.S. Census Bureau began its 2020 count.
Even though the U.S. Census Bureau, Snopes and other news outlets debunked this hoax early on and civil rights groups like the Arab American Institute and Asian Americans Advancing Justice (AAJC) convinced social media platforms to remove the content, the hoax continued to scale -- or spread -- in various forms.
Even though the story was false, it was disseminated as a warning on a physical flyer, translated into Mandarin and shared as a photo in a WeChat group of Chinese immigrants. It served as the basis for a news article in the Korea Times Chicago. It was posted on Nextdoor multiple times by concerned neighbors and shared within local Facebook groups in at least 10 U.S. states. Police departments shared it on Facebook as a pre-emptive warning in case it was real, which in turn prompted already fearful communities to amplify it further.
The hoax was not intentionally or maliciously spread by bad actors seeking to undermine trust in the U.S. Census Bureau's operations. Rather, it scaled far and wide because well-intentioned people wanted to protect their neighbors. Its rapid spread was also bolstered by the broader climate of fear and mistrust in the government at the time: the story validated many communities' fears and worldviews.
Spitfire partnered with the Arab American Institute and other civil rights groups to try to stop the spread of this hoax. We needed to tread carefully and understand that the fear many people felt was not only valid but also deep-seated. We knew that any message would need to be delivered by a trusted messenger, rather than someone representing the U.S. Census Bureau, and would need to address fears by equipping audiences with the tools they needed to feel safe. We took the opportunity not only to combat the hoax but also to inform audiences about when to expect visits from U.S. Census workers, how to verify their identity and who to contact if they were uncertain. We prepared the message and partnered with a trusted messenger from each of the online communities where the hoax had spread to disseminate it to their communities. You can read the balancing message we created below:
"Luckily, this is not a real threat! This is an old message that has been going around since 2017 on different messaging apps and social media, about an incident in South Africa at that time. (There is no Home Affairs Department in the U.S.!)."
"U.S. census workers WILL be making visits to homes between September and January to verify addresses for next year's Census count, which is a common practice to prepare for every Census. You can and should always ask for verification if a census worker knocks on your door. All census workers making visits have a valid ID from the U.S. Department of Commerce. If you are uncertain about their identity, you can call 800-923-8282 to speak with someone from the Census Bureau.
"If you want more information or have any questions about census workers visiting your neighborhood, you can always reach out to the Arab American Institute."
Resources
To learn more about the inadvertent spread of mis- and disinformation, please visit our module on Recognizing Your Role in Amplifying Disinformation.
To learn more about crafting compelling messaging, including balancing messages as demonstrated above, to curb mis- and disinformation, please visit our module on In-Channel Balancing Messaging.