Published on Aug 02, 2022

Truth or consequences

The challenges of collective sensemaking during an ongoing “infodemic”.

By Michael Grass, Center for an Informed Public

When the International City/County Management Association (ICMA) convened in Seattle in 2015, I covered the organization’s annual conference for Route Fifty, a digital publication focused on state, county, and municipal government across the country. During one ICMA conference session, the communications director at the City of Glendale, Calif., discussed how that city navigated “e-hostility” by developing a special rumor-debunking municipal website and gave pointers on when officials should and shouldn’t respond to a social media troll.

Looking back, it’s all sensible advice for any local government. But our information environments are far more complicated today. The “e-hostility” of 2015 has supercharged into an “infodemic” that has inflamed tensions; undermined trust in government, media, and institutions; and led to confusion, uncertainty, and endless distractions.

I now think about these communication challenges in the context of mis- and disinformation, our focus of study at the University of Washington’s Center for an Informed Public (CIP), a multidisciplinary research center where I help connect academic research, resources, and programming with the public, policy-makers, and educators.

I’ve come to understand some key lessons and insights that help me make sense of an often disorienting and dystopian subject area that intersects with so many parts of our lives today. I’ve found that the better you understand the dynamics of how mis- and disinformation take root and spread, the more empowered you are to sort through the noise and the clutter, understand the importance of context and intent, and know when and how to respond more effectively.

As CIP director Kate Starbird, a UW Human Centered Design & Engineering associate professor who studies crisis informatics, wrote during the early weeks of the pandemic in 2020: “When information is uncertain and anxiety is high, the natural response for people is to try to ‘resolve’ that uncertainty and anxiety . . . by using communication tools like our phones and now our social media platforms—to ‘make sense’ of the situation. We gather information and try to piece together an understanding, often coming up with, and sharing, our own theories of causes, impacts, and best strategies for responding.”

The challenge, of course, is that this process—what researchers like Starbird refer to as collective sensemaking—can be messy and vulnerable to rumors, false claims, and inaccurate information. And with so many more people connected through smartphones and social media than ever before, it’s far easier to introduce, share, and amplify problematic information—whether we intend to or not. It’s important to remember that many people who share misinformation are doing so with good intentions. That’s why responding to misinformation with an empathic approach, instead of conflict, is usually the best first step.

In the event of a natural disaster or other emergency, like an earthquake, flood, or tornado, that uncertainty and incorrect information are usually resolved relatively quickly. But during a long-term and evolving crisis, like the pandemic, it is far more difficult to resolve uncertainty, especially when those with partisan, ideological, or financial motivations actively prey on our emotions around uncertainty, getting the rest of us to do their dirty work by unwittingly spreading problematic information.

But disinformation doesn’t need to actually fool us to be successful. Those who spread it often rely on the liar’s dividend—they only need to stir up uncertainty and doubt to achieve their goals of distraction and confusion. CIP cofounder Jevin West, who helped launch UW’s popular “Calling BS” data-reasoning course alongside CIP faculty member Carl Bergstrom, regularly points to Brandolini’s law: “The amount of energy needed to refute [BS] is an order of magnitude larger than is needed to produce it.”

It’s clear that we’re overwhelmed by our information spaces, and we confront so much bad and unreliable information online that it’s impossible to push back on it all. Moreover, since mis- and disinformation narratives are often built around a “kernel of truth,” where false claims are connected to something that is factual or plausible, bad information can find a place to latch onto and spread despite our best efforts to keep it at bay. As Maddy Jalbert, a CIP postdoctoral fellow who studies the intersection of misinformation, memory, and cognitive psychology, recently put it: “Information can be very sticky. . . . Once we learn something, we just can’t go back and erase the information.”

That stickiness can be compounded by the illusory truth effect, which Jalbert explains is a “phenomenon that merely repeating information makes it seem more true. This occurs for true information and for false information. If you’ve seen something before, you’re more likely to believe it when you come across it again. Before it was ever studied in a lab, demagogues knew that it worked, that just merely repeating something over and over was an effective tactic to spread that belief.”

Social media platforms can exacerbate these dynamics. The Wall Street Journal’s Facebook Files investigative series looked at a Facebook algorithm change that led political parties in Europe to emphasize anger and conflict. As Facebook researchers wrote: “They have learnt that harsh attacks on their opponents net the highest engagement. They claim that they ‘try not to,’ but ultimately ‘you use what works.’”

The bigger, incredibly challenging quandary we all face, especially those who work in local communities, is this: How do we incentivize calm, reasoned, and respectful civic discourse when our online environments reward conflict and derision? There’s so much work to be done to better understand the forces that are driving us apart.

Michael Grass is the assistant director for communications at the University of Washington’s Center for an Informed Public and a former journalist who previously launched and edited Route Fifty, a digital news publication focused on state, county, and municipal government.

For more information: cip.uw.edu


Coming to terms

Here are a few phrases, concepts, and definitions to help explain an increasingly complex digital realm of disinformation.

Brandolini’s law: Named for Italian programmer Alberto Brandolini, this idea says that the amount of time/money/energy needed to respond to misinformation online is far greater than the amount of time/money/energy it takes to create it.

Illusory truth effect: A psychology concept that helps explain why some misinformation is so hard to shake. When we hear false or misleading information repeatedly, we may come to think of it as the truth.

Liar’s dividend: Someone who knowingly spreads disinformation doesn’t need the recipient to be fooled by it for the disinformation to be successful. Seeding doubt and manufacturing uncertainty around something that’s truthful is good enough.

Misinformation vs. disinformation: Misinformation is false or misleading information that is shared unintentionally; disinformation is false or misleading information that is shared intentionally and designed to misinform, confuse, or distract. Defining what is and isn’t mis- and disinformation can sometimes be tricky, since they’re often built around some “kernel of truth.”
