An Innovative Approach to Fighting Misinformation Online

Round Editorial

Fake news spreads fast. According to research shared with Round by Beth Goldberg, Head of Research & Development at Google's Jigsaw, false information usually reaches people six times faster than factual information — and on Facebook, for example, it typically takes fact-checkers more than three days to issue a correction. By that point, most of the impressions have been made. Misinformation presents many challenges, but Goldberg remains optimistic about the role technology can play in combating it. In a recent conversation with Round, she explained how Jigsaw's interventions deter the spread of fake and harmful content online, emphasizing a strategy called "prebunking" that inoculates users against misinformation. Here are three key takeaways.

1. Jigsaw has refined four tools to combat misinformation on social media: authorship feedback, accuracy prompts, the redirect method, and prebunking

From unreliable news sites to viral videos, the internet provides plenty of opportunities for individuals to generate, consume, and share misinformation. Jigsaw uses various tools and techniques to engage with users at critical moments during their online journeys.

Authorship feedback: Before users post a comment online that contains threats, profanity, or misleading information, Jigsaw's AI toxicity detector, Perspective API, offers real-time feedback, allowing users to reconsider before publishing content the AI perceives as offensive.

Accuracy prompts: When users encounter potentially misleading information online, accuracy prompts intervene. These prompts provide bite-sized media literacy tips and ask individuals to consider the accuracy of certain content, reminding people to think twice before engaging with false information.

Redirect method: In 2015, Jigsaw used Google and YouTube ads to redirect users away from extremist ISIS-related content. These ads appeared when users searched for radicalizing content and pointed people to a curated playlist of videos offering counter-narratives to ISIS's claims.

Prebunking: Of these strategies, Goldberg is most excited about prebunking, which uses short, informational videos to teach people to spot and refute misleading arguments online.
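To make the authorship-feedback flow concrete, here is a minimal sketch against Perspective API's public `comments:analyze` endpoint. This is an illustration, not Jigsaw's production code: the 0.8 threshold and the `should_prompt_author` helper are assumptions, and a real integration would POST the request body to the endpoint with an API key from a Google Cloud project.

```python
# Perspective API's public Comment Analyzer endpoint (v1alpha1);
# a real call requires an API key appended as ?key=YOUR_KEY.
ANALYZE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

def build_analyze_request(comment_text: str) -> dict:
    """Build the JSON body for a comments:analyze call requesting a
    TOXICITY score — the attribute Perspective uses to estimate how
    rude or disrespectful a comment is likely to be perceived."""
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
        "doNotStore": True,  # ask the API not to retain the comment
    }

def should_prompt_author(response: dict, threshold: float = 0.8) -> bool:
    """Given a comments:analyze response, decide whether to show the
    'reconsider before posting' nudge. The response shape follows the
    public docs: attributeScores.TOXICITY.summaryScore.value is a
    probability between 0 and 1. The 0.8 cutoff is an assumption."""
    score = response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return score >= threshold

# Example with a mocked API response (no network call is made here):
mock_response = {
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.91, "type": "PROBABILITY"}}
    }
}
print(should_prompt_author(mock_response))  # True: 0.91 exceeds 0.8
```

In a product, a `True` result would trigger the real-time prompt Goldberg describes, giving the author a chance to rewrite before publishing.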

2. Prebunking takes advantage of misinformation's Achilles heel — its predictability

Misinformation can take many forms — deep fakes, cheap fakes, bots, sock puppet accounts — but the content often follows set patterns. Misinformation reuses common rhetorical techniques, like oversimplifying a problem down to two choices, and contains familiar narratives or tropes, like the greedy billionaire or the dangerous migrant. After war broke out in Ukraine, the Jigsaw team wanted to spot those misinformation patterns in Central Europe. They conducted expert interviews in Poland, the Czech Republic, and Slovakia to ask: What misinformation do you anticipate about refugees three months from now? Experts predicted a rise in misleading rhetoric around refugees committing violence and draining public funds, so Jigsaw created prebunking videos to address these narratives. The videos were kept short (under one minute) so they could be shown as ads and scaled across the region. The videos ran on four social media platforms in the three countries, garnering 38 million views and improving the share of viewers who spotted misinformation techniques by as much as 8%.

In one video, three friends are sitting at a bar. A woman says she's going home early because she's worried she'll encounter violent refugees on her walk home, but her friends warn her not to let fake news manipulate her and explain that most Ukrainian refugees are women and children. Goldberg highlighted that Jigsaw designed the video to work for the general public in each country. Instead of emphasizing research or the authority of an institution — which can backfire for people who distrust institutions — the video uses two regular, trustworthy people to share the correct information. Because the video addresses a common misinformation narrative about refugees (rather than a particular piece of fake news), it allowed the Jigsaw team to inoculate viewers against the inevitable surge of misleading narratives that tend to follow humanitarian crises like the one in Ukraine.

3. Prebunking is so promising because it is accessible and non-judgmental, and its emphasis on self-defense appeals to people across the political spectrum

To educate viewers about common misinformation tactics like scapegoating, fear-mongering, and false dichotomies, Jigsaw created a series of playful, humorous animated videos. The scapegoating video, for instance, uses a clip from South Park's "Blame Canada" episode to explain to viewers that "if someone tries to make a complicated problem look simple by placing blame on a single group, they're likely trying to manipulate you." These videos teach users new concepts and don't require prior knowledge, making them accessible to people of any education level. And unlike some other approaches to countering misinformation, prebunking's self-defense framing appeals to people across the political spectrum. In fact, in representative lab tests, Goldberg and academic partners found that these prebunking messages worked equally well for conservatives and liberals.

