After promising results in Eastern Europe, Google is launching a new campaign in Germany that aims to make people more resilient to the destructive effects of misinformation on the Internet.
The tech giant plans to release a series of short videos highlighting techniques common to many misleading claims. The videos will appear as ads on Facebook, YouTube or TikTok in Germany. A similar campaign in India is also under development.
This approach is called prebunking, and it aims to teach people how to recognize false statements before they encounter them. This strategy has support among researchers and technology companies.
“Using ads as a vehicle to counter a disinformation technique is pretty novel. And we’re excited about the results,” says Beth Goldberg, head of research and development at Jigsaw, Google’s incubator division that studies new social challenges.
While belief in lies and conspiracy theories is nothing new, the speed and reach of the Internet have given them even more power. Amplified by algorithms, misleading claims can scare people away from vaccination, spread propaganda, fuel distrust of democratic institutions, and incite violence.
This is a challenge with no easy solutions. Journalistic fact checks are effective, but they are time-consuming, not read by everyone, and won’t convince those who already distrust traditional journalism. Content moderation by tech companies is another answer, but it often pushes misinformation elsewhere while prompting cries of censorship and bias.
In contrast, prebunking videos are relatively cheap and easy to produce and can reach millions of people when posted on popular platforms. They also sidestep politics entirely, focusing not on the topics of false claims, which are often cultural lightning rods, but on the techniques that make viral misinformation so contagious.
These techniques include fear-mongering, scapegoating, false comparisons, exaggeration and missing context. Whether the subject is COVID-19, mass shootings, immigration, climate change or elections, misleading claims often rely on one or more of these tricks to exploit emotions and short-circuit critical thinking.
Last fall, Google launched its biggest test of the theory to date, with a video campaign in Poland, the Czech Republic and Slovakia aimed at preempting false claims about Ukrainian refugees. The videos examined the techniques behind those claims, many of which were based on disturbing and unsubstantiated stories about refugees committing crimes or taking jobs away from local residents.
The videos were viewed 38 million times on Facebook, TikTok, YouTube and Twitter – a figure equal to a majority of the combined population of the three countries. The researchers found that, compared with people who had not seen the videos, those who watched them were better able to recognize disinformation techniques and less likely to spread false claims to others.
Google’s new campaign in Germany will focus on photos and videos and how easily they can be passed off as evidence of something false. For example, last week, after the earthquake in Turkey, some social media users shared a video of the massive 2020 explosion in Beirut, claiming it was actually footage of a nuclear explosion that had triggered the earthquake. It was not the first time the 2020 explosion had been used to spread misinformation.
Google found that the effectiveness of its campaign in Eastern Europe varied from country to country. While the effect of the videos was strongest in Poland, in Slovakia they had “little to no discernible effect,” researchers found. One possible explanation: the videos were dubbed into Slovak rather than created specifically for the local audience.
But alongside traditional journalism, content moderation, and other methods of combating disinformation, prebunking can help communities achieve a kind of herd immunity to misinformation, limiting its spread and impact.
“You can think of misinformation as a virus. It spreads. It lingers. It can make people act in certain ways,” said Sander van der Linden, a professor at the University of Cambridge who is considered a leading expert on the theory of prebunking. “Some people develop symptoms, some do not. So: if it spreads and acts like a virus, then maybe we can figure out how to inoculate people.”