Last February, Cadbury chocolate fell victim to a hoax. An image claiming that a worker had contaminated Cadbury products with HIV-positive blood went viral in a Malaysian WhatsApp group called “Viral Media Johor”, and later in a Nigerian group.
Obviously, the post was fake news. The man in the image is Aminu Ogwuche, who was arrested on suspicion of involvement in the bombing of a Nigerian bus station in 2014. He has never worked for Mondelez, the company that makes Cadbury chocolate, and its products are not infected with HIV. Indeed, it’s not even possible to contract HIV from eating food contaminated with HIV-positive blood.
But the problem that this story and others like it pose is real. Rumours, hoaxes and misinformation find a fertile breeding ground on social media. And as Google, Facebook, Twitter and other social media platforms increasingly crack down on misinformation, the purveyors of false stories are seeking refuge in direct messaging apps such as WhatsApp.
In developed countries, WhatsApp is primarily used as a personal messaging app. But in developing countries, many people rely on it as a social network. There, it’s not uncommon to join groups with hundreds of members. People follow groups dedicated to topics ranging from sports and entertainment to media and politics, often finding them through websites such as the Brazilian Grupos de Zap. Despite WhatsApp’s limit of 256 members per group, thousands of groups dedicated to a political candidate, party or social movement can exist at any given time.
WhatsApp’s misinformation problem
The problem is that WhatsApp is particularly vulnerable to misinformation. Because its messages are end-to-end encrypted so that only the recipients can read them, the app provides a safe haven from snooping individuals and governments. This, combined with a mistrust of government, often prompts people to use WhatsApp to exchange private information that they feel hasn’t been “contaminated” by pro-government or corporate bias. But because the encryption prevents WhatsApp from reading messages, it is difficult, if not impossible, for the company to fact-check or delete misleading messages and links.
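To illustrate the architecture behind this point, here is a minimal, hypothetical sketch in Python. It uses a simple symmetric cipher rather than WhatsApp’s actual implementation (which builds on the Signal protocol), but it shows why an end-to-end encrypted service only ever handles ciphertext it cannot read:

```python
# A simplified illustration, not WhatsApp's real protocol: the point is only
# that a relay server which never holds the key cannot inspect message content.
from cryptography.fernet import Fernet

# In end-to-end encryption, only the sender and recipient hold the key;
# the company running the servers never does.
shared_key = Fernet.generate_key()
sender = Fernet(shared_key)

# What the platform's servers actually see and forward: opaque ciphertext.
ciphertext = sender.encrypt(b"Forwarded: chocolate bars contaminated with HIV!")
print("Server sees:", ciphertext[:32], b"...")

# Only the recipient, holding the same key, can recover the text --
# so any fact-checking would have to happen on the user's own device.
recipient = Fernet(shared_key)
print("Recipient reads:", recipient.decrypt(ciphertext).decode())
```

Because the server in this sketch never holds the key, any attempt at moderation would have to happen on users’ devices, not in the middle of the network.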
WhatsApp itself isn’t at the root of misinformation. Political polarisation, ethnic tensions, the rise of instant communications and a growing mistrust of politicians all contribute to the current environment in which fake news has flourished.
But because misinformation on WhatsApp is so difficult to debunk, stories like the Cadbury rumour and other health-related hoaxes emerge again and again. False rumours about vaccines, for example, can cause dangerous dips in vaccination rates.
Fake stories about politics can also quickly spread from group to group. During the recent Brazilian elections, business people connected to right-wing populist candidate Jair Bolsonaro were accused of creating thousands of WhatsApp groups supporting him and using them to spread false content about his opponents. Sometimes, WhatsApp rumours have even led to murders, most recently in Mexico and India.
What can be done?
WhatsApp maintains that it neither can nor wants to access any of the messages sent on the platform. So content moderation of the kind Facebook and Twitter carry out isn’t an option. Instead, it has started to ban accounts that show suspicious behaviour or appear to be software bots. It has also added a label showing when a message has been forwarded from another account, and has limited the number of chats to which a message can be forwarded at once.
But WhatsApp has also commissioned us and several other research groups to investigate the problem of misinformation on the app and look for alternative ways to address it. Our prior research shows that a game-based inoculation approach can help people develop resistance to online deception.
In contrast to existing technology-based solutions, we have had some success with a psychological intervention in the form of an online game, Bad News. The idea is that by playing the game, which we developed in collaboration with the Dutch anti-disinformation platform DROG, people learn about the various techniques of misinformation and how they are typically deployed. It’s based on an idea from social psychology called “inoculation theory”, which holds that pre-emptively warning people and exposing them to a weakened dose of misinformation encourages them to cultivate mental defences against it, leaving them better prepared when they encounter the real thing.
We now plan to develop a new adaptation of our game that can educate players about the complex spread of misinformation on WhatsApp and its potential social consequences. This free-to-play online game will be used as a novel digital media literacy tool in India and other countries. To do so, we are collaborating with the Digital Empowerment Foundation, a large media literacy organisation in India that will conduct workshops with the new game. Addressing the current spread of misinformation in India is particularly important considering the upcoming elections there.
Although our game is clearly not the only way to counter the spread of fake news on WhatsApp, we hope and expect that new approaches to improving media literacy will help people around the world become less susceptible to misinformation.