“In a few months, WhatsApp will release a new version that will show you ads based on your chats,” said another. “Don’t accept the new policy!”
Thousands of such messages went viral on WhatsApp, the Facebook-owned instant messaging app, in the days that followed. Spurred on by celebrities such as Tesla CEO Elon Musk and whistleblower Edward Snowden, millions of people rushed to download WhatsApp alternatives such as Signal and Telegram.
There was only one problem: It was clear from the 4,000-word policy that the new changes apply only when people use WhatsApp to chat with businesses, not to personal conversations with friends and family.
No, the new terms do not allow Facebook to read your conversations on WhatsApp, the company explained to anyone who asked. Top executives posted long threads on Twitter and gave interviews to major publications in India, the company’s largest market. WhatsApp spent millions buying front-page ads in major newspapers and ran infographics debunking the rumors on its website with a large “Share on WhatsApp” button, hoping to inject some truth into the flood of misinformation flowing through its platform. The company also encouraged Facebook employees to share these infographics, according to posts on the internal Workplace message board.
“There has been a lot of misinformation and confusion, so we are working to provide accurate information on how WhatsApp protects people’s personal conversations,” a WhatsApp spokesman told BuzzFeed News. “We use our status feature to communicate directly with people on WhatsApp, as well as to publish accurate information on social media and our website in dozens of languages. Of course, we also made these resources available to the people who work for our company so they can answer questions directly from friends and family if they wish.”
None of this worked.
For years, rumors and scams circulating on WhatsApp have fueled a crisis of misinformation in some of the world’s most populous countries, such as Brazil and India, where the app is the main way most people talk to each other. Now this crisis has reached the company itself.
“Trust in the platforms is [at a] low,” Claire Wardle, cofounder and director of First Draft, a nonprofit organization that researches misinformation, told BuzzFeed News. “We have had years of people becoming increasingly concerned about the power of technology companies, and growing awareness of how much data they collect about us. So when privacy policies change, people rightly worry about what that means.”
Wardle said people are concerned that WhatsApp will link their behavior in the app to the data from their Facebook accounts.
“Facebook and WhatsApp have a huge lack of trust,” said Pratik Sinha, founder of Alt News, a fact-checking platform in India. “Once you have that, any kind of misinformation attributed to you is easily consumed.”
What doesn’t help, both Sinha and Wardle added, is a lack of understanding among ordinary people about how technology and privacy work. “Confusion is where disinformation thrives,” Wardle said, “so people saw the policy changes, jumped to conclusions, and it’s not surprising that many believed the rumors.”
These patterns of misinformation, which have thrived on WhatsApp for years, often lead to real-world harm. In 2013, a video that purportedly showed two young men being lynched went viral in Muzaffarnagar, a city in northern India, inciting riots between Hindu and Muslim communities that killed dozens of people. A police investigation found that the video was more than two years old and had not even been shot in India. In Brazil, fake news flooded the platform and was used in favor of far-right candidate Jair Bolsonaro, who won the country’s 2018 presidential election.
But the company did not take its misinformation problem seriously until 2018, when rumors about child abductors spreading across the platform led to a series of brutal lynchings across India. In a statement issued at the time, the Indian Ministry of Information Technology warned WhatsApp of legal action and said the company would be “treated as an abettor” if it did not resolve the issue, sending WhatsApp into crisis mode. The company flew top executives from Menlo Park, California, to New Delhi to meet with government officials and journalists, and ran campaigns to raise awareness about misinformation.
It also built new features into the app to directly counter misinformation for the first time, such as labeling forwarded messages and limiting the number of people or groups a message can be forwarded to, in order to slow the spread of viral content. In August last year, it also began allowing people in several countries to search the text of a forwarded message on Google to check whether it was false. This feature is not yet available to WhatsApp users in India.
In 2019, the company also said it was working on a tool that would let users search for images they received in the app with a single tap, a move that would make fact-checking easier. But almost two years later, there is no trace of the feature, although a text-only version is available in more than a dozen countries, a list that does not yet include India.
“We’re still working on the search tool,” a WhatsApp spokesman told BuzzFeed News.
This week, the company posted a status message, WhatsApp’s equivalent of a Facebook story, at the top of people’s status sections. Tapping the status revealed a series of messages from the company debunking the rumors.
“WhatsApp does not share your contacts with Facebook,” read the first. Two more status updates clarified that WhatsApp cannot see people’s shared locations and cannot read or listen to their encrypted personal conversations. “We are committed to your privacy,” one message said.
On Thursday, employees posted a number of questions for Facebook CEO Mark Zuckerberg ahead of his weekly Q&A session, according to internal posts viewed by BuzzFeed News. Some wanted to know whether the growing shift to Signal and Telegram was affecting WhatsApp’s usage and growth. Others asked the CEO whether Facebook was using WhatsApp metadata to serve ads.
“The public is outraged by the changes to WhatsApp’s privacy policy,” wrote another person. “Distrust of FB is so high that we need to be more careful about this.”
Ryan Mack contributed to the reporting.