
Instagram algorithms serve up COVID-19 misinformation, study finds: NPR




Researchers are concerned that Instagram’s new “suggested posts” feature is contributing to the spread of misinformation.

Dennis Charlett / AFP via Getty Images



Instagram recommends false claims about COVID-19, vaccines and the 2020 U.S. election to people who seem to be interested in related topics, according to a new report by a group that tracks disinformation online.

“Instagram’s algorithm is driving people further and further into their own realities, but it is also splitting those realities apart, so some people get no misinformation at all and some get more and more of it,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, which conducted the study.

From September to November 2020, Instagram recommended 104 posts containing misinformation, or about one post a week per account, to 15 accounts created by the UK-based non-profit organization.

The automated recommendations appeared in several places in the photo-sharing app, including the new Suggested Posts feature, introduced in August 2020, and the Explore section, which directs users to content they may be interested in.

The study is the latest effort to document how social media recommendation systems contribute to the spread of misinformation, which researchers say has accelerated over the past year, fueled by the pandemic and the contentious presidential election.

Facebook, which owns Instagram, has cracked down more aggressively in recent months. In February it extended its ban on false claims about COVID-19 vaccines from its namesake platform to Instagram. But critics say the company has not done enough to address how its automated recommendation systems expose people to misinformation. They argue that social media algorithms can send people who are curious about questionable claims down a rabbit hole of more extreme content.

Ahmed said he was particularly concerned about the introduction of “suggested posts” on Instagram last year, a feature aimed at getting users to spend more time on the app.

Once users have viewed everything recently posted by the accounts they already follow, they see posts from accounts they don’t follow at the bottom of their Instagram feeds. The suggestions are based on content they have already engaged with.

“Putting it in the feed is really powerful,” Ahmed said. “Most people wouldn’t realize they’re being fed information from accounts they don’t follow. They think, ‘These are people I’ve chosen to follow and trust,’ and that makes it so dangerous.”

The Center for Countering Digital Hate says Instagram should stop recommending posts “until it can show that it no longer promotes dangerous misinformation,” and should exclude posts about COVID-19 or vaccines from recommendations altogether.

To test how Instagram’s recommendations work, the non-profit, working with the youth advocacy group Restless Development, had volunteers create 15 new Instagram accounts.

The accounts followed different sets of existing accounts on the platform. These ranged from reputable health authorities, to health, alternative-health and anti-vaccine advocates, to far-right militia groups and people promoting the discredited QAnon conspiracy theory, which Facebook banned in October.

Profiles that followed health advocates and vaccine opponents were recommended posts with false claims about COVID-19 and more extreme anti-vaccine content, the researchers found.

But the recommendations did not end there. These profiles were also “fed election misinformation, identity-based hate and conspiracy theories,” including antisemitic content, Ahmed said.

Profiles that followed QAnon or far-right accounts, in turn, were recommended misinformation about COVID-19 and vaccines, even if they also followed reliable health organizations.

The only profiles that did not receive misinformation were those that followed recognized health organizations, including the Centers for Disease Control and Prevention, the World Health Organization and the Gates Foundation.

The study does not say how many suggested posts each account was shown in total, making it impossible to determine how often Instagram recommended misinformation.

Facebook spokesperson Raki Wayne told NPR that the company “share[s] the goal of reducing the spread of misinformation,” but disputed the research methodology.

“This study is five months old and uses an extremely small sample size of just 104 posts,” Wayne said. “This is in stark contrast to the 12 million pieces of harmful misinformation related to vaccines and COVID-19 that we have removed from Facebook and Instagram since the start of the pandemic.”

Facebook says that when people search for COVID-19 or vaccines in its applications, including Instagram, it directs them to reliable information from reputable health organizations such as the WHO, the CDC and the UK National Health Service.

“We’re also working on improvements to Instagram Search to make accounts that discourage vaccines more difficult to find,” Wayne said.

Researchers have been tracking the overlap among conspiracy theory communities, and the way they surface in social media recommendations, for some time. Some anti-vaccine activists began posting QAnon content last year, while prolific spreaders of false election-fraud claims turned to posting misinformation about vaccines.

“The fact that there is a connection between these communities is something that is quite well documented,” said Renée DiResta, who studies misinformation at the Stanford Internet Observatory. She said that as far back as 2016, a Facebook account she used to track the anti-vaccine movement received recommendations to join groups dedicated to the Pizzagate conspiracy theory, a predecessor of QAnon.

Ahmed connected the overlap among the various conspiracy theories recommended in his group’s study to the insurrection at the U.S. Capitol.

“That’s exactly what we saw on January 6,” he said. “It was a uniting of these extreme forces. And what drove it, in part? The algorithm.”

Editor’s note: Facebook is among the financial supporters of NPR.

