The lawsuit is a pressure tactic meant to force Apple to act against Telegram, as it already has against Parler, a social media site that swelled with calls for violence and an uprising ahead of the Capitol siege, according to researchers. Apple and Google removed Parler from their app stores over its lax moderation policies, and Amazon Web Services also withdrew hosting support for the same reason, knocking Parler offline last week. Telegram offers both private chat rooms and public groups that anyone with the app can join.
The lawsuit was filed by the Coalition for a Safer Web, a nonpartisan group that advocates for technologies and policies to remove extremist content from social media, and by coalition president Mark Ginsberg, a former US ambassador to Morocco. Their complaint centers on content hosted by Telegram.
A similar lawsuit is planned against Google, said coalition lawyer Keith Altman.
“Telegram stands out as a super distributor [of hateful speech], even compared to Parler,” Ginsberg said in an interview.
Ginsberg, who is Jewish, claims in the lawsuit that Telegram’s anti-Semitic content puts him at risk, and that owning an iPhone gives him standing to sue Apple in federal court to require the company to enforce terms of service that prohibit hate speech and incitement to violence in apps distributed through the App Store.
The lawsuit, filed in the U.S. District Court for Northern California, alleges emotional distress and violations of the California business code, and it seeks unspecified damages along with an order requiring Apple to remove Telegram from its App Store.
Apple spokesman Fred Sainz and Telegram spokesman Mike Ravdonikas did not immediately respond to requests for comment.
The siege of the Capitol was widely discussed and instigated on social media and messaging apps, including Parler and Telegram. Supporters of President Trump also celebrated the attack as it happened and called for more violence ahead of Wednesday’s inauguration. The lawsuit against Apple gives the coalition a way to seek action against Telegram, which, as a service based abroad, can be difficult to reach in US courts.
Telegram, which says it operates from Dubai, was developed by Russian internet entrepreneur Pavel Durov. The app is popular among people who want to shield their communications from autocratic regimes and others seeking online privacy. Durov himself has clashed with the Russian government over censorship and encryption.
But Telegram also has a reputation as an app for terrorists and hate groups. For years, it was used by Islamic State militants to communicate and spread propaganda, until European police worked with Telegram to take down accounts tied to the group in 2019.
The company has resisted calls to do the same for right-wing accounts that publish racist and anti-Semitic messages. Telegram removed some well-known public groups that called for violence, but many other channels remain active on the service. Some law enforcement officials also said the migration from Parler to Telegram has made it harder for them to monitor extremists and prepare for potential attacks.
The suit’s prospects for success are uncertain. Section 230 of the Communications Decency Act gives online platforms broad immunity from liability for most of the content they host.
Daphne Keller, who studies platform regulation at Stanford Law School, called the lawsuit a long shot but an interesting one. She said it resembles lawsuits that seek to force platforms to reverse decisions to remove apps and accounts. This suit flips that scenario, she said, by trying to force Apple to remove an app.
But the lawsuit could face insurmountable barriers because Apple’s terms of service are broad, giving the company wide latitude in how it deals with apps, she said. Beyond Section 230, Apple’s decision to keep Telegram on its platform is also protected by the company’s own free-speech rights.
Apple does not require apps such as Telegram to keep their services free of objectionable content. Rather, they must have a method for filtering that material and a way for users to report it. They must also provide contact information and be able to block “abusive” users of the service.
But Apple is vague about what content moderation methods are required. In the past, Apple has allowed apps to remain even when customers complained about their content. In 2019, when The Washington Post uncovered reports of unwanted sexual content, racism and harassment in chat apps, some of them used by children, Apple allowed the apps to stay in the store because, it said, they used some content moderation and other precautions.
When Apple removed Parler from the App Store, it said in a letter to The Post that Parler’s content moderation policies were not good enough. “While there is no perfect system to prevent all dangerous or hateful user content, applications are required to have robust content moderation plans in order to actively and effectively address these issues,” Apple said of Parler.
Google also removed Parler from its app store, saying it had warned Parler about its loose moderation before pulling the app.
Ginsberg sent a letter to Apple CEO Tim Cook in July urging him to address white-nationalist, anti-Semitic and violent speech on Telegram and to “hold TELEGRAM’s financial feet to the fire.”
He wrote: “Due to the growing proliferation of Russian and Eastern European anti-Semitic extremist neo-Nazi groups using TELEGRAM, CSW began an in-depth study of its role earlier this year. Our study revealed serious cases that TELEGRAM’s end-to-end encryption service made possible.”
Ginsberg said he had not received a response to the letter.