Ever since Russian agents and other opportunists abused its platform in an attempt to manipulate the 2016 US presidential election, Facebook has repeatedly insisted that it has learned its lesson and is no longer a conduit for misinformation, voter suppression and election interference.
But it has been a long and halting road for the social network. Critical outsiders, as well as some of Facebook’s own employees, say the company’s efforts to revise its rules and tighten safeguards remain wholly insufficient to the task, even though it has spent billions on the project. As for why, they point to the company’s overriding commitment to growth.
“Am I worried about the election? I’m terrified,” said Roger McNamee, a Silicon Valley venture capitalist and early Facebook investor turned vocal critic. “At the scale at which the company operates, it is a clear and present danger to democracy and national security.”
The company’s rhetoric has certainly gotten an update. CEO Mark Zuckerberg now casually refers to possible outcomes that were unimaginable in 2016 – including potential civil unrest and a contested election that Facebook could easily make worse – as challenges facing the platform.
“This election is not going to be business as usual,” Zuckerberg wrote in a September post on Facebook in which he described the company’s efforts to encourage voting and remove misinformation from its service. “We all have a responsibility to protect our democracy.”
Yet for years, Facebook executives have seemed surprised each time their platform – created to connect the world – was used for malicious ends. Zuckerberg has offered multiple apologies over the years, as if no one could have predicted that people would use Facebook to broadcast live murders and suicides, incite ethnic cleansing, promote counterfeit cancer drugs or try to steal elections.
While other platforms such as Twitter and YouTube have also struggled to deal with misinformation and hateful content, Facebook stands out for its scope and scale and, compared with many other platforms, for its slower response to the challenges identified in 2016.
Immediately after the election of President Donald Trump, Zuckerberg struck a remarkably dismissive tone regarding the idea that “fake news” spread on Facebook could have influenced the 2016 election, calling it “a pretty crazy idea.” A week later, he walked back the comment.
Since then, Facebook has issued a stream of mea culpas for its slowness to act against the threats of the 2016 election and has promised to do better. “I don’t think they’ve gotten better at listening,” said David Kirkpatrick, author of a book on Facebook’s rise. “What’s changed is that more people are telling them they need to do something.”
The company has hired outside fact-checkers, added restrictions – then more restrictions – on political ads, and removed thousands of accounts, pages and groups it found engaging in “coordinated inauthentic behavior.” That’s Facebook’s term for fake accounts and groups that maliciously target political discourse in countries ranging from Albania to Zimbabwe.
It has also begun attaching warning labels to posts containing misinformation about voting and has, at times, taken steps to limit the spread of misleading posts. In recent weeks, the platform has also banned posts that deny the Holocaust and joined Twitter in limiting the spread of an unverified political story about Hunter Biden, son of Democratic presidential candidate Joe Biden, published by the conservative New York Post.
All of this undoubtedly leaves Facebook in a better position than it was four years ago. But that doesn’t mean it is fully prepared. Despite strict rules banning them, violent militias still use the platform to organize. Recently, that included participants in a thwarted plot to kidnap the governor of Michigan.
In the four years since the last election, Facebook’s profits and growth have skyrocketed. This year, analysts expect the company to post a profit of $23.2 billion on revenue of $80 billion, according to FactSet. It currently boasts 2.7 billion users worldwide, up from 1.8 billion at this time in 2016.
Facebook faces a number of government investigations into its size and market power, including an antitrust probe by the US Federal Trade Commission. An earlier FTC investigation ended with a $5 billion fine for Facebook but did not require further changes.
“Their number one priority is growth, not harm reduction,” Kirkpatrick said. “And that’s unlikely to change.”
Part of the problem: Zuckerberg holds an iron grip on the company, yet doesn’t take criticism of himself or his work seriously, charges social media expert Jennifer Grygiel, a communications professor at Syracuse University. But the public knows what’s going on, she said. “They see COVID misinformation. They see Donald Trump exploiting it. They can’t unsee it.”
Facebook insists it takes the disinformation challenge seriously – especially when it comes to elections.
“Elections have changed since 2016, and so has Facebook,” the company said in a statement laying out its policies on elections and voting. “We have more people and better technology to protect our platforms, and we’ve improved our content policies and enforcement.”
Grygiel says such statements are par for the course. “This company uses PR in lieu of an ethical business model,” she said.
Kirkpatrick notes that board members and executives who pushed back against the CEO – a group that includes the founders of Instagram and WhatsApp – have left the company.
“He’s so certain that Facebook’s overall impact on the world is positive,” and that critics don’t give him enough credit for it, Kirkpatrick said of Zuckerberg. As a result, Facebook’s CEO isn’t inclined to take constructive feedback to heart. “He doesn’t have to do anything he doesn’t want to. He has no oversight,” Kirkpatrick said.
So far, the federal government has largely left Facebook to its own devices, a lack of accountability that has only emboldened the company, according to US Rep. Pramila Jayapal, a Washington Democrat who grilled Zuckerberg during a Capitol Hill hearing in July.
Warning labels are of limited value if the platform’s underlying algorithms are designed to push polarizing material at users, she said. “I think Facebook has done some things that show it understands its role. But it has been too little, too late.”