There is still much we do not know about the suspect and his background. But before anything else was known about him, anyone who has studied or covered extremism and these kinds of attacks could have given you an educated guess about what kind of person he was: Male. Probably in his 20s. Decent chance of at least a minor criminal record. More than likely a history of hate toward or violence against women. Oh, and one more thing – probably spent a fair amount of time on the internet.
People could easily be radicalized before social media. Many are still radicalized without it. But social media, often in combination with other factors, has proven to be an efficient radicalizer, partly because it allows for easy formation of communities and partly because of its algorithms, used to convince people to stay a little longer, watch one more video.
Combine those algorithms with men who are disaffected, who may feel that the world owes them more, and you have a recipe for creating extremism of any stripe.
For all the much-deserved criticism they've recently received over the things they've failed to act on, social networks did step up and take real and impressive action when faced with a deluge of ISIS supporters and content.
"The issue on mainstream sites is for the most part there has been an aggressive takedown" of ISIS-related content, said Seamus Hughes, the deputy director of the Program on Extremism at George Washington University. "The same dynamics did not happen when it came to white supremacy."
The companies could take action against white supremacists now. Indeed, they could go on forever like that, playing whac-a-mole with different movements that pop up and begin radicalizing their users, moving against them after enough people have been killed. It would be easier for them to do that than to actually deal with the underlying problem: algorithms designed to keep people around.
"It makes sense from a marketing perspective, if you like Pepsi then you're going to watch more Pepsi videos … but you take that to the logical extreme with white supremacy videos," Hughes said. "They're going to have to figure out how to not completely scrap a system that has brought them hundreds of millions of dollars in ad revenue while not promoting someone's radicalization or recruitment."
So what should the platforms do now?
Asked that question, Bill Braniff, the director of the National Consortium for the Study of Terrorism and Responses to Terrorism (START) and a professor of practice at the University of Maryland, said: "What I think we should be asking them to do is continue to minimize the salience or reach of violent extremist propaganda that calls for violence … but not limit itself to just content takedowns as the way to do that."
Content takedowns alone can both feed narratives of persecution and drive people to smaller, more radical sites, Braniff noted.
"We know that people can actually be addressed through counseling [and] mentorship," he said. "If instead of directing people who might be flirting with extremism to support, you censor them and remove them from these platforms, you lose … the ability to provide them with an off-ramp."
The comprehensive solution he sees is a change to the algorithms so that they can point people to different views or even in some cases to support such as counseling.
"Algorithms can either foster groupthink and reinforcement or they can drive discussion," he said. "Right now the tailored content tends to be, 'I think you'll like more of the same,' and, unfortunately, that's an ideal scenario for not just violent extremism but polarization … We're only sharing subsets of information. The massive part of violent extremism is polarization, and it's really dangerous."