Google’s Redirect Program: An Anemic Response to the Plague of Terror Content

15.08.2018

Google’s attempts to tackle the dangers of terrorist content on its online platforms, including YouTube, have come far too late and remain disturbingly inadequate, according to findings in a new study by the Counter Extremism Project (CEP).

YouTube first refused to crack down on content produced by Islamic terrorist groups in 2008, citing “everyone’s right to express unpopular points of view.” Ten years later, following numerous terror attacks connected to the proliferation of radicalizing propaganda online, Google-owned YouTube has finally modified its rhetorical position.

One of Google's attempts, the Redirect Method Pilot Program, was launched in July 2017, two months after suicide bomber Salman Abedi killed 22 people in Manchester, England, with an explosive device he had assembled using ISIS bomb-making tutorial videos on YouTube. The program claims to target individuals searching for ISIS-related content on YouTube and redirect them to counter-narrative videos. Yet in early 2018, CEP still found an abundance of extremist content on YouTube: someone searching for extremist material on the video-sharing platform is more than three times as likely to encounter extremist videos as counter-narratives.
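To make the mechanism concrete, the sketch below illustrates how keyword-triggered redirection of this kind could work in principle. It is a minimal sketch only: the flagged terms and playlist identifiers are hypothetical placeholders, and nothing here reflects Google's actual term list or implementation.

```python
# Illustrative sketch of keyword-triggered search redirection.
# FLAGGED_TERMS and the playlist ID are hypothetical placeholders,
# not Google's actual term list or implementation.
FLAGGED_TERMS = {"example term 1", "example term 2"}
COUNTER_NARRATIVE_PLAYLIST = "counter-narrative-playlist-id"

def handle_search(query: str) -> str:
    """Return the playlist to surface for a given search query."""
    normalized = query.lower()
    if any(term in normalized for term in FLAGGED_TERMS):
        # A flagged query is steered toward counter-narrative content
        # instead of (or alongside) organic search results.
        return COUNTER_NARRATIVE_PLAYLIST
    return "organic-results"

print(handle_search("Example Term 1 video"))  # -> counter-narrative-playlist-id
print(handle_search("cooking tutorial"))      # -> organic-results
```

Even in this idealized form, the approach only reaches users whose queries happen to match the curated term list, which is one reason coverage gaps like those CEP documents below are possible.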

To assess the efficacy of the Redirect Method Pilot Program, CEP reviewed a sample of 710 YouTube videos for extremist and counter-narrative content. The sample was compiled from the results of six YouTube searches using terms related to Islamic extremism; four of the six terms were chosen because Google explicitly named them as terms targeted for "suggesting positive sentiment towards ISIS." In total, more than 7%[1] of the sample consisted of propaganda or content that glorified extremism, and 25 videos were explicitly violent or showed gore. In contrast, CEP found that only 2%[2] of the videos included anything that could be interpreted as counter-narrative messaging.
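The proportions reported here, including the "more than three times as likely" ratio above, follow directly from the raw counts stated in this article and its footnotes. The sketch below assumes only those figures (710 videos sampled, 53 extremist, 15 counter-narrative):

```python
# Reproduce the proportions CEP reports from the raw counts in this article.
SAMPLE_SIZE = 710        # videos reviewed across six YouTube searches
EXTREMIST = 53           # videos with propaganda or glorification of extremism
COUNTER_NARRATIVE = 15   # videos interpretable as counter-narrative messaging

extremist_share = EXTREMIST / SAMPLE_SIZE        # ~7.5% of the sample
counter_share = COUNTER_NARRATIVE / SAMPLE_SIZE  # ~2.1% of the sample
ratio = EXTREMIST / COUNTER_NARRATIVE            # ~3.5x more extremist videos

print(f"Extremist content:          {extremist_share:.1%}")
print(f"Counter-narrative content:  {counter_share:.1%}")
print(f"Extremist-to-counter ratio: {ratio:.1f}x")
```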

A small glimmer of hope is that official ISIS propaganda materials were relatively limited throughout the sample, suggesting that YouTube has at least improved its takedown practices for ISIS videos specifically. Unfortunately, the platform has clearly not made the same effort to target non-ISIS extremist content. In contrast to four official ISIS propaganda releases, 18 videos (more than four times as many) were official propaganda releases from non-ISIS Islamic extremist groups, including al-Qaeda, the Nusra Front, the Taliban, and Hamas. The remaining extremist videos contained unofficial propaganda, combat footage, or photo montages from a mixture of ISIS and other Islamic extremist groups. The question remains: why is Google not removing all extremist content?

Google's efforts to promote counter-narrative content are inconsistent and insufficient. Though CEP found only four videos that were official ISIS propaganda releases, 14 of the 15 counter-narrative videos were expressly targeted at that group. Moreover, no counter-narrative material appeared in the search results for two terms that Google specifically identified as targets for counter-narrative messaging.

Stepping back from the inefficiencies of Google's method, it is also worth questioning whether counter-narrative videos can dissuade aspiring jihadists at all. Not only can any Internet user quickly and easily disregard unwanted content, but the counter-narrative content found on YouTube was frequently disparaged for its message and mocked in the comments section. For example, almost all of the 80 comments on one counter-narrative video feature obvious pro-ISIS rhetoric, and several others point out that the account that uploaded the video blatantly features the U.S. State Department's logo.

Regardless, well-meaning programs like the Redirect Method and YouTube's record of removing extremist content remain unsatisfactory. When violence and death are potential outcomes, nothing short of 100% removal of terrorist content from the platform is an acceptable policy.

[1] CEP found that 53 videos (more than 7.4% of the sample) included propaganda or content that glorified extremism.

[2] CEP found 15 videos that could be interpreted as counter-narrative messaging.
