Google is rolling out more measures to identify and remove terrorist content on YouTube, saying there should be "no place for terrorist content on our services".
In a blog post yesterday, the company said it will take a tougher stance on videos containing supremacist or inflammatory religious content, placing them behind a warning and not monetising, recommending or allowing user endorsements of them.
Acknowledging that more needs to be done to tackle content that violates its policies, Google said it was taking four fresh steps to help do so.
It is bolstering its use of technology to help identify extremist and terrorism-related videos, increasing the number of independent experts in YouTube's trusted flagger programme, taking a tougher stance on videos that don't clearly violate its policies, such as those featuring supremacist content, and expanding YouTube's role in counter-radicalisation efforts.
Kent Walker, a senior vice president and general counsel at Google, said in the blogpost: "Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all."
He continued: "Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services.

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now."
Google said its engineers have created technology to prevent re-uploads of known terrorist content, using image matching techniques.
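Image matching of this kind is commonly done with perceptual hashing, where near-identical images produce near-identical fingerprints. The sketch below is purely illustrative, assuming a simple average-hash scheme on small grayscale images; the function names and the distance threshold are this example's assumptions, not details of Google's actual system.

```python
# Illustrative average-hash (aHash) sketch of how re-uploads of
# known content can be matched. Not Google's implementation.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 values):
    each bit records whether a pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def is_reupload(candidate, known_hashes, threshold=2):
    """Flag a candidate whose hash is within `threshold` bits
    of any hash in the known-content database."""
    h = average_hash(candidate)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)

# A known image and a slightly re-encoded copy (small pixel changes):
original = [[10, 200, 30], [220, 40, 210], [15, 190, 25]]
reupload = [[12, 198, 33], [218, 42, 207], [17, 188, 28]]
database = [average_hash(original)]

print(is_reupload(reupload, database))  # prints True: near-duplicate matches
```

Because the hash depends only on each pixel's relation to the image's average brightness, minor re-encoding noise leaves the fingerprint unchanged while a genuinely different image lands many bits away.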
Walker said: "Collectively, these changes will make a difference."