
In an op-ed by Google's general counsel, Kent Walker, first published in the Financial Times, the tech giant promised to increase its use of technology to help identify extremist and terrorism-related videos.
"This can be challenging: a video of a terrorist attack may be informative news reporting if broadcast by the BBC, or glorification of violence if uploaded in a different context by a different user," Walker explained. "We have used video analysis models to find and assess more than 50% of the terrorism-related content we have removed over the past six months."
The goal of the new technology will be to speed up the identification and removal of terrorism-related content.
Google will also increase the number of independent experts in YouTube’s Trusted Flagger programme.
While machines can help identify problematic videos, humans are needed to make nuanced decisions about the line between violent propaganda and religious or newsworthy speech.
"Trusted Flagger reports are accurate over 90% of the time and help us scale our efforts and identify emerging areas of concern," Walker said.
Google plans to add 50 expert NGOs to the 63 organisations that are already part of the programme and will support them with operational grants.
The search and tech giant will be taking a tougher stance on videos that do not clearly violate its policies, such as videos that contain inflammatory religious or supremacist content.
These videos will now appear behind an interstitial warning and will not be monetised, recommended, or eligible for comments or user endorsements.
While Walker avoided referring to the "ads funding terror" controversy, this move is evidently Google's most determined effort yet to resolve the issue for advertisers.
The most recent incident in which UK political ads were screened next to extremist content involved "borderline" videos such as these, according to a source.
"That means these videos will have less engagement and be harder to find. We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints," Walker commented.
Finally, YouTube will work with Jigsaw to implement the "redirect method" more broadly across Europe. This approach uses targeted online advertising to reach potential Isis recruits and redirects them towards anti-terrorist videos that can change their minds about joining.
"In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages," Walker said.
Google has also committed to working with other platforms, including Facebook, Microsoft, and Twitter, to establish an international forum to share and develop technology, support smaller companies, and accelerate joint efforts to tackle terrorism online.
