YouTube has struggled for years with videos that promote offensive viewpoints but do not necessarily violate the company’s guidelines for removal. Now it is taking a new approach: Bury them.
The issue has gained new prominence amid media reports that one of the London Bridge attackers became radicalized by watching YouTube videos of an American Islamic preacher, whose sermons have been described as employing extremely charged religious and sectarian language.
On Sunday, Google, YouTube’s parent company, announced a set of policies aimed at curbing extremist videos on the platform.
Google said it would quickly identify and remove videos that clearly violate its community guidelines, such as those promoting terrorism. The process for handling videos that do not necessarily violate specific rules of conduct is more complicated.
Under the policy change, Google said offensive videos that did not meet its standard for removal — for example, videos promoting the subjugation of religions or races without inciting violence — would come with a warning and could not be monetized with advertising, or be recommended, endorsed or commented on by users. Such videos were already barred from carrying advertising, but they faced no other restrictions.
“That means these videos will have less engagement and be harder to find,” Kent Walker, Google’s general counsel and senior vice president, wrote in a company blog post on Sunday. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”
Google, which has relied on computer-based video analysis for the removal of most of its terrorism-related content, said it would devote more engineering resources to help identify and remove potentially problematic videos.