In its ongoing efforts to combat extremist content (and appease nervous advertisers), YouTube announced some new updates. For one, it will soon place videos that don’t technically violate its community guidelines – but have nonetheless been flagged by users for hate speech – in what YouTube is calling “a limited state.” Such videos will be viewable, but only behind a warning. They also won’t be monetized or recommended, and won’t include comments, likes, or suggested videos. The feature will roll out over the coming weeks – first on desktop, then on mobile devices.
YouTube also says it is implementing new machine learning technology to identify and remove terrorist-related content, and that it has started working with 15 new NGOs – including the No Hate Speech Movement and the Anti-Defamation League – to identify content intended to recruit terrorists.