YouTube to expand teams reviewing extremist content

REUTERS
Alphabet Inc's YouTube said on Dec. 4 it plans to add more people next year to identify inappropriate content as the company responds to criticism over extremist, violent and disturbing videos and comments.

YouTube has developed automated software to identify videos linked to extremism and now is aiming to do the same with clips that portray hate speech or are unsuitable for children. Uploaders whose videos are flagged by the software may be ineligible for generating ad revenue.

But amid stepped-up enforcement, the company has received complaints from video uploaders that the software is error-prone.

Adding to the thousands of existing content reviewers will give YouTube more data with which to train, and possibly improve, its machine learning software.

The goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018, YouTube CEO Susan Wojcicki said in one of a pair of blog posts on Dec. 4.

"We need an approach that does a better job determining which channels and videos should be eligible for advertising," she said.

"We've heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don't demonetize videos by mistake."

In addition, Wojcicki said the company would take "aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether."

The moves come as advertisers, regulators and advocacy groups express ongoing concern over whether YouTube's policing of its service is sufficient.

YouTube is reviewing its advertising offerings as part of its response, and it hinted that its next efforts could include further changes to the requirements for sharing in ad revenue.

YouTube earlier this year updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.
