YouTube's new moderators, brought in to spot fake, misleading and extreme videos, stumbled in one of their first major tests, mistakenly removing some clips and channels in the midst of a nationwide debate on gun control.
The Google division said in December it would assign more than 10,000 people to moderate content after a year of scandals over fake and inappropriate content on the world's largest video site.
In the wake of the Feb. 14 school shooting in Parkland, Florida, some YouTube moderators mistakenly removed several videos and some channels from right-wing, pro-gun video producers and outlets. On Tuesday, some YouTube channels began complaining about their accounts being pulled entirely. That would have marked a sweeping policy change for YouTube, which typically only removes channels in extreme circumstances and focuses most disciplinary action on specific videos. But YouTube said it was a mistake.
"As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals," a YouTube spokeswoman wrote in an email. "We're continuing to enforce our existing policies regarding harmful and dangerous content, they have not changed. We'll reinstate any videos that were removed in error."
The misstep pulls YouTube, Google and parent Alphabet Inc. deeper into toxic political fights over gun control, fake and extreme content, and whether internet companies should be responsible for what third parties post on their services. The episode also shows how the huge video site continues to struggle to police its service, and how difficult it is to spot troubling content and decide whether the material should be taken down.
Calls for gun reform since the shooting have sparked a rash of conspiracy theories on the web about the student activists. YouTube was criticized last week after promoting a video with a title suggesting that a teen survivor of the Florida school shooting was a paid actor. The clip contained footage from an authoritative news source, leading YouTube's software-based screening system to misclassify it. After YouTube was alerted to the video, it was pulled.
In the wake of the Florida shooting, Google and other internet companies are facing external pressure to remove the National Rifle Association's NRA TV channel from their video streaming services. To date, YouTube and other services haven't pulled the NRA's official channel.
YouTube's official policy says that "harmful or dangerous" and "hateful" content can violate its guidelines. If video creators break the rules three times within three months, YouTube terminates the account.
Alex Jones, who runs the publication Infowars and has pushed conspiracy theories about school shootings, is the most outspoken self-proclaimed victim of YouTube's enforcement. He said this week that YouTube told him his account faces two strikes. On Tuesday, an Infowars article stated that Google was "purging conservative media," claiming that "CNN and other news outlets" were lobbying Google to terminate the Infowars channel.
-- Bloomberg News