YouTube tries to fix its pedophile-related ad debacle as more brands flee the platform
YouTube is trying to stamp out its second major brand revolt in as many years, putting executives on an emergency call with top marketers and agencies, and sending a memo to Madison Avenue to let advertisers know it's working on a fix.
The problem: Pedophiles are viewing videos of children, leaving disturbing comments and sharing links to even worse content in the comments sections. On Thursday, AT&T became the latest brand to say it was taking a break from YouTube after learning of the child exploitation.
"Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube," AT&T said in an emailed statement.
Hasbro also announced it would halt its ads on the site, according to CNBC. And earlier this week, Disney, Epic Games and Nestlé announced they would freeze spending on the platform, setting off the "adpocalypse."
Some brands and Madison Avenue agency leaders say that YouTube held the conference call on Wednesday. "It came together very fast," said one marketing executive, who listened to the call.
The exec said YouTube, which is owned by Google, took responsibility and promised action.
The first ad debacle on the platform was in 2017, after ads were found running alongside extremist and racist content. Since that first adpocalypse, YouTube has developed ways to monitor where ads run, so brands can get an audit—but only after the fact.
However, ad executives on the call said that YouTube "dodged" questions about implementing a system that would vet all videos before ads run. They believe Google won't allow that type of monitoring because it wants to keep its inner workings private. YouTube says it has to consider user privacy before opening its platform to third parties that could poke around in it.
The offending videos were exposed earlier in the week by a YouTube personality who has been criticizing the company for months for featuring videos of children that could be construed as sexual. In many of them, children are playing or trying on clothes, and while this could be considered harmless content, the comments sections are filled with child predators sharing links to worse content or directing other pedophiles to moments in the videos.
YouTube's algorithm was also suggesting videos that featured children, even when a viewer wasn't looking for them or had only viewed a tangentially related video. Additionally, the search bar autofilled search suggestions that appeared to be favorite terms of pedophiles. YouTube has said it removed those search terms and is taking offending videos out of suggestions.
"Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube," a company spokeswoman said in an emailed statement. "We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There's more to be done, and we continue to work to improve and catch abuse more quickly."
Advertisers have been hammering the digital ad industry for years now, especially sites like YouTube and Facebook, over where their messages appear. Facebook, too, has faced issues with violent, extremist and racist content, and is working with advertisers on controlling where ads run.
The brand safety issues, however, have not harmed the companies' profits. Google posted $33 billion in ad sales in the fourth quarter and Facebook generated $17 billion. Together, the two digital powerhouses will account for 60 percent of U.S. digital ad revenue in 2019, according to eMarketer.
On Thursday, YouTube said it disabled comments on tens of millions of videos and took down 400 channels, and it is working with authorities to investigate any illegal activity.
—George Slefo contributed to this story