YouTube is considering moving all videos for children to its separate YouTube Kids app
YouTube is considering more changes to how content for kids shows up on the world’s largest video site as criticism mounts that it’s unsafe for children.
YouTube, owned by Alphabet Inc.’s Google, is debating changes involving kids’ content, according to a person familiar with the discussions. The Wall Street Journal earlier reported that the company was mulling moving all videos for children to its separate YouTube Kids app. Such a drastic change is unlikely, according to the person, who asked not to be identified discussing in-house company deliberations.
The Google unit has long positioned itself as a neutral platform that lets anyone upload and watch whatever videos they want. But now the site is struggling to convince parents and advertisers that it can protect children from violent, upsetting and harmful content. On Monday, Bloomberg reported that children who use YouTube’s main site far outnumber those who stick to the safer, vetted YouTube Kids app.
YouTube has already made tweaks to the platform as it tries to create a safer site for children. The company disabled comments on thousands of videos featuring kids after predators were found using the comment section to flag moments in the videos that could be construed as sexual.
“We consider lots of ideas for improving YouTube and some remain just that—ideas,” a YouTube spokeswoman said in an email. “Others, we develop and launch, like our restrictions on minors’ live streaming or updated hate speech policy.”
YouTube made “responsible growth” its core metric only recently, after years of prioritizing engagement even as employees flagged harmful and misleading videos to executives, Bloomberg reported earlier this year.
Major advertisers have frozen YouTube spending at various times out of fear their ads will be shown next to harmful videos. Still, the video site remains, with Facebook Inc. and Instagram, among the most popular places to advertise online.
Meanwhile, the U.S. Federal Trade Commission is investigating allegations that Google’s YouTube violated rules about collecting data on and advertising to children, according to a person familiar with the matter.
The commission is probing whether the world’s largest video site broke the Children’s Online Privacy Protection Act, which makes it illegal to collect information on minors and disclose it to others without parental permission. A group of activists last year asked the FTC to look into the matter. Representatives for YouTube and the FTC declined to comment.