“These principles for health sources are the first of its kind,” Garth Graham, director of health care for YouTube, wrote in a blog post. “We hope that other companies will also review and consider how these principles might help inform their work with their own products and platforms.”
Graham, a former health insurance executive, was hired by YouTube at the start of the year to lead a new effort to highlight and produce authoritative health videos.
Like the social networks Facebook Inc. and Twitter Inc., YouTube has worked to better moderate its flood of user-generated media to deal with misinformation about Covid-19 and vaccines. The platform has removed thousands of videos for violating its misinformation rules since the pandemic began, drawing criticism, particularly from the political right, that it is too censorious.
Yet the Biden administration has gone on the offensive against technology companies in its push for more Americans to get vaccinated, and the surgeon general has released a report on health misinformation. On Friday, Biden criticized social networks for their role in letting anti-vaccination material spread.
Democratic Senator Amy Klobuchar said Sunday that vaccine misinformation on social media adds urgency to her call to change liability standards for what is published on those platforms.
“YouTube removes content in accordance with our Covid-19 misinformation policies,” the company said. “We also demote borderline videos and prominently surface authoritative content for Covid-19-related search results, recommendations, and context panels.”
Working with health organizations
The company said it will continue working with health organizations and other medical experts to prevent the spread of misinformation. Three years ago, YouTube started adding fact-check labels beneath videos that dealt with popular conspiracy theories, like those about the moon landing or a supposed link between vaccines and autism.
Twitter also said it would do its part to “elevate authoritative health information,” while Facebook expanded on its defense over the weekend with a blog post arguing that it can’t take the blame for the missed U.S. vaccination target.
“At a time when Covid-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies,” Facebook’s Vice President of Integrity Guy Rosen said in the post. “Facebook is not the reason this goal was missed.”
On Monday, Biden walked back his accusation that Facebook was “killing people” with misinformation, pinning the problem instead on a group of anti-vaccine agitators active on social media.
YouTube has also argued that it doesn’t operate the same way as social networks, since it doesn’t have the same sort of viral sharing. But its recommendation algorithm does drive a majority of the videos watched on the site.
The surgeon general’s report didn’t mention YouTube by name, but it cited a recent academic study that examined vaccine information on the platform. Since late 2019, pro-vaccine videos have shown up higher in YouTube’s search rankings, according to the research. But once viewers watch an anti-vaccine video, YouTube’s system ends up exposing them to more of the same, it added.
A YouTube spokesperson said the company has been working on these changes since February. They are coming to viewers in the U.S. first, and the company plans to expand them to more countries.
—Bloomberg News