How has Elon Musk and X leadership responded?
Since Musk bought the company, he has tangled with various organizations that he claims have been “pressuring advertisers,” blaming them for pushing brands to leave the platform. This week, X sued one such group, Media Matters, which has released multiple reports showing potentially harmful content on X, including antisemitic posts and hashtags affiliated with white nationalists, appearing next to ads.
Yaccarino has been urging advertisers to hold the line with X. “On one side, there’s a vocal minority trying to use deceptive attacks to undermine our work,” Yaccarino wrote in a note to X employees that was shared publicly on Monday. “But on the other side, there are vocal supporters and courageous partners who believe in X and the meaningful work you are all doing.”
Related: What advertisers need to know about Yaccarino
Yaccarino and Musk have both insisted that X is safer than it was under previous ownership, and Yaccarino has claimed that 99.9% of the content on the platform is “healthy.”
What is X’s history with brand safety?
For the ad industry, the turmoil at X is not new. Brands revolted last November, right after Musk took over the company, amid fears that he would open the platform to more extreme speech. Brands also witnessed ham-handed rollouts of new products, such as paid check marks, which allow any paying user to be verified, a change that led to a spate of seemingly authentic accounts impersonating brands. Since then, some advertisers and ad agencies have slowly resumed marketing on X. For all its flaws, X was still viewed as a powerful communication platform where consumers talk about brands.
To be sure, X is not the only social platform that has faced brand safety issues: Meta, Google, TikTok and other platforms have also been scrutinized over how they moderate certain forms of speech and how their algorithms can amplify harmful content. Last week, for example, videos empathizing with Osama Bin Laden trended on TikTok, which is owned by China-based ByteDance. TikTok began removing the videos for violating its “rules on supporting any form of terrorism,” Reuters reported.
Marketers say the difference between X and other platforms, however, is that those services appear to work with ad stakeholders, showing that they respond to crises and improve safety measures.
“[TikTok] could have acted faster, of course, but they did act,” said Daija, the CEO of Bridge, “and their intentions seem to be in the right place. All the platforms need to be concerned about this.”
How is brand safety measured on X?
In September, marking her first 100 days as X’s CEO, Yaccarino issued a progress update, claiming that “hate speech impressions” were down 30% year over year. On Tuesday, however, the Media Rating Council said it had halted work that could have helped clarify whether X’s brand safety mechanisms work. MRC audits how well a platform abides by its own advertising policies as they relate to content moderation. X had been working to set up this audit for years, since its previous ownership, but the audit was scrapped.
“X completed a pre-evaluation of brand safety with MRC, but then due to business and operational circumstances, was not engaged with the audit process,” a representative of MRC said in an email statement, “and per leadership could not engage for months, so we removed them from in-process designation.”
Digiday first reported the status change. The audit would have determined whether X’s brand safety protections operated according to industry standards based on terms set by the Global Alliance for Responsible Media.
X still has deals with third-party measurement firms DoubleVerify and Integral Ad Science to monitor some surfaces of the site, checking for ads in harmful settings.
In its lawsuit against Media Matters this week, X downplayed the significance of ads appearing near incendiary content, claiming they amounted to a minuscule fraction of ads served: “0.0000009090909%” of ad impressions.
On Monday, Yaccarino posted on X that “only 2 users saw Apple’s ad next to the content, at least one of which was Media Matters.”
“That’s not the issue … The owner of your platform is retweeting and endorsing antisemitic posts,” said Lou Paskalis, the CEO of marketing consulting firm AJL Advisory who worked closely with X when it was still Twitter, in reply to Yaccarino’s post.