Facebook ads that could discriminate against minorities and protected groups are still getting through, according to a new report.
On Tuesday, ProPublica, the nonprofit investigative journalism group, said it was able to exclude African Americans, mothers, disabled people and Spanish speakers when placing housing ads on Facebook. Laws regulating advertising practices in housing, employment and financial services are meant to prevent such discrimination.
"Under its own policies, Facebook should have flagged these ads, and prevented the posting of some of them," ProPublica said in its report. "Its failure to do so revives questions about whether the company is in compliance with federal fair housing rules, as well as about its ability and commitment to police discriminatory advertising on the world's largest social network."
Last year, the journalism organization raised the issue of potential discrimination on Facebook's automated ad platform, where advertisers can target people based on what the social network calls "ethnic affinities." It was found that the targeting tools, typically used for brands to tailor messages for different groups, could also be used for nefarious purposes like excluding certain races when promoting housing or job opportunities.
In February, Facebook said it would build a system to reject housing ads that use racial characteristics for targeting, and that it would add a self-certification button requiring advertisers to verify that their ads comply with anti-discrimination laws.
Facebook has been using machine learning to identify when ads relate to housing, and it has been conducting more human reviews when ads are set with sensitive racial targeting parameters.
ProPublica tested this new system and said that not only did the ads get through, but no self-certification prompt ever appeared.
"This was a failure in our enforcement and we're disappointed that we fell short of our commitments," said Ami Vora, Facebook's vice president of product management, in an e-mail statement. "The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure."
"Our safeguards, including additional human reviewers and machine learning systems, have successfully flagged millions of ads and their effectiveness has improved over time," Vora added. The company also promised to hire more ad reviewers and refine the machine learning systems used in the program.
Facebook also said it would now require all advertisers to certify that they are in compliance with anti-discrimination policies if they exclude groups in their targeting strategies. This applies to ads even if they are unrelated to housing, employment and financial services.
The system's failure could raise concerns about how well Facebook is responding to the string of problems uncovered on the network over the past year.
In September, ProPublica discovered another flaw in Facebook's targeting that let advertisers reach groups of people self-identified by anti-Semitic terms. Users who listed interests such as "Jew hater" and "how to burn Jews" could be offered to advertisers as a targetable audience.
Facebook chief operating officer Sheryl Sandberg said at the time, "The fact that hateful terms were even offered as options was totally inappropriate and a fail on our part."
Facebook promised more human review of its automated ad functions and self-reporting tools for users to help identify offensive ads.
The discriminatory ads failure comes as Facebook is also dealing with the fallout from the 2016 election. Facebook officials have been active on Capitol Hill and testifying before Congress about Russian meddling on the platform, and the company has been criticized for failing to identify bad actors that were buying ads to influence the election.