Data-Driven Targeting of Vulnerable Groups? Be Careful, Says FTC

Agency Issues Report on Discriminatory Risks of Big Data



Federal Trade Commission headquarters in Washington, D.C.'s Apex Building at Constitution Avenue and 17th Street. Credit: Carol M. Highsmith

The Federal Trade Commission on Wednesday released a report on big data's potential to produce inadvertent discrimination. The takeaway for marketers: Be ethical in data use, and consider how targeting specific offers to audience segments could cause harm to vulnerable groups.

The "Big Data: A Tool for Inclusion or Exclusion?" report suggests that corporations advertising credit cards, bank accounts or loans, for example, should be particularly cognizant of how they target their pitches. Prohibiting single women from applying for a prime credit card based on their marital status would violate the Equal Credit Opportunity Act. "But," the report asks, "what if a single woman would qualify for the prime product, but because of big data analytics, the subprime product with a higher interest rate is the only one advertised to her?"

The commission, building in part on a 2014 workshop on the subject, reminds advertisers that companies cannot discourage "a reasonable person from making or pursuing an application," in ad copy or otherwise. Advertising and marketing practices could also shape the offers consumers see in the future in ways that are biased or discriminatory, the report adds. "In some cases, the DOJ has cited a creditor's advertising choices as evidence of discrimination," it says.

"We warn companies to proceed with caution in this area," said Andrea Arias, an attorney in the FTC's Division of Privacy and Identify Protection.

At this stage, it's difficult to quantify just how frequently discrimination resulting from data analytics occurs. "We really wanted to just shine a light on these things that are starting to occur," said Tiffany George, an attorney in the FTC's Division of Privacy and Identity Protection.

However, the agency has taken action against companies in relation to such discrimination in the past. In 2008, the FTC settled with CompuCredit Corporation, a marketer of Visa and MasterCard credit cards, which agreed to repay consumers an estimated $114 million in unfair fees. The FTC alleged that CompuCredit deceptively failed to disclose that it used a behavioral scoring model to reduce some consumers' lines of credit if they used cash advances to pay for marriage counseling or for items and services at bars, nightclubs, pawn shops, and massage parlors.

So, how can companies prevent data discrimination? The agency suggests that companies ensure their data sets are representative of groups such as minority and LGBT populations, and that "hidden bias" in data and algorithms does not unintentionally affect those populations.
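The report doesn't prescribe a specific test for such bias, but one common heuristic for this kind of audit is the "four-fifths rule" borrowed from employment-discrimination analysis. The sketch below is purely illustrative (the segment labels, audit data, and 0.8 threshold are assumptions, not anything the FTC specifies); it flags audience segments that receive a favorable offer markedly less often than the best-served segment:

```python
# Illustrative sketch, not FTC guidance: a "four-fifths rule" style audit
# that compares how often each demographic segment is shown a favorable
# offer (e.g., the prime credit card rather than the subprime one).

from collections import Counter

def selection_rates(records):
    """records: iterable of (group, got_prime_offer) pairs."""
    shown = Counter()
    favorable = Counter()
    for group, got_prime in records:
        shown[group] += 1
        favorable[group] += int(got_prime)
    return {g: favorable[g] / shown[g] for g in shown}

def disparate_impact_flags(records, threshold=0.8):
    """Flag groups whose favorable-offer rate falls below `threshold`
    times the best-served group's rate (the four-fifths heuristic)."""
    rates = selection_rates(records)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()
            if rate / best < threshold}

# Hypothetical audit data: (segment, was shown the prime offer)
audit = ([("A", True)] * 80 + [("A", False)] * 20
         + [("B", True)] * 55 + [("B", False)] * 45)

print(disparate_impact_flags(audit))  # {'B': 0.6875} -> segment B under-served
```

A check like this catches the scenario the report describes: a segment that would qualify for the prime product but, because of how the analytics score it, is only ever shown the subprime offer.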

Also, the FTC reminds companies that while certain correlations might be interesting, they may not be meaningful and could negatively affect vulnerable groups. "It may be worthwhile to have human oversight of data and algorithms when big data tools are used to make important decisions, such as those implicating health, credit, and employment," added the report.
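One concrete, and again purely illustrative, reading of that recommendation is a decision pipeline that never acts automatically in the high-stakes domains the report names, and that escalates low-confidence scores elsewhere; the domain list and confidence threshold below are assumptions for the sake of the sketch:

```python
# Illustrative sketch, an assumption rather than the FTC's guidance: gate
# automated decisions so high-stakes domains always get human review.

HIGH_STAKES = {"health", "credit", "employment"}  # domains the report names

def decide(domain: str, model_score: float, auto_threshold: float = 0.9) -> str:
    """Automate only low-stakes, high-confidence decisions; route
    everything else to a human reviewer."""
    if domain in HIGH_STAKES or model_score < auto_threshold:
        return "human_review"
    return "auto_approve"

print(decide("credit", 0.97))     # human_review (high-stakes domain)
print(decide("marketing", 0.95))  # auto_approve
print(decide("marketing", 0.50))  # human_review (low confidence)
```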

In some cases, concludes the commission, fairness should outweigh analytics.
