Facebook Racial Ad Targeting Options Quietly Dropped After Years of Criticism

Facebook has quietly dropped a feature that was repeatedly criticized for seemingly letting advertisers target its users based on race.

The social network announced in a business blog post in August that it was making changes to "multicultural affinity" segments to help streamline the ad targeting options available to buyers—who account for the majority of its annual revenue.

The post, which largely flew under the radar apart from Bloomberg coverage, detailed updates made to a tool that critics have linked to discriminatory ad practices.

Investigations by ProPublica found the tool could be exploited by advertisers to exclude Black or Hispanic users, for example, from seeing certain types of marketing. It raised the possibility Facebook was violating U.S. law, such as the Fair Housing Act.

"As part of our latest efforts to simplify and streamline our targeting options, we've identified cases where advertisers—of all sizes and industries—rarely use various targeting options," the social networking giant announced this month.

It added: "Infrequent use may be because some of the targeting options are redundant with others or because they're too granular to really be useful.

"So we're removing some of these options. For example, we're removing multicultural affinity segments and encouraging advertisers to use other targeting options such as language or culture to reach people that are interested in multicultural content."

A spokesperson told Bloomberg that two categories, "African American Affinity" and "Hispanic Affinity," were being scrapped, although it would still offer advertisers a way to target users believed to have an interest in "African American Culture."

In a newsletter last weekend, The Markup editor Julia Angwin, who has long probed the advertising practices at the Mark Zuckerberg-led firm, noted how the company had not shared the news via its Newsroom, which hosts major policy changes and PR.

While Facebook doesn't categorize users by race specifically, its algorithm judges users' "affinities" to interests or behaviors it deems to be linked to a series of demographics, listed as non-multicultural, African American, Asian American, and Hispanic.

"We are using the term 'multicultural affinity' to describe the quality of people who are interested in and likely to respond well to multicultural content. What we are referring to in these affinity groups is not their genetic makeup, but their affinity to the cultures they are interested in," Facebook said in a 2016 tutorial, Ars Technica reported.

In this photo illustration, the Facebook logo is displayed on the screen of an iPhone in front of a TV screen displaying the Facebook logo on December 26, 2019 in Paris, France. Chesnot/Getty

At the time, when Facebook was still calling the feature "ethnic affinities," it defended the practice, saying it was not the same as racial targeting and it was common practice for advertisers to tailor messages based on a user's suggested interests.

It emerged that advertising for the movie Straight Outta Compton had been released in multiple versions, each tailored to what Facebook determined to be users' affinity groups.

In its first investigation, in 2016, ProPublica purchased an ad targeted at users searching for home property and chose to exclude anyone in the African American, Asian American or Hispanic affinity groups. The ad was approved in roughly 15 minutes.

Despite Facebook saying in February 2017 that it would cut down on ad discrimination, ProPublica reported in November of the same year that exclusionary ads were still slipping through.

The publication purchased a variety of rental housing ads and requested they should not be displayed to categories of users including "African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina and Spanish speakers." ProPublica reported they were all approved in minutes.

In 2019, the U.S. Department of Housing and Urban Development, or HUD, officially accused Facebook of violating the Fair Housing Act by "encouraging, enabling and causing housing discrimination" through its micro-targeting ad platform.

"Facebook is discriminating against people based upon who they are and where they live," secretary Ben Carson said at the time. "Using a computer to limit... housing choices can be just as discriminatory as slamming a door in someone's face."

Kian Lavi, a Facebook employee, referenced the ad options update on his Twitter profile earlier this month, saying that he was proud to have advocated for the change.

"We worked on the targeting team within Facebook ads for exactly three years, and fought for this decision almost every single day... this is a small step in ensuring an equitable internet, free of potential discrimination," Lavi tweeted on August 12.

finally get to brag about something i'm proud of working on.

for the past 3.5 years, @hewsuper and i have advocated for the removal of so-called "multicultural affinity" targeting options within facebook ads. they're finally gone. https://t.co/DzJg8v2WzU

— kian lavi (@kianlavi) August 12, 2020