ProPublica: Facebook advertisers can still discriminate by race

Facebook VP says approvals were "a failure in our enforcement."

A year ago, ProPublica discovered that Facebook let advertisers select who would see their ads based on 'ethnic affinity.' The social network doesn't ask users to disclose their race; instead, it infers an affinity from the data it collects, assigning each user to a group based on content Facebook believes aligns with a particular ethnic group. Advertisers could then choose to target, or avoid, users based on that 'ethnic affinity,' which in the case of housing ads would violate the Fair Housing Act. Shortly thereafter, an apologetic Facebook said it would shut down 'ethnic affinity' targeting for housing, employment and credit ads. But ProPublica just released a new report showing that its reporters were still able to buy dozens of rental housing ads that excluded certain ethnic groups, and Facebook approved all of them.

Further, according to ProPublica, the social network approved all but one of those ads within minutes, as the image above shows. The lone exception, an ad that sought to exclude renters "interested in Islam, Sunni Islam and Shia Islam," was approved after just 22 minutes. The platform's own policies state that its vetting process should have flagged the discriminatory language in each of these ad requests, but that didn't happen.

Federal law prohibits ads that discriminate based on race in three areas: housing, employment and credit. Not coincidentally, these are the areas in which Facebook claimed it had ended its 'ethnic affinity' advertising options. The Department of Housing and Urban Development had previously been investigating Facebook's advertising policies, but it confirmed to ProPublica that the inquiry has been closed.

When reached for comment, Facebook said a technical error had miscategorized the ProPublica ads, so its compliance and review alarms weren't triggered. Facebook provided this statement to Engadget from its VP of Product Management, Ami Vora:

"This was a failure in our enforcement and we're disappointed that we fell short of our commitments. Earlier this year, we added additional safeguards to protect against the abuse of our multicultural affinity tools to facilitate discrimination in housing, credit and employment. The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure.

Our safeguards, including additional human reviewers and machine learning systems have successfully flagged millions of ads and their effectiveness has improved over time. Tens of thousands of advertisers have confirmed compliance with our tighter restrictions, including that they follow all applicable laws.

We don't want Facebook to be used for discrimination and will continue to strengthen our policies, hire more ad reviewers, and refine machine learning tools to help detect violations. Our systems continue to improve but we can do better. While we currently require compliance notifications of advertisers that seek to place ads for housing, employment, and credit opportunities, we will extend this requirement to ALL advertisers who choose to exclude some users from seeing their ads on Facebook to also confirm their compliance with our anti-discrimination policies – and the law."