Meta is acting on its vow to reduce ad discrimination through technology. The company is rolling out a Variance Reduction System (VRS) in the US that's meant to ensure the real audience for an ad more closely matches the eligible target audience; that is, it shouldn't skew unfairly toward certain demographic groups. Once enough people have seen an ad, a machine learning system compares the aggregate demographics of viewers with those the marketers intended to reach. It then tweaks the ad's auction value (that is, the likelihood you'll see the ad) to display it more or less often to certain groups.
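To make the idea concrete, here is a toy sketch of that feedback loop: it compares each group's share of actual ad deliveries against its share of the eligible audience and nudges a per-group auction multiplier accordingly. This is purely illustrative; the group labels, the `strength` parameter and the linear adjustment rule are assumptions, not Meta's actual VRS.

```python
# Illustrative only: a toy variance-reduction adjustment, NOT Meta's
# actual VRS. Group names and the update rule are hypothetical.

def adjust_auction_values(eligible_share, delivered_share,
                          base_value=1.0, strength=0.5):
    """Nudge an ad's auction value per group so future delivery
    drifts toward the eligible audience's demographic mix."""
    adjusted = {}
    for group, target in eligible_share.items():
        actual = delivered_share.get(group, 0.0)
        # Under-delivered groups get a boost; over-delivered, a penalty.
        adjusted[group] = base_value * (1 + strength * (target - actual))
    return adjusted

# Hypothetical example: the ad has over-delivered to group_a.
eligible = {"group_a": 0.6, "group_b": 0.4}    # eligible audience mix
delivered = {"group_a": 0.75, "group_b": 0.25}  # mix that actually saw it
print(adjust_auction_values(eligible, delivered))
```

Run repeatedly during an ad campaign, an update like this would keep pulling the delivered mix back toward the eligible one, which matches how Meta describes VRS operating throughout an ad run.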
VRS keeps working throughout an ad run. And yes, Meta is aware of the potential privacy issues. It stresses that the system can't see an individual's age, gender or estimated ethnicity. Differential privacy tech also introduces "noise" that prevents the AI from learning individual demographic info over time.
The anti-discrimination method will initially apply to the housing ads that prompted the settlement. Meta says VRS will expand to credit and employment ads in the US over the following year.
The feature comes after more than a year of work alongside both the Justice Department and the Department of Housing and Urban Development. Meta (then Facebook) was charged in 2019 with enabling discrimination in housing ads by letting advertisers exclude certain demographics, including those protected by the Fair Housing Act. In a June 2022 settlement, the social media giant said it would both deploy VRS and scrap the "Special Ad Audience" tool whose algorithm allegedly led to discrimination. Meta had already limited ad targeting in 2019 in response to another lawsuit.
Meta isn't alone in trying to limit discriminatory ads. Google barred demographic targeting for credit, housing and job ads starting in 2020. However, the tech used to fight that discrimination is relatively novel. It won't be surprising if other internet services implement VRS-like systems of their own so long as Meta's AI proves effective.