Meta settles lawsuit with Justice Department over ad-serving algorithms

The U.S. Department of Justice today announced that it entered into an agreement with Meta, Facebook’s parent company, to resolve a lawsuit alleging that Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed settlement is subject to review and approval by a judge in the Southern District of New York, where the lawsuit was originally filed. But assuming it moves forward, Meta said it has agreed to develop a new system for housing ads and to pay a penalty of roughly $115,000, the maximum under the FHA.

“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” U.S. Attorney Damian Williams said in a statement. “Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

The lawsuit was the Justice Department’s first challenging algorithmic bias under the FHA, and it claimed that the algorithms Meta uses to determine which Facebook users receive housing ads relied in part on characteristics like race, color, religion, sex, disability, familial status and national origin — all of which are protected under the FHA. Outside investigations have provided evidence in support of the Justice Department’s claims, including a 2020 paper from Carnegie Mellon that showed that biases in Facebook’s ad tools exacerbated socioeconomic inequalities.

Meta said that, under the settlement with the Justice Department, it will stop using “Special Ad Audience,” an advertising tool for housing ads that allegedly relied on a discriminatory algorithm to find users who “look like” other users based on FHA-protected characteristics. Meta will also develop a new system over the next six months to “address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads,” according to a press release, and implement that system by December 31, 2022.
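Meta has not published the internals of Special Ad Audience, but lookalike tools of this kind generally work by scoring how closely each candidate user’s feature vector resembles an advertiser’s seed audience. The sketch below is a generic, hypothetical illustration of that pattern; the feature setup, function names and cosine-similarity scoring are assumptions for the example, not Meta’s actual method.

```python
import numpy as np

# Hypothetical sketch of a generic "lookalike" audience expansion.
# None of these names or features come from Meta's actual system.

def lookalike_audience(seed_features, candidate_features, k):
    """Return indices of the k candidates most similar, by cosine
    similarity, to the centroid of the seed audience."""
    centroid = seed_features.mean(axis=0)
    centroid /= np.linalg.norm(centroid)  # unit-normalize the centroid
    norms = np.linalg.norm(candidate_features, axis=1)
    scores = (candidate_features @ centroid) / norms
    return np.argsort(scores)[::-1][:k]  # highest-similarity users first

rng = np.random.default_rng(0)
seed = rng.normal(loc=1.0, size=(50, 8))    # advertiser's seed users
candidates = rng.normal(size=(10_000, 8))   # broader user base
top = lookalike_audience(seed, candidates, k=100)
print(top[:10])
```

The point of the illustration: if any input feature correlates with a protected characteristic (say, ZIP code correlating with race), the expanded audience will inherit that skew even though the protected attribute is never an explicit input. That proxy effect is the crux of the Justice Department’s algorithmic-discrimination theory.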

An independent, third-party reviewer will investigate and verify on an ongoing basis whether Meta’s new system meets the standards agreed to by the company and the Justice Department. If the Justice Department concludes that the new system doesn’t sufficiently address the discrimination, the settlement will be terminated.
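The settlement does not spell out which metrics the reviewer will apply, and Meta has not detailed its compliance system. As a purely illustrative sketch, one common way to quantify delivery disparity is to compare the rate at which an ad is actually shown across demographic groups; the groups, counts and 80% threshold below are invented for the example.

```python
# Hypothetical sketch of a delivery-disparity check. The groups, counts
# and threshold are illustrative; the settlement specifies no metrics.

def delivery_rates(impressions, eligible):
    """Share of eligible users in each group who were shown the ad."""
    return {g: impressions[g] / eligible[g] for g in eligible}

def parity_ratio(rates):
    """Smallest group rate divided by largest; 1.0 means equal delivery."""
    return min(rates.values()) / max(rates.values())

eligible = {"group_a": 40_000, "group_b": 35_000}   # users eligible to see the ad
impressions = {"group_a": 9_200, "group_b": 4_100}  # users actually shown it

rates = delivery_rates(impressions, eligible)
ratio = parity_ratio(rates)
print(rates, round(ratio, 3))
if ratio < 0.8:  # illustrative cutoff, echoing the "four-fifths" rule
    print("Delivery skew exceeds threshold; system needs adjustment.")
```

A ratio near 1.0 indicates roughly even delivery; the four-fifths cutoff is borrowed from employment-discrimination practice and is only one of many possible standards a reviewer might choose.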

Meta must also notify the Justice Department if it intends to add any new targeting options.

In a blog post, Meta said that its new system will apply to ads related to employment and credit as well as housing, likely in response to criticism of the company’s ad targeting in those areas, too. The aforementioned Carnegie Mellon study found that Facebook ads related to credit cards, loans and insurance were disproportionately shown to men, while employment ads were shown to a greater proportion of women. Users who chose not to identify their gender, or who labeled themselves nonbinary or transgender, were rarely, if ever, shown credit ads of any type, the co-authors found.

U.S. officials have long accused Meta of discriminatory ad-targeting practices. In 2018, Ben Carson, then the secretary of the U.S. Department of Housing and Urban Development, filed a formal complaint against Facebook over ad systems that “unlawfully discriminated” against users based on their race, religion and other categories. A separate lawsuit filed by the National Fair Housing Alliance in 2019, since settled, alleged that Meta provided discriminatory targeting options to advertisers.