Meta agrees to drop ad tool after race bias lawsuit

The social media giant has agreed to stop using an ad-serving tool after it was alleged that its algorithm discriminated against people on the basis of race, color, national origin, sex, religion, disability, and familial status.

The proposed settlement, which is pending court approval, resolves a civil rights lawsuit filed in the US District Court for the Southern District of New York accusing Meta of violating the Fair Housing Act (FHA), the Department of Justice (DoJ) said.

Meta agreed to stop using the Special Ad Audience tool to advertise housing vacancies on the same day the lawsuit was filed. The complaint alleged that the tool’s algorithm used personal characteristics to determine which Facebook users were shown adverts, in violation of the FHA.

The algorithm allegedly used machine learning to find potential tenants on Facebook who “looked like” a source audience selected by the advertiser, which, if true, would risk implicitly segregating users along racial, religious or other lines without their knowledge.
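In broad terms, a lookalike system of this kind represents each user as a vector of behavioral features and selects the users most similar to the advertiser’s seed audience. The Python sketch below is a minimal illustration of that general technique under those assumptions; it is not Meta’s actual system, and every name and data value in it is hypothetical.

```python
# Minimal, hypothetical sketch of a "lookalike audience" selection step.
# It illustrates the general technique described in the lawsuit, NOT Meta's
# actual system; all names and data here are invented for illustration.
import numpy as np

def lookalike_audience(user_features, seed_ids, k):
    """Return the k user indices most similar (by cosine similarity)
    to the centroid of a seed audience."""
    # Normalize rows so dot products become cosine similarities.
    norms = np.linalg.norm(user_features, axis=1, keepdims=True)
    unit = user_features / np.clip(norms, 1e-12, None)

    # Average the seed users' vectors to get a "profile" of the audience.
    centroid = unit[seed_ids].mean(axis=0)
    centroid /= max(np.linalg.norm(centroid), 1e-12)

    # Score every user against the seed profile, excluding the seeds.
    scores = unit @ centroid
    scores[seed_ids] = -np.inf
    return np.argsort(scores)[::-1][:k]

# Toy usage: 1,000 users with 16 behavioral features, 50 seed users.
rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 16))
seeds = rng.choice(1000, size=50, replace=False)
audience = lookalike_audience(features, seeds, k=100)
print(audience[:10])
```

Even in this toy version, the civil rights concern is visible: the code never names race, sex or religion, but if the feature vectors correlate with protected characteristics, the selected audience will skew the same way the seed audience does.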

The lawsuit claimed that Meta was therefore liable because, in designing the algorithm in such a way, “it intentionally classifies users on the basis of FHA-protected characteristics.”

According to the DoJ, under its supervision Meta “will develop a new system to address racial and other disparities caused by its use of personalization algorithms.” This marks the first time the tech giant will be subject to court oversight for its targeted advertising system.

“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said assistant attorney general Kristen Clarke of the DoJ’s Civil Rights Division.

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” she added. “The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”

“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” said US attorney Damian Williams for the Southern District of New York.

And he warned: “If Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

Meta has until the end of the year to develop a new system that does not discriminate on the basis of any characteristic protected by the FHA. Should it fail to do so, the settlement agreement will be terminated and the multinational will go before a federal court.

The original case was brought after the Department of Housing and Urban Development (HUD) conducted a discrimination investigation into Meta’s ad-serving system.

“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, HUD’s principal deputy assistant secretary for fair housing and equal opportunity. “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all.”