© 2023 CyberNews - Latest tech news, product reviews, and analyses.


Meta now subject to court oversight over algorithmic bias in housing ads

Meta has developed a new system to reduce algorithmic bias as part of its settlement with the Department of Justice (DoJ), which said the company’s “discriminatory” ad delivery violated the Fair Housing Act.

The DoJ has announced it reached “a key milestone” in its settlement agreement with Meta, formerly known as Facebook, to prevent discriminatory housing advertising across its platforms.

As part of the settlement, Meta has built a new Variance Reduction System (VRS) to address algorithmic discrimination. It will also be under court oversight and regular reviews to ensure its new system meets compliance targets.

It marks the first time Meta is subject to court oversight for its advertisement targeting and delivery system, the DoJ said.

“Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws,” said Assistant Attorney General Kristen Clarke of the Civil Rights Division at the DoJ.

An independent, third-party reviewer will monitor whether Meta’s new system meets compliance targets agreed with the DoJ. Meta will have to provide regular compliance reports to the reviewer and the federal government, while the court will have the ultimate authority to resolve any disputes.

As part of the settlement agreement, Meta has also stopped delivering housing advertisements through the Special Ad Audience tool, which targets users based on their similarity to other users.

“We appreciate that Meta agreed to work with us toward a resolution of this matter and applaud Meta for taking the first steps towards addressing algorithmic bias,” said US Attorney Damian Williams for the Southern District of New York, where the lawsuit against Meta was filed.

The US complaint against Meta alleged that the company’s algorithms delivered advertisements in a biased way: as a result, users of different gender, racial, or ethnic backgrounds were shown different ads, including for housing. This, the complaint argued, violated the Fair Housing Act.

Meta said its new technology would help distribute ads across its platforms “in a more equitable way.” The VRS would first cover housing advertisements and then expand to employment and credit ads, it said.
