Facebook hid real estate ads based on users’ race, feds say

It doesn’t matter if you’re black or white, unless you’re looking for a new apartment on Facebook.

Users’ appearances helped determine whether Facebook showed them ads for available homes through at least 2019, Justice Department officials said Tuesday as they announced a first-of-its-kind settlement with the social network’s parent company over its biased ad-delivery algorithm.

Manhattan U.S. Attorney Damian Williams said the Justice Department reached an unprecedented settlement with Facebook’s parent company, Meta Platforms, Inc., that will force the company to revamp technology that withheld ads from users based on their race, gender, ZIP code and other characteristics.

“Because of this groundbreaking lawsuit, Meta will – for the first time – change its ad serving system to address algorithmic discrimination,” Williams said.

If Meta doesn’t change the system to the Justice Department’s satisfaction, Williams said, his office “will pursue litigation.”

Meta has until December to abandon its real estate ad-delivery system and create a new one that does not discriminate by race, sex or class, and the new technology must be approved by the government before it can be put into use.

The new algorithm must incorporate an element of self-monitoring and Meta must agree to submit to continuous reviews.

Mark Zuckerberg’s company also agreed to pay a civil penalty of $115,054, the maximum under the law.

In a blog post, Roy Austin, Meta’s deputy general counsel, said the company’s new “variance reduction system” technology will ensure users do not face discrimination based on race or other characteristics protected by the Fair Housing Act of 1968.

Austin said the algorithm will also be used to ensure job and credit-related ads reach everyone who wants to see them.

The settlement resolves a lawsuit filed Tuesday that followed a charge of discrimination the Department of Housing and Urban Development issued against Facebook in March 2019.

According to the complaint, Facebook collects data about its users’ appearances in multiple ways.

One is its popular tool inviting people to create their own cartoon-like “avatar” – which has helped the algorithm collect information about users’ race.

After entering details such as skin tone, eye, nose and lip shape, and hairstyle, users creating an avatar are prompted to open their selfie camera so the site can determine which facial features most closely match their own.

“This information regarding the user’s physical appearance is part of Facebook’s massive user data set,” the complaint reads.

By excluding people from seeing ads based on their race, gender and other characteristics, Facebook violated the Fair Housing Act, according to the federal government.

Facebook’s own tools, called “Lookalike Audience” and “Special Ad Audience,” were meant to help companies expand the number of people who saw their ads but in fact excluded people based on their gender and race, officials said.

As part of the settlement, Meta will discontinue both tools.

Demetria McCain, principal deputy assistant secretary at the Department of Housing and Urban Development, said companies like Facebook play as big a role as housing providers in the modern age.

“Parties who discriminate in the housing market, including those who engage in algorithmic bias, must be held accountable,” McCain said. “This type of behavior hurts us all.”

The lawsuit is the Justice Department’s first challenge to discrimination by an algorithm under the Fair Housing Act, which prohibits discrimination on the basis of race, sex, religion and other characteristics in the sale, rental or financing of housing.

With News Wire Services.
