Meta agrees to stop algorithmic discrimination after settlement

The Department of Justice reached a settlement with Facebook owner Meta after accusing the company of allowing landlords to target housing ads on its platforms to audiences selected by race and other characteristics.

According to The Washington Post, the lawsuit, first brought by the Trump administration in 2019, alleged that this algorithmic discrimination violates the Fair Housing Act (FHA).

The statement released by the Department of Justice on its website says, “Under the settlement, Meta will stop using an advertising tool for housing ads (known as the ‘Special Ad Audience’ tool) that, according to the department’s complaint, relies on a discriminatory algorithm.”

The Department of Justice not only required Meta to stop this practice but also instructed the company to build a new system. As per its statement, “Meta also will develop a new system to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. That system will be subject to Department of Justice approval and court oversight.”

Facebook’s Vice President of Civil Rights, Roy Austin, said the company will use machine learning to make its ad distribution more equitable. “Discrimination in housing, employment, and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others,” he added.

The settlement is being called historic because Meta has agreed to change its ad delivery system to end algorithmic discrimination.
