Meta Agrees to Alter Ad Technology in Settlement With U.S.

SAN FRANCISCO – Meta agreed on Tuesday to change its ad technology and pay a $115,054 fine in a settlement with the Justice Department over claims that the company’s ad systems had discriminated against Facebook users by restricting who could see housing ads on the platform based on their race, gender and ZIP code.

Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method that aims to regularly check whether the audiences that are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, referred to as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering ads related to housing to specific protected classes of people.

“We will occasionally take a snapshot of marketers’ audiences, see who they are targeting and remove as much variance as possible from those audiences,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a significant technological advancement in how machine learning is used to deliver personalized ads.”

Facebook, which became a business colossus by collecting data on its users and letting advertisers target ads based on audience characteristics, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have allowed marketers to choose who sees their ads by using thousands of different characteristics, which has also let those advertisers exclude people who fall under a number of protected categories.

While Tuesday’s settlement concerns housing ads, Meta said it also plans to apply its new system to check the targeting of employment- and credit-related ads. The company has previously faced criticism for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.

U.S. Attorney Damian Williams said in a statement that “because of this groundbreaking lawsuit, Meta will – for the first time – change its ad delivery system to address algorithmic discrimination.” He added, “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”

Meta also said it would no longer use a feature called “special ad audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. The company said the tool was an early effort to fight bias, and that its new methods would be more effective.

The question of biased ad targeting has been debated especially around housing ads. In 2018, Ben Carson, who was then the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. A 2016 investigation by ProPublica had also pointed to the potential for ad discrimination on Facebook, showing that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.

In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even if an advertiser wanted the ad to be seen widely.

“Facebook is discriminating against people based upon who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

The HUD suit came amid broader pressure from civil rights groups, which argued that the vast and complicated ad systems underpinning some of the largest internet platforms have inherent biases built into them, and that tech companies like Meta, Google and others should do more to beat back those biases.

The area of study, known as algorithmic fairness, has been a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm on such biases for years.

In the years since, Facebook has restricted the types of categories that marketers can choose from when buying housing ads, reducing the number to hundreds, and has eliminated options to target ads based on gender, age and ZIP code.

Meta’s new system, which is still in development, will periodically check on who is being served ads for housing, employment and credit, and make sure those audiences match up with the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift the ads to be served to a broader and more varied audience.
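Public reporting describes the system only at a high level, and Meta has not published its implementation. As a rough illustration of the idea, not Meta’s actual code, a variance check of this kind could compare the demographic mix of the users who actually saw an ad against the mix of the eligible audience, and flag delivery for rebalancing when the two diverge. All function names, group labels and thresholds below are hypothetical.

```python
from collections import Counter

def demographic_shares(audience):
    """Fraction of the audience belonging to each demographic group."""
    counts = Counter(audience)
    total = len(audience)
    return {group: n / total for group, n in counts.items()}

def delivery_variance(eligible, delivered):
    """Total absolute gap between the eligible audience's demographic
    mix and the mix of people who actually saw the ad."""
    target = demographic_shares(eligible)
    actual = demographic_shares(delivered)
    groups = set(target) | set(actual)
    return sum(abs(target.get(g, 0.0) - actual.get(g, 0.0)) for g in groups)

def needs_rebalancing(eligible, delivered, threshold=0.10):
    """Flag ad delivery whose demographic skew exceeds the threshold."""
    return delivery_variance(eligible, delivered) > threshold

# Hypothetical snapshot: the eligible audience is evenly split between
# two groups, but delivery has skewed heavily toward one of them.
eligible = ["A"] * 50 + ["B"] * 50
delivered = ["A"] * 80 + ["B"] * 20
print(needs_rebalancing(eligible, delivered))  # skew of 0.6 exceeds 0.10
```

In this sketch, a flagged ad would then have its remaining impressions steered toward the underserved groups until the delivered mix tracks the eligible one, which is the behavior the settlement describes at a high level.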

Meta said it would work with HUD over the coming months to incorporate the technology into its ad targeting systems, and it has agreed to a third-party audit of the new system’s effectiveness.

The fine that Meta is paying in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.
