Thursday, June 23, 2022


(Photo: Creative Commons Attribution-Share Alike 3.0/Wikimedia Commons)
Meta has reached a same-day settlement agreement with the Department of Justice (DOJ) over a complaint regarding its advertising algorithms. 

The lawsuit, filed by the Assistant Secretary for Fair Housing and Equal Opportunity in collaboration with the Department of Housing and Urban Development (HUD), alleged Meta enabled advertisers to target housing ads based on demographic data protected by the Fair Housing Act (FHA). Advertisers were able to selectively show housing ads to users of a certain race, color, religion, sex, disability status, familial status, or national origin. Meta’s marketing tool, known as “Special Ad Audience” (previously “Lookalike Audience”), would then use machine learning to determine whether a user was “eligible” to see a housing ad. This prevented some users from seeing housing opportunities they otherwise might have pursued.
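To make the mechanism concrete, below is a minimal, hypothetical sketch of how a lookalike-style audience tool works in general: fit a model on an advertiser’s existing “seed” audience, then score other users by how closely they resemble it. This is an illustration of the general technique, not Meta’s actual system; every feature, dataset, and threshold in it is an assumption. The point is that even when protected attributes are excluded as inputs, correlated features can act as proxies for them.

# Illustrative sketch only; not Meta's "Special Ad Audience". All data and features are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical user features (interests, engagement signals, etc.).
# Protected attributes are deliberately excluded, but correlated features can still act as proxies.
all_users = rng.normal(size=(10_000, 20))
seed_audience = rng.normal(loc=0.5, size=(500, 20))  # the advertiser's existing customers

# Label seed users 1 and a sample of other users 0, then fit a similarity classifier.
X = np.vstack([seed_audience, all_users[:500]])
y = np.concatenate([np.ones(500), np.zeros(500)])
model = LogisticRegression(max_iter=1000).fit(X, y)

# "Eligibility" scores: how much each user resembles the seed audience.
scores = model.predict_proba(all_users)[:, 1]
eligible = scores > 0.5  # only users above the threshold would be shown the ad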

The complaint argues that Meta has engaged in disparate treatment by distinguishing users based on FHA-protected characteristics, and then designing algorithms that “[affect] Facebook users differently on the basis of their membership in protected classes.” 

(Photo: Will Francis/Unsplash)

Meta immediately worked with the DOJ to devise a settlement. The agreement, established Tuesday shortly after the original filing, requires that Meta pay a $115,054 fine and bring its “Special Ad Audience” to a screeching halt. Meta will have until the end of this year to create a new ad targeting tool. The replacement must “address disparities for race, ethnicity and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads.” Meta and HUD will work together to select a third-party reviewer who will verify the new tool’s compliance. 
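As a rough illustration of what “address disparities” could look like in practice, the sketch below compares the demographic makeup of an advertiser’s targeted audience with the makeup of the users an ad was actually delivered to, and reports the largest per-group gap. The metric, group labels, and numbers are hypothetical assumptions, not the compliance standard the third-party reviewer will apply.

# Hypothetical disparity check: compare targeted vs. actually-delivered audience composition.
from collections import Counter

def share_by_group(users):
    """Fraction of the audience falling into each demographic group."""
    counts = Counter(users)
    total = len(users)
    return {group: n / total for group, n in counts.items()}

def max_disparity(targeted, delivered):
    """Largest absolute gap, across groups, between targeted and delivered shares."""
    t, d = share_by_group(targeted), share_by_group(delivered)
    groups = set(t) | set(d)
    return max(abs(t.get(g, 0.0) - d.get(g, 0.0)) for g in groups)

# Hypothetical example: the targeted audience is split 50/50, but delivery skews.
targeted  = ["group_a"] * 500 + ["group_b"] * 500
delivered = ["group_a"] * 700 + ["group_b"] * 300
print(max_disparity(targeted, delivered))  # 0.2, i.e. a 20-point delivery gap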

Advertisers responsible for housing ads, however, won’t be able to use this targeting system at all. The settlement prohibits Meta from allowing housing advertisers to selectively show ads to users based on FHA-protected characteristics. Failure to comply with this requirement (or create a satisfactory replacement tool) will result in continued litigation in federal court.  

“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, the Principal Deputy Assistant Secretary for Fair Housing and Equal Opportunity at HUD, in a DOJ statement. “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable.”

The complaint and resulting settlement constitute the DOJ’s first foray into challenging algorithmic bias under the Fair Housing Act. But they won’t be the Department’s last. Last year Google was caught allowing advertisers to bar non-binary users from seeing job ads. The company quickly pledged to fix the issue, which it called “inadvertent.” But the oversight had already highlighted how supposedly fine-tuned ad targeting can disproportionately harm a particular demographic, or worse, be weaponized against members of a particular community.

from ExtremeTech https://ift.tt/EGsqbuQ
