Treliant Takeaway: DOJ Settles Allegations of Algorithmic Bias in Housing Advertising
Read the Press Release Here
- Source: justice.gov
Treliant understands fair housing and the complexities of today’s emerging technologies. If your financial services company needs assistance evaluating your practices, including your use of social media advertising, we can help.
On June 21, 2022, the Department of Justice settled allegations of discriminatory advertising of housing raised by the Department of Housing and Urban Development (HUD) against a large social media company.
In 2018, HUD initiated an investigation alleging the social media company violated Fair Housing Act (FHA) prohibitions on discrimination in advertising housing. That investigation resulted in an administrative complaint and a Charge of Discrimination (HUD Complaint) under the FHA, based on targeting options and delivery processes for housing advertisements. The social media platform elected to have the HUD Complaint decided in federal district court.
The DOJ complaint alleges the social media company engaged in a pattern and practice of prohibited discrimination in violation of the FHA in three ways:
- Trait-Based Targeting – Until at least 2019, the social media company encouraged advertisers to target ads by including or excluding users based on FHA-protected characteristics.
- Lookalike Advertising – The social media company invites advertisers to use its proprietary tool called “Lookalike Audiences.” When an advertiser uses this tool, an algorithm analyzes the advertiser’s current source audience and identifies additional platform users that are part of a “lookalike audience” with similar demographic and lifestyle characteristics. Through at least 2019, lookalike audiences were derived using characteristics protected under the FHA.
- Personalization Algorithms – The social media company uses a proprietary algorithm to determine which members of an advertiser’s Eligible Audience actually receive a housing ad in their news feed. These personalization algorithms include FHA-protected characteristics in their input data and may use those characteristics to predict user engagement with housing-related advertisements, resulting in disparate impact.
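The delivery disparity at issue in the third allegation can be illustrated with a minimal sketch. The function, group names, and counts below are hypothetical assumptions for illustration only, not the platform's actual system or data; the sketch simply compares each demographic group's share of an advertiser's eligible audience with its share of delivered impressions, the kind of gap the settlement requires the new system to address.

```python
# Illustrative sketch only: hypothetical groups and counts, not the
# platform's actual delivery system or data.

def delivery_disparity(eligible, delivered):
    """Return each group's delivered-impression share minus its
    eligible-audience share (a positive gap means over-delivery)."""
    total_eligible = sum(eligible.values())
    total_delivered = sum(delivered.values())
    return {
        group: delivered.get(group, 0) / total_delivered
               - eligible[group] / total_eligible
        for group in eligible
    }

# Hypothetical eligible-audience sizes and delivered impressions by group.
eligible = {"group_a": 6000, "group_b": 4000}
delivered = {"group_a": 4500, "group_b": 500}

for group, gap in sorted(delivery_disparity(eligible, delivered).items()):
    print(f"{group}: {gap:+.2%}")
```

In this hypothetical, group_a makes up 60% of the eligible audience but receives 90% of the impressions, a +30-point gap that a compliance review of ad delivery would flag.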
The settlement agreement requires the social media company to terminate the use of Lookalike Advertising for housing ads; develop a new system for housing ads to reduce disparities in race, ethnicity, and sex between advertisers’ targeted audiences and the audiences that actually receive ads on the social media platforms; cease using any FHA-protected characteristics in targeting housing advertisements (and notify DOJ before adding any targeting options); select an independent monitor to verify that the new advertising system complies with the settlement agreement; and pay a civil money penalty of $115,054.
This settlement adds to the growing evidence of regulatory and enforcement focus on algorithmic bias and discrimination. If your firm needs help assessing its risk in these areas, Treliant can help.