Justice Department announces settlement with Meta (formerly Facebook) to resolve alleged FHA violations arising from Meta’s targeted advertising system

Ballard Spahr LLP

The Department of Justice announced that it has entered into a settlement with Meta Platforms Inc., formerly known as Facebook Inc., to resolve allegations that Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). The proposed agreement was filed in a New York federal district court simultaneously with a complaint alleging that Facebook’s housing advertising system discriminated against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin. The settlement highlights the need for companies to be mindful of fair lending risk when formulating their social media and other advertising plans.

The lawsuit arose from an administrative complaint alleging FHA violations by Facebook, filed with HUD in 2018 by HUD’s Assistant Secretary for Fair Housing and Equal Opportunity. After the Secretary issued a Charge of Discrimination under the FHA, Facebook elected to have the Charge decided in a federal district court. The complaint references Facebook’s 2019 settlement of a lawsuit filed by the National Fair Housing Alliance and several other consumer advocacy groups challenging Facebook’s advertising practices. In that settlement, Facebook agreed to remove age, gender, and zip code targeting for housing, employment, and credit-related advertisements. The DOJ alleges that the changes made by Facebook were insufficient to stop discriminatory ad targeting.

The complaint alleges that Facebook engaged in discrimination in violation of the FHA through the following three aspects of the ad delivery system that it designed and offered to advertisers:

  • Trait-based targeting. This involved encouraging advertisers to target ads by including or excluding Facebook users based on FHA-protected characteristics that Facebook, through its data collection and analysis, attributed to those users.
  • Look-alike targeting. This involved a machine-learning algorithm that advertisers could use to find Facebook users who “look like” an advertiser’s source audience (i.e., an advertiser’s identified audience of Facebook users) based in part upon FHA-protected characteristics. (A simplified sketch of this mechanism appears after this list.)
  • Delivery determinations. This involved applying machine-learning algorithms to help determine which subset of an advertiser’s target audience would actually receive the ad based in part upon FHA-protected characteristics.
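
To make the look-alike mechanism concrete, below is a minimal, hypothetical sketch of how similarity matching against an advertiser’s source audience can reproduce that audience’s demographic skew. It is not Meta’s actual system; the features, data, and proxy trait are all invented solely for illustration.

```python
# Hypothetical sketch of "look-alike" audience selection. Nothing here is
# Meta's actual system; features and the proxy trait are invented to show
# how similarity matching can reproduce a source audience's demographics.
import numpy as np

rng = np.random.default_rng(0)

# Invented user feature vectors (e.g., interests, engagement signals).
# Assume one feature correlates with an FHA-protected characteristic.
n_users = 1000
features = rng.normal(size=(n_users, 8))
protected = (features[:, 0] > 0).astype(int)  # invented proxy trait

# Advertiser's "source audience": users the advertiser already reaches.
# Here it is drawn entirely from one group, so its centroid is skewed.
source_idx = rng.choice(np.where(protected == 1)[0], size=50, replace=False)
centroid = features[source_idx].mean(axis=0)

# Rank all remaining users by cosine similarity to the source centroid
# and take the top 100 as the "look-alike" audience.
candidates = np.setdiff1d(np.arange(n_users), source_idx)
sims = features[candidates] @ centroid / (
    np.linalg.norm(features[candidates], axis=1) * np.linalg.norm(centroid)
)
lookalike = candidates[np.argsort(sims)[-100:]]

# The look-alike audience inherits the source audience's skew.
print("protected share, all users:     ", protected.mean())
print("protected share, look-alike set:", protected[lookalike].mean())
```

Because the invented source audience skews on the proxy trait, the top-ranked “look-alikes” inherit that skew even though no protected characteristic is used explicitly, which is the kind of dynamic the complaint describes.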

The complaint sets forth in great detail the alleged steps Facebook took in implementing each of these aspects of its ad delivery system. The DOJ alleges that through its design and use of these aspects of its ad delivery system, Facebook engaged in both intentional discrimination based upon FHA-protected characteristics and conduct that had a disparate impact on the basis of FHA-protected characteristics. It claims that Facebook’s alleged actions, policies, and practices violated the FHA by (1) making dwellings unavailable to persons because of FHA-protected characteristics, (2) making, printing, or publishing advertisements with respect to the sale or rental of dwellings that indicate a preference, limitation, or discrimination based on FHA-protected characteristics, and (3) representing to a person, because of an FHA-protected characteristic, that a dwelling is not available for inspection, sale, or rental when such dwelling is in fact available.

The settlement agreement includes the following terms:

  • By December 31, 2022, Meta must stop using the “look-alike” advertising tool for determining which Facebook users are eligible to receive housing ads.
  • By December 16, 2022, Meta must develop a new system for housing ads to address disparities for race, ethnicity, and sex between advertisers’ targeted audiences and the group of Facebook users to whom Facebook’s personalization algorithms actually deliver the ads. (A toy illustration of this kind of targeted-versus-delivered comparison appears after this list.) If the parties agree that the new system sufficiently addresses the disparities, Meta will fully implement the new system by December 31, 2022. If the parties do not agree, the settlement agreement will terminate and the parties will be returned to the legal positions they occupied before the agreement’s effective date.
  • If the new system is implemented, the parties will select an independent, third-party reviewer to verify on an ongoing basis whether the new system is meeting the compliance standards agreed to by the parties.  Meta must provide the reviewer with any information necessary to verify compliance with those standards.  The court will have ultimate authority to resolve disputes over the information that Meta must disclose.
  • Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics and must notify the DOJ if it intends to add any targeting options.  If the parties cannot agree on whether a targeting option satisfies the standards set forth in the agreement, Meta cannot add the option without court approval.
  • Meta must pay a civil penalty of $115,054, which the DOJ calls “the maximum penalty available under the [FHA].”
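
The settlement agreement itself defines the compliance standards, which are not spelled out here; but a targeted-versus-delivered comparison of the kind referenced in the second bullet might, in spirit, look like the following toy check. The function name, group labels, and numbers are all invented for illustration.

```python
# Hypothetical illustration only: the settlement's actual compliance
# standards and metrics are agreed between the parties and not public in
# this alert. This toy check compares the demographic shares of the
# audience an advertiser targeted with the shares of the audience the
# delivery system actually reached.
def max_share_gap(targeted_shares, delivered_shares):
    """Largest absolute gap in any group's share between targeting and delivery."""
    return max(abs(delivered_shares[g] - targeted_shares[g]) for g in targeted_shares)

# Invented demographic breakdowns (shares sum to 1 within each audience).
targeted  = {"group_a": 0.40, "group_b": 0.35, "group_c": 0.25}
delivered = {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15}

print(f"max share gap: {max_share_gap(targeted, delivered):.2f}")
# -> max share gap: 0.15 (delivery skews toward group_a relative to targeting)
```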

While this case did not involve the CFPB, under Director Chopra’s leadership the CFPB has issued a series of orders to Big Tech companies, including Facebook, seeking information about their use of consumer payments data, including any use for behavioral targeting. Also under Director Chopra’s leadership, the CFPB has regularly sounded alarms about the potential for discrimination arising from the use of so-called “black box” credit models that rely on algorithms or other artificial intelligence tools. In May 2022, the CFPB issued a Circular addressing ECOA adverse action notice requirements in connection with credit decisions based on algorithms. In February 2022, the CFPB issued an outline entitled Small Business Advisory Review Panel for Automated Valuation Model (AVM) Rulemaking, in which it specifically focused on the potential for algorithmic bias arising from AVMs.



© Ballard Spahr LLP | Attorney Advertising
