On June 21, 2022, the U.S. Department of Justice (DOJ) filed a complaint and settlement agreement in federal court alleging that a social media company violated the Fair Housing Act (FHA), 42 U.S.C. §§ 3601-3619, by employing tools that allow advertisers of housing-related services to exclude certain users from seeing their ads based on the users' race, color, religion, sex, disability, familial status, and national origin, all of which are FHA-protected characteristics. Under the settlement agreement, the social media company has agreed to cease using, or to modify its use of, the tools, and will pay approximately $115,000 in civil money penalties.
The FHA and its implementing regulations prohibit, among other things, making or publishing advertisements with respect to the sale or rental of dwellings that indicate a preference, limitation, or discrimination based on race, color, religion, sex, disability, familial status, or national origin. The complaint alleges, however, that the social media company, through its ad targeting and delivery system, delivers housing-related ads to some users while withholding them from others based on FHA-protected characteristics or proxies for those characteristics. Specifically, the complaint alleges that the company's proprietary "special ad audience" tool uses a machine-learning algorithm to construct audiences for housing and credit advertisers from data collected about the company's users, data that encodes FHA-protected characteristics or information closely related to those characteristics. The complaint further alleges that an earlier version of the tool, the "lookalike audience" tool, directly used demographics such as race, sex, religion, age, ZIP code, and group membership to select the audiences to whom ads were delivered. For example, testing allegedly revealed that where an advertiser included only men in its "source audience" (the targeted audience), the lookalike audience tool enabled the advertiser to deliver its ad to a population that was 99% male. Under the settlement agreement, the social media company will stop using the special ad audience tool.
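The mechanism alleged above, that an audience-expansion algorithm can reproduce a source audience's demographic skew even without using a protected characteristic directly, can be illustrated with a deliberately simplified sketch. The data, the proxy feature, and the nearest-centroid matching below are all hypothetical stand-ins, not the company's actual algorithm:

```python
# Toy illustration (hypothetical data and logic): a "lookalike" selector that
# matches users on a feature correlated with sex -- a proxy -- recreates the
# all-male skew of the source audience without ever reading the sex field.
import random

random.seed(0)

# Simulated user pool: each user has a sex and a behavioral "proxy" feature
# (e.g., group memberships) that happens to correlate strongly with sex.
users = []
for _ in range(1000):
    sex = random.choice(["male", "female"])
    proxy = (1.0 if sex == "male" else 0.0) + random.gauss(0, 0.2)
    users.append({"sex": sex, "proxy": proxy})

# The advertiser's source audience contains only men.
source = [u for u in users if u["sex"] == "male"][:100]
centroid = sum(u["proxy"] for u in source) / len(source)

# "Lookalike" audience: the 200 users whose proxy feature is closest to the
# source centroid. Sex is never consulted, yet the skew carries over.
lookalike = sorted(users, key=lambda u: abs(u["proxy"] - centroid))[:200]
male_share = sum(u["sex"] == "male" for u in lookalike) / len(lookalike)
print(f"male share of lookalike audience: {male_share:.0%}")
```

On this synthetic data the expanded audience comes out almost entirely male, mirroring the 99% figure the complaint attributes to testing of the actual tool.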
In addition, the complaint alleges that the social media company violates the FHA by offering housing-related advertisers an interactive tool called the "ads manager," which allows an advertiser to upload a proposed ad and define its eligible audience through drop-down menus that include or exclude users by their demographics, interests, and other characteristics. To address this allegation, the social media company agrees under the settlement agreement not to provide housing advertisers with any targeting options that directly describe or relate to FHA-protected characteristics, and to develop a new system for housing ads that addresses disparities in race, ethnicity, and sex between advertisers' targeted audiences and the group of users to whom the ads are actually delivered.
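The settlement's required disparity-reduction system is not publicly specified, but the underlying measurement it implies, comparing the demographic makeup of the eligible (targeted) audience against the users actually shown the ad, can be sketched. The function names, the 0.3 gap, and the flagging logic below are illustrative assumptions, not the company's or the DOJ's methodology:

```python
# Hypothetical disparity check: compare category shares (e.g., by sex) between
# the advertiser's eligible audience and the delivered audience, and report
# the largest absolute gap so oversized skews can be flagged for correction.

def demographic_shares(audience, key):
    """Fraction of the audience falling in each category of `key`."""
    counts = {}
    for user in audience:
        counts[user[key]] = counts.get(user[key], 0) + 1
    total = len(audience)
    return {category: n / total for category, n in counts.items()}

def max_disparity(eligible, delivered, key):
    """Largest absolute share gap between eligible and delivered audiences."""
    e = demographic_shares(eligible, key)
    d = demographic_shares(delivered, key)
    return max(abs(e.get(k, 0) - d.get(k, 0)) for k in set(e) | set(d))

# Eligible audience is 50/50 by sex, but delivery skewed 80/20.
eligible = [{"sex": "male"}] * 50 + [{"sex": "female"}] * 50
delivered = [{"sex": "male"}] * 80 + [{"sex": "female"}] * 20
print(round(max_disparity(eligible, delivered, "sex"), 2))  # prints 0.3
```

A production system would presumably act on such a gap, e.g., by adjusting delivery until the delivered shares fall within an agreed tolerance of the targeted shares, but the settlement leaves those details to the new system the company must develop.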
The settlement follows recent statements by leadership at both the DOJ and the Consumer Financial Protection Bureau regarding their shared priority of pursuing enforcement actions for fair lending violations and assessing the propriety of using algorithms in lending decisions.