CFPB and other Federal Regulators Eye Regulation Aimed at Curbing Algorithmic Bias in Automated Home Valuations

Jenner & Block

[co-author: Jonathan Steinberg]

Late last month, the Consumer Financial Protection Bureau (CFPB) took another step toward adopting rules governing the use of artificial intelligence (AI) and algorithms in appraising home values. Specifically, the CFPB issued a detailed outline and questionnaire soliciting feedback from small business entities on a proposed rulemaking governing the use of Automated Valuation Models (AVMs).

The CFPB and other federal regulators[1] intend to adopt rules designed to: (1) ensure a high level of confidence in the estimates produced by AVMs; (2) protect against the manipulation of data; (3) avoid conflicts of interest; and (4) require random sample testing and reviews.[2] In addition, federal regulators are now considering whether to include express nondiscrimination quality control requirements for AVMs as a “fifth factor.” Once adopted, the new rules will apply to banks, mortgage lenders who use AVMs to make underwriting decisions, and mortgage-backed securities issuers, and are intended to protect homebuyers and homeowners who may be negatively impacted by inaccurate appraisals.

Automated Valuation Models

AVMs are defined by statute as “computerized model[s] used by mortgage originators and secondary market issuers to determine the collateral worth of a mortgage secured by a consumer’s principal dwelling.”[3]

According to the CFPB, AVMs are increasingly being used to appraise homes, a trend driven “in part by advances in database and modeling technology and the availability of larger property datasets.”[4] The benefits of better AVM technology and increased availability of data are their potential to reduce costs and decrease turnaround times in performing home valuations. However, like algorithmic systems generally, the use of AVMs also introduces risks, including issues of data integrity and accuracy.

Moreover, there are concerns that AVMs “may reflect bias in design and function or through the use of biased data [and] may introduce potential fair lending risk.”[5] Due to the “black box”[6] nature of algorithms, regulators fear that “without proper safeguards, flawed versions of these models could digitally redline certain neighborhoods and further embed and perpetuate historical lending, wealth, and home value disparities.”[7] “Overvaluing a home can potentially lead the consumer to take on an increased amount of debt that raises risk to the consumer’s financial well-being. On the other hand, undervaluing a home can result in a consumer being denied access to credit for which the consumer otherwise qualified, potentially resulting in a canceled sale, or offered credit at less favorable terms.”[8]

The Proposed Rule

On February 23, 2022, the CFPB released a 42-page outline detailing several possible rulemaking options, which provides a glimpse into the agencies’ thinking as to the scope of future regulation.

The proposed rule will be a joint interagency rule as the CFPB maintains enforcement authority over non-depository institutions, whereas the Board of Governors of the Federal Reserve System, the Comptroller of the Currency, the Federal Deposit Insurance Corporation, the National Credit Union Administration, and the Federal Housing Finance Agency maintain enforcement authority over “insured banks, savings associations, [] credit unions[,] . . . [and] federally regulated subsidiaries that financial institutions own and control.”[9]

To address concerns about data integrity, accuracy, and reliability, the CFPB is considering two options—one that is “principles-based” and one that is “prescriptive.” A principles-based approach would require entities to maintain their own “AVM policies, practices, procedures, and control systems” to meet the first four quality control standards noted above.[10] The CFPB acknowledges that this may be preferable as a rule with stringent requirements may not be able to keep up with evolving technology and could present a significant burden for smaller entities. On the other hand, if the agencies decide to promulgate a prescriptive rule, they contemplate requiring controls related to “fundamental errors” that could produce inaccurate outputs, “management oversight of the availability, usability, integrity, and security of the data used,” a clear separation between persons “who develop, select, validate, or monitor an AVM” and employees involved in the “loan origination and securitization process,” and ongoing validation of the entities’ AVM through “random sample testing and reviews.”[11]

As part of the same proposed rule, the CFPB and the aforementioned federal regulators are also eyeing adding a nondiscrimination quality control under their authority to “account for any other such factor . . . determine[d] to be appropriate.”[12] The CFPB recognizes that a standalone nondiscrimination factor may be unnecessary as nondiscrimination may already be encompassed in three of the first four statutorily stipulated quality controls. Additionally, AVMs are subject to federal nondiscrimination laws such as the Equal Credit Opportunity Act (ECOA) and the Fair Housing Act (FHA). However, the CFPB contends that “an independent requirement for institutions to establish policies and procedures to mitigate against fair lending risk in their use of AVMs . . . may help ensure the accuracy, reliability, and independence of AVMs for all consumers and users.”[13]

To address lending discrimination, federal regulators are considering both a flexible, principles-based approach, similar to the approach described above, and a prescriptive nondiscrimination rule. A principles-based approach would provide companies with “the flexibility to design fair lending policies, practices, procedures, and control systems tailored to their business model”[14] and “commensurate with an institution’s risk exposures, size, business activities, and the extent and complexity of its use of AVMs.”[15] In contrast, a prescriptive rule would “specify[] methods of AVM development (e.g., data sources, modeling choices) and AVM use cases” in order to mitigate the “risks that lending decisions based on AVM outputs generate unlawful disparities.”[16]

Last month’s announcement was triggered by the CFPB’s duty to convene a Small Business Review Panel prior to issuing a proposed rule that “could have a significant economic impact on a substantial number of small entities.”[17] The outline released by the CFPB was meant to elicit feedback from small business entities, such as mortgage loan brokers with annual receipts at or below $8 million, real estate credit companies with annual receipts at or below $41.5 million, and secondary market financing companies and other non-depository credit intermediation companies with annual receipts also at or below $41.5 million. For these small entities, the outline presents over forty questions and an early opportunity to influence the rulemaking process.[18]

Next Steps

As is evident from the CFPB’s outline, a great deal remains in flux as regulators continue to contemplate their options. Because the CFPB is subject to heightened rulemaking processes for regulations affecting smaller entities, we have this early glimpse into the agencies’ thinking on AVM algorithmic bias. In the coming months, the CFPB will convene the Small Business Review Panel, release the Panel’s report, and work with its federal partners in drafting a proposed rule subject to the standard notice and comment process.


[1] The CFPB shares enforcement authority over AVMs with the Board of Governors of the Federal Reserve System, the Comptroller of the Currency, the Federal Deposit Insurance Corporation, the National Credit Union Administration, and the Federal Housing Finance Agency.

[2] The Dodd-Frank Wall Street Reform and Consumer Protection Act requires federal regulators to adopt rules ensuring that AVMs satisfy certain quality control standards designed to: “(1) ensure a high level of confidence in the estimates produced by automated valuation models; (2) protect against the manipulation of data; (3) seek to avoid conflicts of interest; (4) require random sample testing and reviews; and (5) account for any other such factor that the agencies determine to be appropriate.” 12 U.S.C. § 3354(a) (2010).

[3] § 3354(d).

[4] Consumer Fin. Prot. Bureau, Outline of Proposals and Alternatives Under Consideration, Small Business Advisory Review Panel For Automated Valuation Model (AVM) Rulemaking, 2 (Feb. 23, 2022), https://files.consumerfinance.gov/f/documents/cfpb_avm_outline-of-proposals_2022-02.pdf.

[5] Id. at 24.

[6] Id.

[7] Press Release, Consumer Fin. Prot. Bureau, Consumer Financial Protection Bureau Outlines Options To Prevent Algorithmic Bias In Home Valuations (Feb. 23, 2022), https://www.consumerfinance.gov/about-us/newsroom/cfpb-outlines-options-to-prevent-algorithmic-bias-in-home-valuations/.

[8] Consumer Fin. Prot. Bureau, Outline of Proposals and Alternatives Under Consideration, Small Business Advisory Review Panel For Automated Valuation Model (AVM) Rulemaking, at 24.

[9] Id. at 2.

[10] Id. at 21.

[11] Id. at 22.

[12] 12 U.S.C. § 3354(a) (2010).

[13] Consumer Fin. Prot. Bureau, Outline of Proposals and Alternatives Under Consideration, Small Business Advisory Review Panel For Automated Valuation Model (AVM) Rulemaking, at 25.

[14] Id.

[15] Id.

[16] Id.

[17] Id. at 3.

[18] Id. at 29.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Jenner & Block | Attorney Advertising
