Twenty-One State Attorneys General and the District of Columbia Back HHS’s Plan to Hold Hospitals and Other Providers Liable for Discriminatory Use of Clinical Algorithms

King & Spalding

On October 3, 2022, the Attorneys General for California, New York, and Massachusetts, in collaboration with the Attorneys General from 18 other states and the District of Columbia (State Attorneys General), submitted a comment letter expressing their support for various proposals aimed at mitigating discrimination in healthcare set forth in HHS’s August 4, 2022 proposed rule. Those proposals include HHS’s first-time effort to address discrimination in clinical algorithms.

Section 1557 of the Affordable Care Act, codified at 42 U.S.C. § 18116, prohibits “covered entities” from discriminating based on race, color, national origin, age, disability, or sex in health programs or activities. Section 1557 defines “covered entities” to include hospitals, health clinics, health insurance issuers, state Medicaid agencies, community health centers, physician practices, and home health care agencies—that is, any healthcare entity or organization that receives federal financial assistance (e.g., Medicare and Medicaid). On August 4, 2022, HHS issued a proposed rule delineating several changes to the agency’s regulations implementing § 1557, including a proposal to hold covered entities liable, for the first time, for the discriminatory use of algorithms in clinical decision-making.

HHS explained that many clinical decision-making tools use race and ethnicity as input variables, rendering their output explicitly “race (or ethnicity)-based.” According to HHS, some providers and covered entities may rely too heavily on race-based algorithms in clinical decision-making, potentially resulting in less favorable treatment for certain groups of patients as compared to others, which may, under certain circumstances, result in “demonstrable harm.” The proposed rule makes clear that it covers a wide range of clinical algorithms (e.g., from flowcharts and clinical guidelines to complex computer algorithms, decision support interventions, and models) used across an equally wide range of healthcare functions, from diagnosis to administrative operations and everything in between. HHS requested that interested stakeholders and others submit comments on the proposed rule on or before October 3, 2022.

As noted above, the October 3, 2022 comment letter submitted by the Attorneys General from California, Massachusetts, and New York, on behalf of themselves and the Attorneys General from 18 other states and the District of Columbia, expressed their support for the adoption of a regulation addressing algorithm-based discrimination. The State Attorneys General described it as a “welcome” proposal, observing that the “proposed regulation appropriately puts covered entities on notice of the relevance of Section 1557 to clinical algorithms, and is likely to increase the healthcare sector’s attention and investment into clinical review and auditing of these types of processes.” The State Attorneys General also agreed with HHS’s determination that the proposed regulation represents not a new prohibition, but a “clarification and communication” to covered entities of their responsibility regarding one specific form of discrimination. Perhaps most notable is the comment letter’s statement that state agencies can and will address issues of algorithmic bias in ways that are more specific or broader than HHS’s approach and, in some cases, states may decide to offer broader protection to vulnerable groups than federal law provides.

The State Attorneys General’s comment letter comes on the heels of a press release issued by California Attorney General Rob Bonta announcing that his office had sent letters to 30 hospitals across California requesting information (RFI) about how healthcare facilities and other providers are identifying and addressing racial and ethnic disparities in commercial decision-making tools. The scope of the Attorney General’s RFI is extensive and wide-ranging, requesting information on all commercially available or purchased decision-making tools, products, software systems, or algorithmic methodologies currently in use at each hospital for various functions, including clinical decision support as well as population health management, utilization management, operating room scheduling, and payment management. The deadline for hospitals to submit this information to the Attorney General is October 16, 2022. Notably, the California Attorney General’s press release warns that this RFI is just the first step in determining whether commercial healthcare algorithms have discriminatory impacts based on race and ethnicity.

Takeaway

As both HHS’s proposed rule and the State Attorneys General’s comment letter make clear, covered entities are on “notice” that they are responsible for ensuring that their use of clinical algorithms does not result in discrimination and that their failure to do so may result in liability under federal and state antidiscrimination laws. Given the broad range of algorithmic tools used by healthcare providers, hospitals, laboratories, pharmacies, and other covered entities, they should heed this warning and take proactive measures to assess their use of these tools and implement processes and procedures to mitigate the risk of liability.

Written by:

King & Spalding