Federal Trade Commission Looking at New Rules to Combat Discrimination in Algorithms and Poor Security Practices

The brief FTC note indicates the agency will look to combat poor security practices, protect against the misuse of personal information, and guard against discrimination arising from algorithmic decision-making.

Last month, the Federal Trade Commission (“FTC”) submitted to the Office of Management and Budget (“OMB”) a brief summary of potential rules it may pursue over the course of 2022. The summary comes on the heels of a letter from FTC Chair Lina Khan to Sen. Richard Blumenthal, chair of the Senate Commerce, Science, and Transportation Committee’s Subcommittee on Consumer Protection, Product Safety, and Data Security.

According to Chair Khan, any such rulemaking would be conducted under an updated, streamlined rulemaking process. The updated process eliminates time-consuming elements of rulemaking (such as staff reports and analysis) that are not specifically required by law. The hope is that the streamlined process will allow rules and regulations to be more timely, instead of one or two steps behind the technology they regulate.

While no draft or final rule has been published yet, the summary and letter show an increased interest from the FTC in further privacy and data protection regulation.

Congress has been calling on the federal government to implement new rules and regulations on such matters, and the letter specifically addresses those calls. Congress has also proposed a litany of new laws, some aimed at tech antitrust and others at omnibus privacy protection. However, no legislation is close to being taken up for serious consideration.

Possible New Privacy Regulation

The summary and letter highlight three specific areas where the FTC will look into adopting new regulations: (1) security practices; (2) protection against the misuse of personal information; and (3) protection against discrimination that may arise from algorithmic decision-making.

First, the FTC may take new action against poor security practices. This is in line with other recent FTC and federal government action, which has heightened the focus on specific security measures that businesses must adopt to guard against the rising threat that cybersecurity incidents pose to individuals and the U.S. For example, the FTC recently issued an amended Safeguards Rule requiring financial institutions to implement specific security measures (such as multi-factor authentication and encryption).

Future FTC rulemaking in this area may apply specific security standards to a broader swath of businesses in order to combat lax security practices.

Second, to protect consumers from the misuse or abuse of their personal information, the FTC may consider new rules on so-called “dark patterns.” Dark patterns are ploys that businesses use to mislead consumers into purchasing certain goods or services, or into agreeing to certain contracts or terms. Any deceptive practice built into a business’s user interface that—intentionally or unintentionally—obscures or subverts a consumer’s independent choices is considered a dark pattern. This can also include instances where a business misleads a consumer into giving away their personal information.

For example, if a business’s systems interpret a lack of a choice, or silence, as consent, that practice could be considered a dark pattern. According to Chair Khan, any FTC regulation on dark patterns would address the “serious shortcomings” of the notice-and-choice privacy approach on which U.S. law is largely based.

Third, the FTC will look into rules and regulations to protect consumers from discrimination in algorithmic decision-making. Because algorithms are normally considered trade secrets and are traditionally protected by intellectual property law, they do not face thorough, independent vetting.

Here, the FTC could consider rules that protect consumers from being “disfavored” by algorithms based on an individual’s protected status, including religion, race, medical status, gender, or sexual orientation. Businesses could be required to properly review their algorithms and implement policies and procedures to ensure that algorithms are developed in a way that guards against intentional or unintentional discrimination.

Moving Forward

On both sides of the aisle and across all branches, the federal government is increasing its focus on privacy and data protection. As new rules and regulations are proposed and adopted, businesses will need to stay on top of evolving, and likely more onerous, privacy and cybersecurity standards.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Benesch | Attorney Advertising

Written by:

Benesch
