SEC’s Scrutiny of AI Expands to Marketing Disclosures and Adviser Exams

BakerHostetler

Key Takeaways

  • U.S. Securities and Exchange Commission (SEC or Commission) Chair Gary Gensler’s public statements[1] about the securities industry’s use of artificial intelligence (AI) technologies continue to focus on both micro- and macro-level concerns, as well as potential disclosure issues, including so-called “AI washing.”
  • Recently, the SEC’s Division of Examinations launched a sweep examination of investment advisers with respect to their use and governance of AI technologies, which likely will inform the SEC’s rulemaking agenda announced in July 2023.
  • As the SEC continues to scrutinize how public companies, investment advisers, broker-dealers and other registrants use AI technologies, firms should carefully consider how adopting these technologies may affect their compliance obligations and promptly address any identified issues.

Chair Gensler Continues to Be Outspoken About the Risks of AI Technologies

Chair Gensler’s warnings about the potential risks of AI technologies in the securities industry[2] recently continued as he addressed a new concern: “AI washing,”[3] or misrepresenting a company’s use of AI technologies and their capabilities. On December 5, 2023, at The Messenger’s “AI: Balancing Innovation & Regulation” summit in Washington, D.C., Chair Gensler warned businesses against the practice with a simple “don’t do it,”[4] noting that such misrepresentations are governed by the same securities laws that require “full, fair and truthful” disclosures.[5]

This concern echoes warnings by the Federal Trade Commission, which in February announced that businesses should keep “AI claims in check” and guard against false claims or exaggerations related to AI.[6] The SEC has made clear that it is already examining these issues: at an October New York City Bar Association event, SEC staff cautioned that the Commission is looking into instances in which publicly traded companies and investment advisers claim a product uses AI when it does not. In addition, Chair Gensler reminded companies that basic disclosure concepts still apply to capital raising and securities transactions: fair and accurate descriptions of material risks, including those posed by AI, if applicable.

SEC Conducts Sweep Examinations of Financial Institutions’ Use of AI

As part of its continued focus on AI, and as major financial institutions explore the use of AI in their businesses, the SEC’s Division of Examinations is undertaking a sweep examination of investment advisers’ use and governance of AI technologies, including with respect to AI-related marketing materials, third-party providers and compliance training; controls related to conflicts of interest; and contingency plans for system failures. The information gathered through this sweep examination likely will inform the SEC’s rulemaking agenda on regulating the use of AI, including the proposed rule we previously covered, as well as potential enforcement priorities.

Looking Ahead

Given Chair Gensler’s repeated public statements and the Division of Examinations’ sweep examination, it is clear that the SEC will continue to scrutinize the use of AI in the securities industry. While Chair Gensler has acknowledged that AI has been around for many years and that the SEC itself uses AI, he continues to warn of micro- and macro-level issues and is not backing away from his prediction that it is nearly unavoidable that AI will trigger a financial crisis. On a micro level, Chair Gensler raises concerns about biases embedded in data and about conflicts that arise when AI optimizes for the investment adviser’s or broker-dealer’s interests ahead of its customers’. On a macro level, he focuses on his belief that the natural economics of these technologies will lead to monocultures in which large parts of our financial markets trade on and rely on the same data sets or base models, which would lead to market instability.

Broker-dealers, advisers and other regulated entities therefore should carefully consider the following issues when incorporating AI technologies into their operations:

  • Understand AI Models and Data Sets. Although Chair Gensler has stated repeatedly that the SEC is “technology neutral,” that does not excuse the industry from scrutinizing how AI may affect compliance obligations. Because the algorithms and the data on which a company relies each present risks, it is critical to understand both in order to implement adequate controls, craft accurate disclosures and satisfy any relevant standards of care to clients on an ongoing basis. These obligations may be addressed through internal policies and practices that likely will incorporate components of other existing laws[7] that already require outward disclosures of AI use and back-end audits of the AI (sometimes performed by independent third-party organizations).
  • Protect Data Privacy and Intellectual Property. It is similarly important to conduct diligence on data sets and the processes by which data is collected into them, to understand whether personal or other sensitive or protected information is included. This is particularly crucial where generative AI is used and may disclose that sensitive information without authorization. Organizations can mitigate these concerns by (i) understanding data provenance (including how data is collected and the applicable consent to use); (ii) adhering to data limitation principles where available (e.g., collecting only necessary data rather than everything available); (iii) deleting data when it is no longer needed; (iv) evaluating subsets of data; (v) tokenizing or anonymizing data when such limited data sets will suffice for AI use; and (vi) using humans to double-check outputs from AI systems before pushing outputs into production. (A brief illustrative sketch of the data limitation and tokenization steps appears after this list.)
  • Material Nonpublic Information and Market Manipulation. In addition to disclosure and conflict-of-interest issues, the SEC undoubtedly will be monitoring AI use for deceptive activities, including insider trading and market manipulation. Given this, industry participants should examine whether data sets include material nonpublic information or confidential information that is subject to restrictions on use or disclosure. Similarly, controls should be implemented to prevent trading algorithms and broadcast generative AI from engaging in any activity that could be viewed as artificially affecting the price of securities.
  • Revisit Disclosures and Revise As Necessary. Finally, industry participants should ensure that their disclosures and other marketing materials accurately represent their use of AI technologies and those technologies’ capabilities.
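To make the data-handling points above concrete, the following is a minimal, purely illustrative Python sketch of two of the controls described in the second bullet: keeping only the fields a model actually needs (data limitation) and replacing direct identifiers with keyed tokens before records reach any AI system. The field names, the SECRET_KEY and the prepare_for_model helper are hypothetical and are not drawn from SEC guidance; an actual implementation would be tailored to a firm’s own data, systems and compliance policies.

```python
import hmac
import hashlib

# Hypothetical secret used to tokenize direct identifiers; in practice this
# would be stored in a secrets manager, not in source code.
SECRET_KEY = b"replace-with-a-managed-secret"

# Data limitation allowlist: only the fields the model actually needs.
ALLOWED_FIELDS = {"account_type", "holdings_summary", "risk_profile"}

# Fields treated as direct identifiers and replaced with keyed tokens.
IDENTIFIER_FIELDS = {"client_name", "ssn", "email"}


def tokenize(value: str) -> str:
    """Replace an identifier with a deterministic, non-reversible token."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]


def prepare_for_model(record: dict) -> dict:
    """Apply data limitation and tokenization before any AI processing."""
    prepared = {}
    for field, value in record.items():
        if field in IDENTIFIER_FIELDS:
            prepared[field] = tokenize(str(value))
        elif field in ALLOWED_FIELDS:
            prepared[field] = value
        # All other fields are dropped, consistent with retaining only the
        # data that is actually needed for the AI use case.
    return prepared


if __name__ == "__main__":
    raw_record = {
        "client_name": "Jane Example",
        "ssn": "123-45-6789",
        "email": "jane@example.com",
        "account_type": "advisory",
        "risk_profile": "moderate",
        "browsing_history": ["..."],  # unnecessary data, dropped
    }
    print(prepare_for_model(raw_record))
```

In this sketch, the keyed token allows records to be correlated without exposing the underlying identifier, while the allowlist drops anything the model does not need; human review of model outputs before production use would sit downstream of a step like this.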

[1] “AI: Balancing Innovation & Regulation,” The Messenger (Dec. 5, 2023), available at https://themessenger.com/tech/ai-summit-balancing-innovation-regulation-messenger-watch-live.

[2] Jonathan A. Forman, James A. Sherer, Michelle N. Tanney, Teresa Goody Guillén and Shade Quailey, “SEC Proposes AI Rules for Broker-Dealers and Advisers After Chair’s Warnings,” BakerHostetler (Aug. 8, 2023), available at https://www.bakerlaw.com/insights/sec-proposes-ai-rules-for-broker-dealers-and-advisers-after-chairs-warnings/.

[3] Richard Vanderford, “SEC Probes Investment Advisers’ Use of AI,” The Wall Street Journal (Dec. 10, 2023), available at https://www.wsj.com/articles/sec-probes-investment-advisers-use-of-ai-48485279; see also Richard Vanderford, “SEC Head Warns Against ‘AI Washing,’ the High-Tech Version of ‘Greenwashing,’” The Wall Street Journal (Dec. 5, 2023), available at https://www.wsj.com/articles/sec-head-warns-against-ai-washing-the-high-tech-version-of-greenwashing-6ff60da9.

[4] Richard Vanderford, “SEC Head Warns Against ‘AI Washing,’ the High-Tech Version of ‘Greenwashing,’” The Wall Street Journal (Dec. 5, 2023), available at https://www.wsj.com/articles/sec-head-warns-against-ai-washing-the-high-tech-version-of-greenwashing-6ff60da9; see also Jackie Snow, “SEC Chief Gensler Warns of AI ‘Herding Effect’ Leading to a Market Crash,” The Messenger (Dec. 5, 2023), available at https://themessenger.com/business/sec-chief-gensler-warns-of-ai-herding-effect-leading-to-a-market-crash.

[5] Id.

[6] Richard Vanderford, “SEC Head Warns Against ‘AI Washing,’ the High-Tech Version of ‘Greenwashing,’” The Wall Street Journal (Dec. 5, 2023), available at https://www.wsj.com/articles/sec-head-warns-against-ai-washing-the-high-tech-version-of-greenwashing-6ff60da9; see also Michael Atleson, “Keep your AI claims in check,” Federal Trade Commission (Feb. 27, 2023), available at https://www.ftc.gov/business-guidance/blog/2023/02/keep-your-ai-claims-check.

[7] See, e.g., New York City Department of Consumer and Worker Protection Local Law 144 regarding automated employment decision tools.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© BakerHostetler | Attorney Advertising
