Accuracy and Oversight: Managing the Risk of AI “Hallucinations”
Generative AI models may produce information that appears plausible but is ultimately false, misleading, or misinterpreted (commonly referred to as “hallucinations”). Investment advisers, as fiduciaries, are subject to a duty of care to provide investment advice in the best interest of their clients. Investment advisers must ensure the investment advice provided to clients is based on factually sound and accurate information. Thus, investment advisers should implement processes to validate information generated by AI, especially when such information informs the development of investment advice for clients. For example, if an investment adviser uses an AI tool to summarize information or data that will inform investment decisions, it should consider how to incorporate human reviewers into that process and conduct periodic testing to confirm that the output from the tool is accurate.
Confidentiality Considerations: Vendor Terms, Confidentiality, and Data Retention
Investment advisers must consider how any arrangements with an AI vendor will treat confidential information. Information exposed to an AI tool, for example, can be used to train the AI tool’s model and potentially be discoverable by other users, both internally and externally. An investment adviser, as a fiduciary, has a duty to safeguard client confidential information. Typically, investment advisory agreements with clients, as well as other service provider agreements, contain confidentiality provisions. RIAs are also subject to Regulation S-P, which imposes disclosure and compliance obligations regarding the safety and privacy of client information.
Investment advisers should confirm that agreements with an AI vendor include confidentiality provisions that sufficiently protect the information uploaded to the AI tool (including the investment adviser’s own confidential or proprietary information, as well as its clients’ confidential information) from model training or unrelated processing. Investment advisers should also conduct initial and ongoing due diligence into the vendor’s practices, including an assessment of the vendor’s data security practices.
AI use can expose sensitive information across internal teams or external vendors if not properly managed. Both RIAs and ERAs should apply strict access controls to restrict data visibility by role or department and implement data segregation mechanisms to prevent unauthorized exposure. Advisers should also periodically audit AI logs, prompts, and usage patterns for compliance.
AI platforms often function conversationally, creating iterative records that give rise to other risks, in addition to the confidentiality concerns noted above. For example, information entered into such tools can be exposed to third parties, potentially waiving attorney-client privilege.
Investment advisers should also consider vendor data retention risks, including the extent to which data is processed and stored on vendor cloud servers and the limitations of so-called “zero-retention” settings. For example, even vendors that claim they do not retain data could become subject to court orders or other compelled disclosures that would require the vendor to retain and produce outputs or processed documents that contain sensitive information (e.g., court orders mandating preservation of user data, including chats deleted by user request or due to privacy obligations). Investment advisers should weigh these potential risks in light of the type of information they intend to expose to the AI tools.
Advisers Act Compliance and Disclosure Considerations
For RIAs, the use of AI intersects directly with various obligations under the Advisers Act, including the following:
- Disclosure. Form ADV Part 2A requires an RIA to disclose its methods of analysis and investment strategies used to formulate investment advice or manage assets. RIAs should consider whether their Form ADV Part 2A brochure should describe AI-based analytical methods or investment processes, especially if AI is used in connection with the formulation of investment decisions.
- Recordkeeping. Rule 204-2(a)(7) under the Advisers Act requires RIAs to maintain records related to their investment advisory business activities, including certain types of written communications (e.g., communications regarding investment recommendations that are made or proposed). RIAs should consider whether they have processes in place that will capture and retain records of AI output, especially if such information is used in connection with client conversations or the formulation of investment advice. As noted above, because AI platforms often function conversationally, RIAs should be mindful that any records created and maintained could be subject to review by regulators, and thus any AI output that does not accurately reflect the intent of the users could expose the RIA to legal and regulatory risk.
- Conflicts of Interest. Investment advisers, as fiduciaries, are subject to a duty of loyalty and must address any potential conflicts of interest arising from AI bias or vendor relationships. In considering whether to implement any new AI tool—especially in connection with the formulation of investment advice—an investment adviser should conduct due diligence to assess whether the tool may produce biased results that give rise to conflicts of interest that must be disclosed to clients.
- Marketing. Investment advisers should avoid “AI-washing” or overstating AI capabilities in offering materials and investor communications. References to AI in marketing or other public communications must comply with Rule 206(4)-1 under the Advisers Act and may draw heightened scrutiny from SEC examination staff. The SEC has recently charged investment advisers for making false or misleading statements about their AI capabilities.[2] RIAs should adopt policies and procedures requiring enhanced review of materials discussing any AI-related capabilities. As noted in the SEC’s 2025 Exam Priorities, examinations may closely assess investment advisers’ AI-related compliance policies and procedures and disclosures to investors.
- Compliance Program. Rule 206(4)-7 under the Advisers Act requires RIAs to adopt and implement written policies that are reasonably designed to prevent the RIA and its supervised persons from violating the Advisers Act and rules thereunder. These policies and procedures must be tailored to the RIA’s business. Thus, RIAs that use AI tools in connection with their business must implement policies and procedures that address the regulatory and compliance risks posed by the use of those tools, including the issues discussed above.
Governance, Training, and Oversight
Given the evolving regulatory and legal risks, investment advisers that adopt AI tools should consider establishing a formal governance process (e.g., an AI committee) to oversee AI tools and risk mitigation, including the following functions:
- Establishing written AI policies that address the accuracy, confidentiality, recordkeeping, and other compliance risks, consistent with Rule 206(4)-7 under the Advisers Act;
- Mapping AI usage across departments and implementing an internal approval process for both new AI tools and users;
- Defining permissible AI uses and approval procedures;
- Conducting regular staff training that addresses the firm’s AI policies and procedures and other risks and best practices; and
- Periodically reviewing AI governance frameworks to adapt to evolving AI technology and regulatory guidance.
Conclusion
AI technologies present opportunities for investment advisers seeking to enhance research efficiency, data management, and internal operations. However, these benefits must be balanced against the need to comply with regulatory and compliance obligations and manage legal and other risks. Before using any AI tool, investment advisers should adopt a strong AI governance framework, policies and procedures, and other safeguards to manage evolving legal, operational, and regulatory risks.
[1] While they are not subject to many of the SEC’s rules promulgated under the Advisers Act, exempt reporting advisers (ERAs) are “investment advisers” under the Advisers Act and thus subject to the Advisers Act’s antifraud provisions. Many of the considerations discussed in this article are also relevant for ERAs.
[2] See “SEC Charges Two Investment Advisers with Making False and Misleading Statements About Their Use of Artificial Intelligence,” (SEC Press Rel. No. 2024-36).