ChatGPT and HIPAA: Caution Is Needed, Even ChatGPT Says So!

Jackson Lewis P.C.

Recently, things may have sped up a little in your doctor’s office. The notes from your recent visit may have been organized and filed a little more quickly. You might have received assistance sooner than expected with a physician letter to your carrier concerning a claim. You also may have received copies of those medical records you have been waiting for earlier than usual. The grease on the skids in these areas could be generative AI technologies, such as ChatGPT, which are being leveraged in myriad ways across all industries, not just healthcare. But, as pointed out in this recent ScienceBlog post, some caution may be in order.

Healthcare providers and business associates subject to the Privacy and Security Rules under the Health Insurance Portability and Accountability Act (HIPAA) have some steps to consider before sharing protected health information (PHI) with a third party. One that should come as no surprise is determining whether the third party is a business associate. According to OCR guidance:

A “business associate” is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity… The Privacy Rule lists some of the functions or activities, as well as the particular services, that make a person or entity a business associate, if the activity or service involves the use or disclosure of protected health information. The types of functions or activities that may make a person or entity a business associate include payment or health care operations activities, as well as other functions or activities regulated by the Administrative Simplification Rules.

Business associate functions and activities include: claims processing or administration; data analysis, processing or administration; utilization review; quality assurance; billing; benefit management; practice management; and repricing. Business associate services are: legal; actuarial; accounting; consulting; data aggregation; management; administrative; accreditation; and financial. See the definition of “business associate” at 45 CFR 160.103.

If the third party is a business associate, sharing PHI with it must be preceded by entering into a business associate agreement. As a best practice, it also would be prudent to assess the third party’s cybersecurity policies and procedures. If the third party is not a business associate, the patient’s authorization generally would be needed to share the information, unless HIPAA provides a basis to make the disclosure without an authorization.

With ChatGPT or a similar third-party service, any doctor, nurse, office manager, administrator, or other staff member in a physician’s office or health system could set up an account (free or paid) and begin leveraging the large language model-based chatbot’s services. Those services might include help with various administrative tasks, such as those described above. The same might be true for workers employed by business associates that provide services to physician office customers, for example.

Naturally, the question is whether ChatGPT or any similar service is a business associate under HIPAA. It may be, and it may be prepared to enter into a business associate agreement. Whether the service is HIPAA compliant is another issue to assess when considering which use cases will be permitted. These are important determinations that any HIPAA covered entity or business associate needs to make before permitting PHI to be shared on the chatbot’s platform, or that of any similar service. Of course, an alternative is to limit the use of these tools by policy and, certainly, to provide some training. Employees at all levels of a healthcare organization simply may not appreciate where the data flows, particularly given the ease with which they could generate a quick patient letter or referral to a colleague.

For completeness, I asked ChatGPT: “I am a healthcare provider, should I share patient data on ChatGPT?” The chatbot responded:

As a healthcare provider, it is essential to prioritize patient privacy and adhere to data security regulations. Sharing patient data on this platform is not recommended, as it could compromise patient confidentiality. Instead, consider using secure and HIPAA-compliant communication channels for discussing sensitive patient information. If you have any medical or healthcare-related question, feel free to ask, and I’ll provide general information without requiring specific patient details.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Jackson Lewis P.C. | Attorney Advertising
