From the U. of Iowa, and kudos to them for educating and warning their employees:
As artificial intelligence (AI) continues to evolve, it is important to remember that most AI tools and services, including ChatGPT, are not HIPAA compliant. Therefore, it is not appropriate to use these tools or services in conjunction with patient protected health information (PHI).
Before using any application that processes, transmits, or stores patient information, such as an AI service, a proper security review, contracting, and business associate agreement must be completed. If you have a request to use an AI system, please speak with your department director about how to pursue the request and initiate a security review.
Inputting patient information into an AI system could result in a HIPAA violation if the above conditions have not been met. For example, using ChatGPT to draft a patient letter or using an unapproved AI transcription service requires sharing PHI with the application. Beware of these types of situations.
For additional information, please see IT Security Guidelines for the secure and ethical use of Artificial Intelligence.
Please contact the Joint Office for Compliance with questions at compliance[at]healthcare.uiowa.edu or 319-384-8282.
h/t, Becker’s Hospital Review