Privacy Please: HIPAA and Artificial Intelligence – Part I
What if Artificial Intelligence (AI) is deployed within a health system to apply machine learning to patient information, in part so that patients can download their health information and wellness metrics (such as steps, blood pressure, and blood glucose levels) and check on their own well-being without coming in for a professional visit? These activities could bring the AI developer under federal and state laws protecting health information privacy, particularly the Health Insurance Portability and Accountability Act and its implementing regulations (HIPAA).
The health care industry is heavily regulated, and the requirements are not always consistent with “typical” business arrangements. AI developers or vendors exploring these opportunities should step into this world with eyes wide open. Unfortunately, many vendors are not aware of existing risks and often find themselves sliding down the slippery slope of health care regulation. To avoid that result, AI developers need to understand what activities may trigger HIPAA obligations and structure their activities accordingly.
Additionally, covered entities and business associates should consider whether the AI applications and tools they use need to be considered in their assessments for identifying business associates and in their HIPAA security risk analyses and risk management activities. This first part of the blog summarizes HIPAA and some scenarios described by the government that do not bring AI under HIPAA. The second part will discuss scenarios that do trigger HIPAA and key questions to ask in any AI arrangement.
What Is HIPAA?
HIPAA establishes a federal floor for safeguarding certain identifiable health information called “protected health information” (commonly referred to as PHI). HIPAA establishes: requirements on the use and disclosure of PHI; rights of individuals with respect to their PHI; administrative, physical, and technical security safeguards to be implemented to protect PHI; notification obligations in the event of a breach of unsecured PHI; and various administrative requirements.
PHI is broadly defined as health information, including demographic information, relating to an individual’s mental or physical condition, treatment, or payment for health care services. As noted, PHI includes demographic information when it ties the individual to health care (such as identifying an individual as a patient of a health care provider or a member of a health plan). HIPAA applies to “covered entities” and “business associates.” Covered entities are: health care providers that engage in certain electronic HIPAA-covered transactions, generally related to payment for health care services; health plans, including insurance companies and group health plans sponsored by employers; and health care clearinghouses. Business associates generally create, receive, maintain, or transmit PHI as part of providing services to covered entities or to other business associates.
AI and HIPAA – OCR Guidance
The Department of Health and Human Services’ Office for Civil Rights (OCR), a primary enforcement agency for HIPAA, has published guidance on the HIPAA implications for developers of health applications, or “apps” (the Guidance). Although the Guidance does not directly address AI, it is instructive for many common AI situations. In the Guidance, OCR walks through six scenarios, focusing on whether the particular arrangement would cause the app developer to be a business associate and, thus, covered by HIPAA. Under the first four scenarios, the app developer is not a business associate because the developer does not create, receive, maintain, or transmit PHI on behalf of a covered entity (or another business associate). Essentially, to be deemed a business associate: (i) the person or entity must be providing services to, or acting on behalf of, a covered entity or business associate; and (ii) PHI must be involved.
- Scenario 1: In the first scenario, a consumer downloads a health app to her smartphone and populates it with her own health information to help organize that information. No health care provider is involved. A consumer is not a covered entity or a business associate and may use and disclose her own information as she decides. (This will be the case in all of the scenarios.) Because the consumer is voluntarily inputting the information, the app developer is not creating, receiving, maintaining, or transmitting PHI on behalf of a covered entity or business associate and, therefore, is not a business associate. If the same app scenario were applied to AI, HIPAA would not be triggered because the AI interfaces only with the consumer directly.
- Scenario 2: A consumer downloads an app to help manage a chronic condition. She downloads data from her physician’s electronic health record (EHR) through a patient portal and adds additional information in the app. HIPAA does not apply in this scenario. Although the app developer obtains information from an EHR, that information was retrieved by the consumer through the designated patient portal and entered into the app by the consumer. The physician, who likely is a covered entity, has not retained the app developer to facilitate this service. Although the developer is creating, receiving, maintaining, and transmitting PHI, it is not acting on behalf of a covered entity or business associate and, therefore, is not subject to HIPAA. (If the physician is a covered entity, the physician must comply with HIPAA; this will be the case throughout the scenarios.) The key issue here for AI is whether the developer is acting on behalf of a covered entity (or business associate).
- Scenario 3: A doctor recommends that his patient use a particular app that tracks diet, exercise, and weight. The consumer downloads and uses the app, which sends summary reports to the doctor. The developer is not subject to HIPAA under this scenario. Although the doctor recommended the app, the doctor did not retain the app developer. The consumer’s use of the app to transmit data to a physician who is a covered entity does not make the developer a business associate of that covered entity because the consumer controls the dissemination of the information. Again, the focus in this scenario is on whether the AI is being used on behalf of the consumer rather than the covered entity.
- Scenario 4: A consumer downloads an app to her smartphone to help manage a chronic condition. The health care provider and the app developer have entered into an interoperability agreement, at the consumer’s request, to allow the consumer to safely transmit information to the provider through the app. The consumer inputs information into the app and has the app transmit the information to the health care provider’s EHR. OCR deems this to be the same situation as the first two scenarios. The developer is not subject to HIPAA because it is not creating, receiving, maintaining, or transmitting PHI on behalf of a covered entity or business associate. The use of the app simply to transmit information from a consumer to a covered entity does not make the developer subject to HIPAA. Another lesson for AI is that an interoperability agreement is not the same as a services agreement.
These first four scenarios would not trigger HIPAA and illustrate relatively safe arrangements in which consumers control the information submitted and the app developer has, at most, an interoperability agreement with a health care provider. But AI-assisted health care may involve much more than off-the-shelf, general-purpose AI tools, and the additional app scenarios presented by OCR serve as cautions for those developing more customized implementations.
More on that, along with key questions to ask in any AI arrangement, will be addressed in Part II of this blog.