Healthcare in the AI Crosshairs
News broke last month that Google partnered with the second-largest health system in the U.S., Ascension Healthcare, to collect and analyze the health information of millions of patients in a program code-named "Project Nightingale." Under Project Nightingale, Ascension transfers its patients’ health information, including personally identifiable demographic, treatment and diagnostic information, to Google’s systems.
Google then uses this data, in part, to develop new software powered by advanced artificial intelligence (AI) that helps process the data and make suggestions about a patient’s diagnosis and prescriptions, as well as recommendations for enforcing a hospital’s narcotics policy.
News of the partnership was met with widespread concern, leading the U.S. Department of Health and Human Services Office for Civil Rights to announce that it would open a federal inquiry to examine the legality of the deal under the Health Insurance Portability and Accountability Act (HIPAA). Even before the partnership’s announcement, consumers had raised concerns about Google’s access to health data through its acquisition of Fitbit.
Consumers are realizing that Google and other big tech companies are expanding their reach into the healthcare industry and becoming multibillion-dollar players in it, with the potential to combine medical and other data in myriad new ways. On the other hand, this data sharing may have benefits, and it is normal and common for healthcare providers to disclose health information to third parties in order to manage and improve their own healthcare operations.
HIPAA BAA Outsourcing Rules
HIPAA permits a healthcare provider to use patients’ protected health information for its own healthcare operations, which includes "outcomes evaluation and development of clinical guidelines, provided that the obtaining of generalizable knowledge is not the primary purpose of any studies resulting from such activities," or "population-based activities relating to improving health or reducing health care costs."
Further, HIPAA permits a healthcare provider to outsource healthcare operations to a third party, referred to as a business associate, subject to a business associate agreement (BAA). The BAA must restrict the business associate’s use and disclosure of patient information, generally only allowing the business associate to use and disclose protected health information in a manner that would be permissible if done by the healthcare provider itself.
The benefit of such an arrangement is that the third party may have experience and expertise that can be used to improve the provider’s healthcare operations, especially with respect to developing technologies, such as AI, that the healthcare provider does not have in-house. However, there are exceptions to this general rule.
Specifically, a BAA may permit the business associate to use and disclose the information for its own "proper management and administration" and for "data aggregation purposes." A healthcare provider may also permit the business associate to use the patient information to create potentially valuable de-identified information. Once de-identified, the information is no longer subject to HIPAA, and the business associate may do whatever it wants with the de-identified information.
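To make the de-identification concept concrete, the sketch below shows, in Python, the kind of transformation a business associate might apply before working with data outside HIPAA’s reach. This is a minimal illustrative sketch: the field names and record structure are hypothetical, and genuine Safe Harbor de-identification requires removing all 18 identifier categories under 45 C.F.R. § 164.514(b)(2), or an expert determination, not the abbreviated set shown here.

```python
# Illustrative sketch only: hypothetical field names. Real HIPAA Safe Harbor
# de-identification requires removing all 18 identifier categories or an
# expert determination, not just the handful shown here.

DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "full_birth_date",
}

def strip_direct_identifiers(record: dict) -> dict:
    """Return a copy of a patient record with direct identifiers removed
    and very old ages generalized, mirroring two Safe Harbor rules."""
    deidentified = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Safe Harbor also requires aggregating ages 90 and over.
    if isinstance(deidentified.get("age"), int) and deidentified["age"] >= 90:
        deidentified["age"] = "90+"
    return deidentified

if __name__ == "__main__":
    patient = {
        "name": "Jane Doe",
        "medical_record_number": "MRN-001",
        "age": 93,
        "diagnosis_code": "C50.9",
        "treatment": "chemotherapy",
    }
    print(strip_direct_identifiers(patient))
    # {'age': '90+', 'diagnosis_code': 'C50.9', 'treatment': 'chemotherapy'}
```

The point of the sketch is simply that once identifiers are properly removed, what remains (diagnoses, treatments, generalized ages) falls outside HIPAA, which is why de-identified data is so commercially attractive to business associates.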
Much of the outcry around Ascension and Google’s partnership seems to arise from two factors.
- First, Google has so much other information about individuals that many people are skeptical of how Google will use patient information. Such concerns may be misplaced because HIPAA would prohibit, for example, Google’s use of identifiable patient information for purposes of targeted third-party advertisements.
- Second, some have criticized the lack of patient and clinician consent and transparency. However, HIPAA does not require a patient to consent to a provider’s use and disclosure of health information for healthcare operations. Further, if every healthcare provider had to obtain every patient’s consent, or develop opt-out mechanisms, for sharing patient information with each business associate, providers would face an overwhelming burden that might not produce any material privacy benefit and, more importantly, would undermine providers’ core mission of caring for patients.
As for transparency, healthcare providers typically have hundreds if not thousands of business associates. No healthcare provider notifies patients or staff about each one. Moreover, when companies partner to develop new AI to improve healthcare, they often refrain from discussing their efforts publicly to protect their competitive advantage.
For example, if Google had announced at the outset that it was partnering with Ascension to develop AI to more quickly detect a type of cancer, then competitors could have raced to pursue the same work and beat Google to the patent office. While consumers may not appreciate this reasoning, it explains the companies’ efforts to preserve confidentiality. After all, the tech industry’s announcements regarding innovations in healthcare move financial markets, even if the work is very preliminary and may never lead anywhere.
Despite Outcry, Patient Information Sharing Offers Potential AI-Driven Improvements
Consumers’ initial concerns about Google’s use of the information are understandable, but it is not unreasonable to allow a technology company to access patient information to develop AI that will improve the healthcare delivered to the provider’s patient population (and that, incidentally, may help other patients more broadly), provided appropriate controls are in place to protect privacy.
When used correctly, AI offers great potential to improve healthcare operations, potentially evaluating outcomes and developing recommendations to improve clinical practices in a manner and at a scale that would not be feasible without the technology. The more data the AI ingests, the more valuable insights it may uncover.
However, while the criticisms of and potential benefits from big tech’s relationships with healthcare providers are numerous, the reality is simple: any collaboration with big tech involving health information will be subject to intense scrutiny. The safest course for these partnerships may be, when feasible, to use de-identified information.
Further, transparency about any such collaboration should be considered in order to avoid bad publicity. While we noted above that there can be strong business reasons to protect confidentiality, the benefits of maintaining confidentiality need to be weighed against the risk that the project will end up as front-page news without a balanced discussion of the benefits to patients.