With the advent of artificial intelligence (AI) and the realization of all its enormous potential, some medical providers have undertaken to use AI in the creation of their medical records. This brackish-waters approach, combining human intelligence with machine learning, raises some important questions as to overall propriety and bottom-line legality. Is the use of AI in the completion of a medical report or record appropriate from a compliance perspective? Can a medico-legal document, whose contents are partly authored by a machine’s logic and verbal skills, be deemed acceptable from a clinical, regulatory or facility by-law perspective? It’s all a bit murky.
According to a recent article from Johns Hopkins Medicine, “the regulatory frameworks needed to govern clinical implementation of AI in safe, ethical and equitable ways are still being constructed.” The federal government and certain states are in the beginning stages of addressing this question. For example, legislation was introduced back in March in Arkansas (House bill “H 1816, Health Care Providers”) that would restrict the use of AI in providing healthcare services or generating medical records unless specific conditions are met.
It Was Inevitable
According to one national auditing firm, a growing number of clinical providers are using AI to complete their medical records. In a typical workflow, the physician dictates or types key words or shorthand phrases, and the AI application expands those terms into complete sentences.
With the wide availability of these kinds of AI aids, their deployment in the medical record context was only a matter of time. But are there hidden pitfalls in their usage?
A Hidden Hitch
It is our understanding that some of the leading AI resources that medical providers use in the completion of their patient records store queries, as well as the answers they create, in “the cloud.” Rather than a billowy stratocumulus realm where winged horses effortlessly fly, “the cloud” here refers to an array of computer servers located worldwide and accessed over the internet. Having medical data reside in the cloud allows users to perform tasks without storing their data or running programs on their own local servers. The cloud does offer a couple of advantages: it allows more data to be stored, and with some applications it also serves as a back-up.
But here’s the catch, according to one prominent healthcare attorney: if a patient’s protected health information (PHI) is stored in the cloud, you must have a business associate agreement (BAA) with the cloud service provider. The problem is that some of the leading AI applications offer different subscription plans, and not all of those plans include a BAA. Accordingly, if you are using such an app in the completion of your medical records, make sure you execute a BAA with that cloud provider so that you remain HIPAA compliant.
It will be interesting to see whether future legislation or rulemaking provides further guidance on the extent to which AI can be used in creating a patient’s medical record, such as a surgeon’s report or a nurse practitioner’s visit note. An ever-changing world means ever-changing responses from the regulators, and we will be here to keep our readers up to date as these changes roll our way.