By Maureen Williams, Product Marketing Manager, MEDITECH

Last year, a colleague of mine wrote an apology to clinicians on behalf of all EHR vendors for the pain we’ve put them through. It resonated with many readers, who have soured on the overly hopeful messages of clinical and business transformation coming from the industry. It’s important for all of us to honestly acknowledge where we are, how we got here, and what we need to do to restore the confidence of EHR users — particularly clinicians, whose lives have been irrevocably changed, for better and for worse, by the computerization of healthcare.

But the equally important part of last year’s piece — the part that followed the apology — was a look at “the path forward.” We are already on that path, and it’s already leading to positive changes for our clinical users. Most EHR vendors are now embedding physicians, nurses, and other clinical software users into their development teams. As a result, we’re designing software that’s more mobile, nimble, and intuitive. And we’re testing the usability of our software with real clinical users in real clinical settings, before releasing it.

Now I’d like to look a little further down that path, because there are developments on the horizon that are poised to usher in a whole new era of much-needed usability and efficiency improvements for providers. Just like the current mobility revolution (which is untethering clinicians and allowing them to turn their focus back to their patients), these emerging trends are not things we vendors can take credit for. We didn’t create the smartphone, the tablet, or the Internet; but we can find a way to quickly assimilate new technologies and innovations from other fields and leverage them effectively in our own.

Like it or not, we are entering a new era of Artificial Intelligence, and most of us are already starting to benefit from it. Siri and Alexa are, quite literally, household names for many of us. These interactive software agents, leveraging improvements in voice recognition and machine learning, are performing an ever-increasing number and variety of tasks. Moreover, when integrated into an ecosystem of smart devices, they can handle the kinds of complex, multi-step tasks that will be necessary for their use in healthcare.

Many observers have commented on the vast potential for AI and machine learning to improve predictive diagnostics, as massive data sets can be mined for patterns and indicators no human could possibly detect. But physicians need help right now, and AI can be deployed much sooner for other important use cases. As vendors, we must capitalize on these advancements to address the two primary bottlenecks that have frustrated clinicians from the first day they were asked to use an EHR: getting data in and getting data out of their electronic records.

Most of us have struggled at one point or another to get a so-called “smart” phone or speaker to understand what seems to us a simple question or phrase, and examples abound on the Internet of bizarre responses from these supposedly “intelligent” devices. But Natural Language Processing has advanced dramatically and is on a trajectory to keep improving rapidly. Just a few years ago, clinicians had to spend hours painstakingly training their voice recognition applications to understand their individual voice patterns. Today, little to no voice training is required. Combined with improved medical terminology libraries, this technology is now enabling the creation of full-fledged virtual medical assistants.

What this means is that providers will soon be able to use this early form of AI to do far more than just enter progress notes. They’ll be able to ask questions of their EHR, like “has this patient received her flu vaccination?” or “when was the last time this patient had his HbA1c checked?” The evolution from simple dictation aid to interactive intelligent assistant represents an enormous step toward addressing the current inefficiencies of data entry and retrieval. The frustration providers experience around “note bloat” could be dramatically reduced if they can find what they need by simply asking.
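
To make that interaction concrete, here is a toy sketch, not any vendor's actual implementation, of how a transcribed question might be routed to structured chart data; the patient record, field names, and keyword rules are all invented for illustration.

```python
# A toy sketch, not any vendor's actual implementation, of routing a
# transcribed spoken question to structured chart data. The patient record
# and keyword rules below are invented purely for illustration.
from datetime import date

patient = {
    "immunizations": {"influenza": date(2018, 10, 12)},
    "labs": {"HbA1c": {"value": 6.9, "unit": "%", "date": date(2019, 1, 8)}},
}

def answer(question: str, record: dict) -> str:
    """Map a transcribed question to a simple structured lookup."""
    q = question.lower()
    if "flu" in q or "influenza" in q:
        given = record["immunizations"].get("influenza")
        return (f"Yes, an influenza vaccine was given on {given:%B %d, %Y}."
                if given else "No influenza vaccination is on file.")
    if "hba1c" in q or "a1c" in q:
        lab = record["labs"].get("HbA1c")
        return (f"The last HbA1c was {lab['value']}{lab['unit']} on {lab['date']:%B %d, %Y}."
                if lab else "No HbA1c result is on file.")
    return "Sorry, I can't answer that yet."

print(answer("Has this patient received her flu vaccination?", patient))
print(answer("When was the last time this patient had his HbA1c checked?", patient))
```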

Once this happens, the next logical step down this path is the use of ambient listening devices during patient encounters. Of course, this assumes we culturally get past the “creepy factor” — and, more importantly, the privacy concerns — associated with phones and smart devices constantly listening for the so-called “wake word” (and trust they’re not recording us all the time). These issues notwithstanding, Amazon recently announced that Alexa already supports HIPAA-compliant services, and many of us in the vendor community have teams using it to test the pre-population of progress notes, queuing of orders, and preparation of other data elements for physicians to confirm, simply based on listening to conversations between a patient and provider. After all, identifying and distinguishing between voices comes easily to us, and increasingly to software as well, so it’s likely a virtual physician’s assistant or e-scribe will soon sit unobtrusively on a nearby desk or table and prepare a significant portion of the visit documentation.
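
As a thought experiment only, the sketch below shows the shape of that workflow using trivially simple keyword rules; a real ambient scribe would rely on trained speech and language models, and everything it drafts would still go to the provider for confirmation.

```python
# A deliberately oversimplified sketch of drafting documentation from a
# speaker-labeled transcript. Real ambient scribes use trained speech and
# language models, not keyword rules; everything here is hypothetical, and
# the output is only a draft for the provider to review and confirm.
transcript = [
    ("patient", "I've had a sore throat and a low fever since Tuesday."),
    ("provider", "Any cough or trouble breathing?"),
    ("patient", "A dry cough, but no trouble breathing."),
    ("provider", "Let's order a rapid strep test to be safe."),
]

SYMPTOM_TERMS = {"sore throat", "fever", "cough", "trouble breathing"}
ORDER_CUES = ("let's order", "i'm ordering", "we'll order")

def draft_from_visit(lines):
    """Collect patient-reported symptoms and provider-voiced orders."""
    symptoms, orders = [], []
    for speaker, text in lines:
        lowered = text.lower()
        if speaker == "patient":
            # Note: naive matching also catches negated terms ("no trouble
            # breathing"), one reason real systems need genuine NLP.
            symptoms += [term for term in SYMPTOM_TERMS if term in lowered]
        elif speaker == "provider" and any(cue in lowered for cue in ORDER_CUES):
            orders.append(text)
    return {"reported_symptoms": sorted(set(symptoms)), "queued_orders": orders}

print(draft_from_visit(transcript))
```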

Down a parallel path, a growing array of patient devices is likely to become the home monitoring and early warning system we need to keep patients connected between visits. Of course, providers have long been wary of opening the EHR to a flood of unfiltered data coming directly from patients, but the advent of open APIs in healthcare means a huge expansion in the ways both patient-owned and facility-owned devices can be used, not to mention the value of the data they can provide. EHR vendors are now enabling their customers and third-party developers to create apps that tap data in their systems, which will allow for much more personalization, innovation, and user control than ever before. This is just what we need to truly boost portal usage: innovative, well-targeted apps and extensions that engage patients in their own health, and allow for the sharing of meaningful data.
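
The article doesn’t name a specific standard, but most open EHR API programs today are built on HL7 FHIR, so a sketch of a third-party app pulling patient-generated blood pressure readings might look like the following; the endpoint, token, and patient ID are placeholders, not a real system.

```python
# A minimal sketch of a third-party app reading patient-generated blood
# pressure readings through an open EHR API. It assumes an HL7 FHIR R4
# endpoint and an OAuth access token already obtained (for example, via
# SMART on FHIR); the base URL, token, and patient ID are placeholders.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical endpoint
ACCESS_TOKEN = "replace-with-a-real-token"

def home_bp_readings(patient_id: str) -> list:
    """Fetch blood pressure panel Observations (LOINC 85354-9) for a patient."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "code": "http://loinc.org|85354-9"},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()  # a FHIR Bundle; each entry wraps one Observation
    return [entry["resource"] for entry in bundle.get("entry", [])]

# Example (requires a real endpoint and credentials):
# readings = home_bp_readings("hypothetical-patient-123")
```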

We think patient contributions, when collected thoughtfully and entered as structured data, can save providers time and improve their efficiency. This is the philosophy that animates the OurNotes Initiative, which promotes better engagement by inviting patients to contribute to their own electronic medical records. Patients might be asked, for example, to share a list of topics or questions they’d like to cover during an upcoming visit, helping avoid “scope creep” during the appointment. They could be invited to review and sign off on notes after a visit, ensuring understanding and improving adherence. As John Mafi, MD, lead author of a recent study on physician perceptions of the OurNotes model, said, “The idea of having patients doing some of the documentation and getting their voices heard could be a win-win.”
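
As one illustration of what “structured” could mean here, this hypothetical sketch captures a patient’s pre-visit topic list as data the provider can review and pull into the note; the class and field names are invented for the example and are not part of OurNotes itself.

```python
# An illustrative, hypothetical sketch of capturing a patient's pre-visit
# contribution as structured data rather than free text, in the spirit of
# the OurNotes model; the class and field names are invented for the example.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class PreVisitAgenda:
    patient_id: str
    visit_date: date
    topics: List[str] = field(default_factory=list)  # patient-entered topics
    reviewed_by_provider: bool = False

    def to_note_section(self) -> str:
        """Render the structured agenda as a draft section of the visit note."""
        bullets = "\n".join(f"- {topic}" for topic in self.topics)
        return f"Patient-stated goals for {self.visit_date:%Y-%m-%d}:\n{bullets}"

agenda = PreVisitAgenda(
    patient_id="hypothetical-123",
    visit_date=date(2019, 6, 3),
    topics=[
        "Worsening knee pain when climbing stairs",
        "Question about statin side effects",
    ],
)
print(agenda.to_note_section())
```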

I would be remiss if I didn’t acknowledge the potential risks of some of these developments. AI and machine learning, for example, can open us up to malign actors, as Lee Kim, director of privacy and security at HIMSS, reminds us. The more we automate systems, the more we open ourselves up to the potential for both malicious behavior and algorithmic error. The recent Boeing 737 MAX 8 scandal starkly demonstrates what can happen when human expertise is overridden by machines. Even if we’re talking about information automation and not automation in the operating theater, clinical decisions are only as sound as the evidence that supports them, and we must ensure its integrity. Likewise, opening the EHR through APIs comes with its own security and privacy risks.

But these aren’t reasons to hesitate on the path; they’re simply reminders that we must proceed with care. If we do, we can make the kinds of improvements clinicians need right now: ones that make getting data in and getting data out of our EHRs more efficient. After all, how can we expect providers to effectively manage entire populations of patients when we still have room to improve the individual patient encounter?

It took the healthcare industry a while to embrace the Internet, the tablet, and the smartphone. We want to be sure we’re assimilating new breakthroughs as rapidly as we safely can, and we think it’s imperative to start with clinician usability. Let’s not wait for these new technologies to solve all of our clinical and business intelligence challenges. Let’s work together to put them to use right now, and improve the lives of today’s clinicians.