Artificial intelligence (AI) has transformed the way we do business and live our everyday lives. Virtual assistants, computer-aided diagnosis and clinical decision support are just a few examples of how AI has reshaped the healthcare sector.
Yet AI has a dark side. The malicious use of artificial intelligence in healthcare can create significant problems for the sector and beyond. AI may be developed with a malicious purpose in mind, or it may be exploited by adversaries after the fact – in other words, currently available technology may be intentionally misused.
Checks and Balances in Innovation
While many view AI as a panacea, it is not without faults. An AI system may make mistakes that a human never would. For this reason, it is important to think in terms of “man and machine” rather than “man versus machine.” The healthcare sector needs to take an active role in leveraging innovative technologies such as AI, but with a system of checks and balances.
AI systems of today often contain exploitable vulnerabilities. In a data poisoning attack, an adversary corrupts the training data so that the learning system draws the wrong conclusions; in an adversarial example (evasion) attack, the adversary crafts inputs that a trained system will misclassify. Furthermore, the core of an AI system is its data processing and decision-making engine, and the security and integrity of that engine – its inputs, rules and otherwise – are critical. If any aspect is tampered with and there is no human “check,” significant harm may occur. In healthcare, this may result in harm, injury or even death to a patient.
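To make the data poisoning idea concrete, the following is a minimal, self-contained sketch. It is purely illustrative – the classifier, cluster positions and all numbers are hypothetical and are not drawn from any real medical system. A toy nearest-centroid classifier is trained on two well-separated classes; an adversary then injects a batch of deliberately mislabeled training points, silently dragging one learned centroid away and degrading accuracy on clean test data.

```python
# Illustrative label-poisoning attack against a toy nearest-centroid
# classifier. All names and numbers are hypothetical; the point is only
# that corrupted training data silently shifts the learned model.
import random

random.seed(0)

def make_points(cx, cy, n):
    """Sample n points scattered around the cluster center (cx, cy)."""
    return [(cx + random.gauss(0, 0.5), cy + random.gauss(0, 0.5))
            for _ in range(n)]

def centroid(points):
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def train(data):
    """data: list of ((x, y), label) pairs with labels 0 or 1."""
    return (centroid([p for p, lab in data if lab == 0]),
            centroid([p for p, lab in data if lab == 1]))

def predict(model, p):
    c0, c1 = model
    d0 = (p[0] - c0[0]) ** 2 + (p[1] - c0[1]) ** 2
    d1 = (p[0] - c1[0]) ** 2 + (p[1] - c1[1]) ** 2
    return 0 if d0 <= d1 else 1

def accuracy(model, data):
    return sum(predict(model, p) == lab for p, lab in data) / len(data)

# Two well-separated classes: "normal" readings near (0, 0) and
# "abnormal" readings near (3, 3).
train_data = ([(p, 0) for p in make_points(0, 0, 100)]
              + [(p, 1) for p in make_points(3, 3, 100)])
test_data = ([(p, 0) for p in make_points(0, 0, 50)]
             + [(p, 1) for p in make_points(3, 3, 50)])

# Poisoning: the adversary injects points far outside the true class-0
# region but labeled 0, dragging that class's learned centroid away.
poison = [(p, 0) for p in make_points(6, 6, 200)]

clean_acc = accuracy(train(train_data), test_data)
poisoned_acc = accuracy(train(train_data + poison), test_data)
print(f"clean accuracy:    {clean_acc:.2f}")
print(f"poisoned accuracy: {poisoned_acc:.2f}")
```

On clean data the two classes are trivially separable, so accuracy is near perfect; with the poison included, the class-0 centroid lands between the two true clusters and a large share of clean test points are misclassified. Real attacks against production models are far subtler, but the mechanism – corrupting what the system learns from – is the same.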
New Threat Possibilities
AI systems have novel vulnerabilities that may be exploited to create new types of attacks. In the cyber realm, phishing – most often identified as the initial point of compromise in cyberattacks, according to the 2019 HIMSS Cybersecurity Survey – may be automated through the use of AI. Spear-phishing tends to be an especially effective form of phishing because it is tailored to the recipient using intelligence gathered about that recipient. Fully automated spear-phishing attacks could therefore be highly disruptive for many organizations.
AI systems may also be used to conduct attacks on cyber-physical systems. As an example, medical cyber-physical systems are life-critical, networked systems of medical devices that are involved in treating patients. A compromise of a critical component within such a medical cyber-physical system can pose a significant risk to patient safety.
In another example, 3D printing is used extensively in healthcare, from customized prosthetics and implants to tissue and organ fabrication. An AI-enabled attack may pose a significant threat to 3D printing and its applications: a critical component of a 3D-printed product may be intentionally malformed or made defective, potentially leading to patient harm, or an adversary may take control of the 3D printing system itself to build a malicious autonomous system.
Technology is an everyday part of our lives. AI is a tool, one that can be used for good or for ill. We must be vigilant in securing and protecting the technologies we design, build and deploy, especially in the healthcare sector. People depend on us every day, and we ought not to let them down.
Originally published November 19, 2018; updated March 25, 2019