For the past three years, health system innovation centers have been buzzing with ideas for adopting the rapidly evolving world of augmented reality (AR) into current workflows. At the intersection of people, processes, and technology, there are boundless opportunities to project real-time information into a clinician's field of vision, providing both instant recall and encyclopedic knowledge of a patient's condition.
WellStar's Center for Health Transformation took a deep look at AR in the hospital setting. In collaboration with technology partners such as the Georgia Tech Research Institute, we evaluated several practical use cases, built working prototypes, and offered them to physicians and nurses to test in simulated work environments.

One of our AR experiments was an application that continuously projects vital signs from the anesthesiologist's monitors onto the surgeon's field of vision through a Google Glass display, improving the surgeon's situational awareness of the patient's physiological status during a procedure without requiring them to take their eyes off the surgical field. Another experiment used the Epson Moverio BT-200 to assist with a relatively complex medical device setup procedure involving the pneumothorax chest drainage system. For this, a promising tech startup called CN2 built an application that projected the step-by-step setup instructions into the eyewear, overlaying them onto the chest tube drainage system in real time.
These experiments brought to light five significant design constraints that affect the practicality of integrating this type of technology into the hospital setting.
1) The integrated data must be intuitive – it cannot be solely textual, lest the physician be relegated to trying to read tiny print on a heads-up display (HUD). Color-coding the data and varying the intensity of urgent warning indicators proved critical.
2) The data view cannot be obstructive – the data projected onto the HUD must be virtually invisible unless the eye is actively searching for the information. At the same time, the image must be able to draw the wearer's attention when appropriate.
3) The data must be of high value and timely – we don’t want a surgeon performing an operation to sort through unimportant fields or irrelevant data.
4) The use case must stay within the bounds of the technology – the AR streaming device is not useful if it runs low on power midway through the procedure, or if it must reboot for each new patient encounter.
5) Firmware updates and signal interference are not minor irritations – they are real safety concerns that necessitate continuous technical support if any application is ever to be adopted in actual healthcare situations.
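The first constraint above, intuitive color-coded data with intensity varied by urgency, can be illustrated with a minimal sketch. The function name, thresholds, and color scheme below are hypothetical examples for illustration, not clinical guidance or any real device's behavior:

```python
# Hypothetical sketch: map a vital-sign reading to a display color and
# intensity for a heads-up display. Thresholds are illustrative only.

def display_style(value, normal_range, critical_range):
    """Return (color, intensity) for rendering one reading on the HUD."""
    lo, hi = normal_range
    crit_lo, crit_hi = critical_range
    if value < crit_lo or value > crit_hi:
        return ("red", "high")      # urgent: actively draw the wearer's eye
    if value < lo or value > hi:
        return ("amber", "medium")  # abnormal but not critical
    return ("green", "low")         # normal: stay unobtrusive

# Example: heart rate with a normal range of 60-100 bpm
print(display_style(72, (60, 100), (40, 140)))   # ('green', 'low')
print(display_style(115, (60, 100), (40, 140)))  # ('amber', 'medium')
print(display_style(150, (60, 100), (40, 140)))  # ('red', 'high')
```

The point of the sketch is that urgency is encoded in color and intensity rather than text, so the display stays readable at a glance and unobtrusive when readings are normal.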
Given these constraints and inherent dangers, how will AR successfully penetrate the healthcare environment?
The answer lies in the creation and adoption of modules for non-clinical applications that run on existing common devices such as mobile phones, rather than on separate AR-specific hardware.
One of the most promising of these non-clinical applications will be enhanced "wayfinding": providing turn-by-turn directions through a patient's mobile device to help them navigate the maze of buildings, floors, and corridors in large tertiary health centers. Despite spending thousands of dollars on signage, most hospitals still find it necessary to staff multiple information or courtesy desks to direct patients and visitors to specific departments. Our internal survey found that these desks field more than 1,700 patient and visitor inquiries per day at our flagship hospital alone.
A few enterprising tech startups have already installed active beaconing technology in several hospitals, allowing real-time navigation inside the buildings. Now combine this idea with enhanced AR: point your mobile device's camera and see directional arrows projected on the floor, showing you exactly how to reach a selected patient room. On the way there, perhaps you point your device at the cafeteria sign and see today's menu on your screen. Perhaps you pass the gift shop and an e-coupon appears on your phone offering 20 percent off flowers and cards.
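One simple way such beacon-based wayfinding can work is to treat the beacon with the strongest signal as the user's current location and look up the arrow to render from there toward the destination. The sketch below assumes this nearest-beacon approach; the beacon IDs, signal values, and route table are hypothetical:

```python
# Hypothetical sketch of beacon-based indoor wayfinding: pick the beacon
# with the strongest received signal (roughly the closest one), then look
# up the stored arrow direction from that beacon toward the destination.

# (beacon_id, destination) -> arrow to project on the floor
ROUTE_TABLE = {
    ("lobby-01", "room-412"): "straight",
    ("elevator-3", "room-412"): "left",
    ("wing-b-07", "room-412"): "right",
}

def next_arrow(rssi_readings, destination):
    """rssi_readings: {beacon_id: signal strength in dBm}.
    Less negative means a stronger, closer signal."""
    nearest = max(rssi_readings, key=rssi_readings.get)
    return ROUTE_TABLE.get((nearest, destination), "unknown")

# elevator-3 has the strongest signal, so its stored arrow is shown
readings = {"lobby-01": -82, "elevator-3": -55, "wing-b-07": -90}
print(next_arrow(readings, "room-412"))  # left
```

Real deployments refine position with trilateration across several beacons, but the lookup structure (position plus destination yielding a direction) stays the same.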
It is not hard to imagine taking AR a step further: integrating with your device's appointment calendar and syncing with a map program to alert you when to leave home, where to park, and which hospital entrance to use. Such integration could even extend to the department receptionist, who is alerted to your presence in the building and sees your picture on their device as you enter the department, allowing a personal greeting on arrival.
Another non-clinical use case for AR is translation. Many health systems serve a large non-English-speaking population requiring dual-language signage and other translation services. A startup called Word Lens designed an AR application that let you point your phone at any sign and instantly see it translated on your screen. Since its acquisition by Google and integration into Google Translate, this type of functionality has improved to a startling degree, translating entire phrases rather than single words, and it has become an essential tool for any international traveler. Hospitals can learn from this non-clinical application, for instance by including a QR code on each sign so that when a camera is pointed at it, the app can display a grammatically correct translation of the text in any common language.
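The QR code idea sidesteps machine translation entirely: the code encodes a stable sign identifier, and the app looks up a pre-approved, grammatically correct translation for the user's language. A minimal sketch of that lookup, with hypothetical sign IDs and translations:

```python
# Hypothetical sketch: a QR code on a hospital sign encodes a stable sign
# ID; the app maps it to a curated translation for the user's language.

SIGN_TRANSLATIONS = {
    "sign/radiology": {
        "en": "Radiology - 2nd Floor",
        "es": "Radiologia - Segundo piso",
        "ko": "영상의학과 - 2층",
    },
}

def translate_sign(qr_payload, language, fallback="en"):
    """Return the curated translation for a scanned sign, falling back
    to English, then to the raw payload if the sign is unknown."""
    entry = SIGN_TRANSLATIONS.get(qr_payload, {})
    return entry.get(language, entry.get(fallback, qr_payload))

print(translate_sign("sign/radiology", "es"))  # Radiologia - Segundo piso
print(translate_sign("sign/radiology", "fr"))  # falls back to English
```

Because the translations are curated rather than generated on the fly, the hospital controls the exact wording shown to patients in every supported language.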
There are hundreds of other non-clinical use cases for AR in the healthcare setting that have yet to be conceived. With emerging AR companies creating new applications, there are opportunities to improve the patient experience and lower operational costs for health systems. At the end of the day, our patients come first. AR can be a valuable tool to better connect with them and deliver care.