Operational AI v. A Brain in a Jar

Many commonly adopted artificial intelligence solutions on the market today effectively ingest large datasets, run algorithms and produce reports, recommendations or visualizations for employees to act upon. In other words, they operate like a brain in a jar.

This means hospitals are paying for solutions that merely surface insights, which still require human intervention and decision-making to act on.

AI solutions designed to operate this way add extra items to employees’ to-do lists rather than position them for success. If AI were designed to mimic the work humans do, hospitals could use it to scale their workforces and ensure their people focus on work that requires uniquely human capabilities, such as problem solving, critical thinking and creativity.

Where AI shows most promise

Administrative workflow assistance is the third most promising application of AI, behind only robot-assisted surgery and virtual nursing assistants, according to a 2017 Accenture analysis. Automating these processes is projected to generate an estimated $18 billion in annual benefits by 2026.

These savings could be even greater if AI is implemented in a way that augments human intelligence.

Some AI solutions currently on the market are designed to help employees by suggesting next best actions. Such solutions take a broad view of AI, one in which data is plugged into an algorithm to produce recommendations. No work is fully executed or transferred from the human to the bot, so these solutions won’t make as much of an impact as true automation, where a computer takes on labor-intensive tasks and relieves employees of time-consuming, repetitive work.

What is a holistic design?

Imagine AI personified: a computer with hands, eyes, ears and a brain. Built this way, AI would be analogous to a human, able to think and work the way a person does. This is the concept behind Olive, our AI solution.

You can view Olive as simply another employee. Organizations grant Olive user access to various applications, such as their EMR, just as they would assign login credentials to a new hire. Olive doesn’t require any integrations or additional training; instead, she is able to interact with any tool and remove the barriers to seamless data exchange among systems.

Olive uses her “hands” to operate the same tools that a human would, thanks to robotic process automation (RPA). RPA allows Olive to interact with different user interfaces, native applications and web portals to complete tasks the same way people do. With computer vision, Olive has “eyes” to read files or faxes and complete workflows. Lastly, Olive’s “brain” is her ability to enrich data and make better decisions; it’s how she learns from the work existing employees do so she can complete tasks independently.
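To make the “hands” and “eyes” concrete, here is a minimal, hypothetical sketch of how RPA-style automation and optical character recognition can be combined in code. It is not Olive’s implementation; the libraries (pyautogui for UI automation, pytesseract for OCR), file names and screen elements are assumptions chosen purely for illustration.

# Hypothetical sketch: RPA “hands” plus computer-vision “eyes”.
# Not Olive’s implementation; pyautogui and pytesseract stand in for
# generic UI-automation and OCR tooling.

import pyautogui              # drives the keyboard and mouse the way a person would
import pytesseract            # reads text out of images (scanned faxes, screenshots)
from PIL import Image

def read_fax(path: str) -> str:
    """'Eyes': extract the text from a scanned fax image via OCR."""
    return pytesseract.image_to_string(Image.open(path))

def enter_patient_id(patient_id: str) -> None:
    """'Hands': find the EMR's Patient ID field on screen and type into it.
    'patient_id_field.png' is an assumed reference screenshot of that field."""
    location = pyautogui.locateCenterOnScreen("patient_id_field.png")
    if location is None:
        raise RuntimeError("Patient ID field not visible on screen")
    pyautogui.click(location)                    # focus the input field
    pyautogui.write(patient_id, interval=0.05)   # type the ID keystroke by keystroke
    pyautogui.press("enter")                     # submit, just as a person would

if __name__ == "__main__":
    fax_text = read_fax("incoming_fax.png")      # assumed sample file
    # A real workflow would parse the fax for the relevant identifier;
    # here one is hard-coded purely for illustration.
    enter_patient_id("MRN-0012345")

In practice the fax parsing and field detection would be far more robust; the point is simply that the software performs the task itself rather than handing an employee yet another recommendation.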

We have plans to build out other characteristics for Olive, such as “ears” that would let her use natural language processing, similar to Alexa or Siri, to understand a caller’s voice or transcribe a physician’s speech in real time. We also plan to give Olive a “mouth,” which would enable her to respond to human inquiries and automate processes like appointment setting and reminders.

With hands, eyes and a brain, Olive is AI designed holistically. Check out our “Brain in a Jar” video to learn more about Olive’s current and future human-like features.

By using the human as inspiration for how AI functions, organizations can achieve true automation and, with it, workforce scaling.

Learn more about the technologies that make up operational artificial intelligence in Artificial Intelligence 101: 5 Terms to Add to Your AI Playbook.

 

This article was originally posted on Becker’s Hospital Review.