A week or so ago, I was on a panel talking about IBM Watson and how, once technology like Watson is available to all of us, it would be life-changing. One of the other panelists took me to task, sharing a bad experience (we’ve all been there) with Apple’s Siri and arguing that digital assistants were stupid and never going anywhere. I think this exchange defines the problem with how we perceive digital assistants. The assistants we currently have access to, mostly Apple’s Siri and Amazon’s Alexa, aren’t AIs. They are speech-to-text front ends layered over Google-style search or scripts (to control things), and while they are artificial, they aren’t intelligent.
At IBM Think, Arvind Krishna took us through Watson applications at Siemens, CVS, and Salesforce that were far more capable and that, I think, showcase why Watson is falling short of its potential. It isn’t so much a technology problem as a perception problem. Most of us have never used Watson, so we fall back on what we have used, and those tools don’t reflect well on digital assistants. And, much like Elon Musk’s over-promising on his car autopilot technology (it isn’t yet autonomous driving), this disparity is hurting the prospects of the coming wave of true AIs.
Let’s talk about that this week.
Siemens, CVS, Code-Net, and Salesforce
When a true AI like Watson is used, it bears little relationship to what Siri or Alexa can do. For instance, Siemens has created an offering targeting manufacturing called MindSphere. This product monitors the user and learns from them how to accomplish tasks. Once trained, it can anticipate what the user wants to do next with 95% accuracy. The user is still engaged for that remaining 5%, but the technology automates most of what the user does, massively increasing productivity while potentially reducing mistakes. Tedious, complex, repetitive processes are automated away, freeing the user to focus on tasks that engage their mind. The user not only accomplishes more, but the job becomes more interesting as a result.
Code-Net uses AI in its AutoSQL offering, which analyzes data without ever having to move that data. The solution can also automate key repetitive processes using Watson Orchestrate, which handles tedious, repetitive tasks for the user. Orchestrate keeps track of critical processes and offers advice that helps the user understand more deeply the issues they need to be aware of, speeding up the project and helping assure a positive outcome. They use this tool to translate and transform programming languages, among other things.
CVS is using Watson to help turn insight into action. Once the tool is implemented, users (customers in this case) will be able to find health care providers uniquely suited to them and will receive automated help navigating CVS prescription offerings. The tool lets users help themselves more successfully, and its implementation has reduced costs and increased customer satisfaction.
Salesforce didn’t want to let the pandemic go to waste and set out to make the best of the experience. The company used the technology to help create its Einstein Chatbot capability, which has since jumped 706% in usage. This tool guides customers, based on experience and documented prior cases, down a self-help path to address their company’s problems. When given access to the tool, Salesforce found that customers enjoyed solving their problems themselves, getting faster turnaround and a better sense of control.
IBM Watson and its derivatives are nothing like Siri and Alexa. They are far more capable of analyzing your problem and either implementing a fix themselves or walking the user through a fix that probably wouldn’t have occurred to them. But to get the full benefit, users and buyers have to experience Watson for themselves to see what it can do.
This perception problem is the same for autonomous cars: to grasp what these cars will be capable of, drivers have to experience them and realize that what Tesla currently provides isn’t autonomous driving; it is enhanced cruise control. And just as I’m convinced that if people rode in a truly autonomous vehicle they’d love it (it’s far safer, and you can watch movies, nap, or read during the trip), I’m convinced they’ll love an AI that helps them complete tasks faster, with less risk and higher quality.
Watson isn’t Siri or Alexa; it is far more capable and more of a tool than a toy. For Watson to reach its potential, its image has to emerge from behind the far more common, inadequate digital assistants like Siri, and a larger audience has to see what its true potential is. It can change the quality of users’ lives by giving them the information they need when they need it. In short, to reach its potential, Watson needs to be far more visible. That’s coming, but the sooner people grasp the power of Watson, the sooner they will realize that it, and the class of tools it represents, will drive much of what we are calling the 4th Industrial Revolution.