The 2017 Global Consumer Pulse report by Accenture states that 61% of customers around the world and 78% in emerging markets such as India switch companies due to bad customer experience.
Since one of the primary objectives of deploying an Intelligent Virtual Assistant (IVA) solution is to enhance customer experience (CX) for enterprises, it is crucial to ensure that IVAs are equipped with the capabilities to deliver on that goal.
AI approaches today vary widely in how they understand human language and handle contextual responses. Although the technology is constantly improving, AI assistants do make errors. Natural language predictions can go wrong, and how we manage those scenarios is crucial to ensuring good CX.
As Conversational AI developers, we should not lead customers into failed conversations. Instead, a good IVA should understand user intent, keep the conversation on track, and ensure that the customer has a positive experience with the brand.
For Intelligent Virtual Assistants built on the Haptik platform, we work backwards to cover scenarios where a customer might get stuck or receive an irrelevant response. Enterprises often cannot predict customer expectations, which can hamper their CX over time, as end-users expect an effortless and smooth experience when interacting with a brand’s AI assistants.
What is Disambiguation?
Disambiguation means removing ambiguity by making something clear. In human conversation, when you find ambiguity in someone’s response, you prod them with a follow-up question. We have taken this first principle of human conversation and built it into the Haptik NLU engine.
Intent Mismatch Scenarios Solved with Disambiguation
In your customer conversation data, you will find instances of partial or incomplete messages sent to the Intelligent Virtual Assistant, from which the NLU (Natural Language Understanding) engine somehow has to interpret the customer’s full intent. This issue is even more common for conversational topics that your end customers find confusing. Let us take a look at a few interesting examples.

While interacting with an e-commerce IVA, a customer types “Price”. How would the IVA know whether the user wants the price of product A or product B?
Another example is a telecom company’s IVA, which could go two ways when a customer says “Recharge”. The two intents that match closely are:
1. Recharge failed
2. My Recharge
To resolve such ambiguity and enable our Conversational AI platform to respond accurately, we embedded disambiguation logic in our NLU core. The ability to disambiguate poorly constructed user messages can make IVAs work up to 10X better than traditional chatbot platforms.
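Haptik’s internal implementation is not public, but the core idea can be sketched in a few lines. The following is a minimal, illustrative example (the threshold, function name, and intent scores are all hypothetical): when the NLU’s top two intent scores are too close to call, the assistant asks a clarifying follow-up question instead of guessing.

```python
# Illustrative sketch of disambiguation logic (not Haptik's actual
# implementation): if the top two intent scores are within a margin,
# ask the user a clarifying question rather than picking one blindly.

AMBIGUITY_MARGIN = 0.10  # hypothetical threshold


def respond(scores: dict) -> str:
    """Pick an intent, or ask a follow-up when the match is ambiguous.

    `scores` maps intent names to NLU confidence scores in [0, 1].
    """
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (top_intent, top_score), (runner_up, runner_score) = ranked[0], ranked[1]
    if top_score - runner_score < AMBIGUITY_MARGIN:
        # Too close to call: disambiguate with a follow-up question.
        return f"Did you mean '{top_intent}' or '{runner_up}'?"
    return f"intent: {top_intent}"


# The "Recharge" example above: two intents match closely, so the
# assistant asks instead of guessing.
print(respond({"Recharge failed": 0.48, "My Recharge": 0.44, "New plan": 0.08}))
# → Did you mean 'Recharge failed' or 'My Recharge'?

# A clear winner needs no follow-up.
print(respond({"Recharge failed": 0.81, "My Recharge": 0.12, "New plan": 0.07}))
# → intent: Recharge failed
```

In production, the clarifying prompt would typically render the candidate intents as quick-reply buttons, and the user’s choice is fed back as the resolved intent.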
But how did we identify this problem in the first place? We observed a trend of frustrating, head-scratching conversations in the insights gathered from our AI Analytics system.