
Conversational AI Continuing to Mature for Customer Service 

As more companies use conversational AI bots as a front line of customer service, the experience for consumers can be either helpful or maddening.

Ganesh Gopalan, the CEO and co-founder of conversational AI vendor Gnani.ai, told EnterpriseAI that he understands those reactions, but insists that in another two to three years, conversational AI bots will be much improved and inspire more confidence among consumer and business users.

Early conversational AI systems were built around keywords which, when mentioned by a consumer caller, would invoke a related response, said Gopalan. The problem was that this process often lacked context and did not adjust for regional dialects or casual wordings. That could be frustrating for users: when the bot did not understand them, they were transferred to a live agent and had to repeat everything they had already shared with the bot. Maddening.
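
As an illustration of that keyword-driven design, the minimal sketch below (assumed, not any vendor's actual code) shows how a bot keyed on exact phrases handles a literal request but hands casual wordings straight to a human agent:

```python
# Illustrative sketch (not any vendor's actual code) of the keyword-triggered
# design described above: an utterance that contains an exact keyword gets a
# canned response, and anything else falls through to a live agent, discarding
# whatever context the caller has already given.

KEYWORD_RESPONSES = {
    "appointment": "Sure, what day works best for your appointment?",
    "test drive": "Which model would you like to test drive?",
    "policy status": "Please hold while I look up your policy status.",
}

def keyword_bot(utterance: str) -> str:
    text = utterance.lower()
    for keyword, response in KEYWORD_RESPONSES.items():
        if keyword in text:
            return response
    # No keyword matched: hand off to a human, and the caller starts over.
    return "Let me transfer you to an agent."

print(keyword_bot("I'd like to book a test drive"))         # exact keyword hit
print(keyword_bot("Can I just check out a Ford Mustang?"))  # intent missed
```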

As of late 2021, the success of conversational AI for consumers depends on what they are asking it to do for them, said Gopalan.

“If you are going to book an appointment, that works today,” he said. Even booking a test drive for a car and other tasks can be done using bots, he added. But problems can arise if the customer’s request does not meet the expectations of the AI bot.

Ganesh Gopalan, CEO of Gnani.ai

“If you are looking for a test drive at a car dealer, you could say that in a million ways,” said Gopalan. “You could say you want a test drive, but you could also say ‘I want to just check out a Ford Mustang.’”

Unless the conversational AI of the bot is trained for such variations, it will be unable to understand what the caller is requesting, he said. “The way we currently code this stuff is you come up with the initial set of ways, and then feed it to natural language processing algorithms. They would generate similar-sounding sentences, but that may not cover all the options for all regions, and there could be some peculiarities with certain use cases.”
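
Gopalan is describing a seed-and-augment workflow for intent training. The sketch below is a minimal illustration of that idea, not Gnani.ai's implementation; `generate_paraphrases` is a hypothetical stand-in for whatever paraphrase model a vendor would actually call, and the intent name and phrases are made up.

```python
# Illustrative sketch of the seed-and-augment workflow described above:
# a developer writes a handful of phrasings for an intent, then an NLP model
# expands them into more training sentences. Coverage of regional or unusual
# wordings is only as good as the seeds and the model that expands them.

SEED_UTTERANCES = {
    "book_test_drive": [
        "I want a test drive",
        "Can I schedule a test drive?",
        "I want to just check out a Ford Mustang",
    ],
}

def generate_paraphrases(sentence: str, n: int = 5) -> list[str]:
    """Hypothetical paraphrase generator; a real system would call an NLP model here."""
    return [f"{sentence} (variant {i + 1})" for i in range(n)]

def build_training_set(seeds: dict[str, list[str]]) -> dict[str, list[str]]:
    """Expand each intent's seed phrases with generated paraphrases."""
    training = {}
    for intent, phrases in seeds.items():
        expanded = list(phrases)
        for phrase in phrases:
            expanded.extend(generate_paraphrases(phrase))
        training[intent] = expanded
    return training

# 3 seeds + 5 paraphrases per seed = 18 candidate training sentences,
# still far short of the "million ways" a caller might phrase the request.
print(len(build_training_set(SEED_UTTERANCES)["book_test_drive"]))
```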

On the other hand, for many customer service calls today, conversational AI bots work well about 70 percent of the time, he said.

“I think all the routine stuff can definitely be done today,” Gopalan said. “If somebody is calling an insurance company to know about their policies or their status and stuff, all that can be done. But if a customer is calling to complain about their claim not being covered and they want somebody to just listen on the other end, I do not think that is completely ready today.”

So when might more reactive and understanding conversational AI become possible?

“In a couple of years, I think that it should be getting better and better,” he said. “What happens is the system learns from the mistakes, so maybe you start with a system that works 80 percent of the time. The problem is when companies don't fix the remaining 20 percent, then the system gets messed up. You need a learning system, and you need NLP or a company like us to focus on specific use cases.”
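A minimal sketch of the learning loop Gopalan describes might log the turns the bot could not handle so they can be reviewed and folded back into training. The confidence threshold, file name, and function below are illustrative assumptions, not details of any specific product.

```python
# Minimal sketch of the "learning system" idea described above: turns the bot
# cannot handle with confidence are escalated and logged for human review, so
# the failing slice of traffic can be labeled and folded back into training.
import json

REVIEW_QUEUE = "misunderstood_utterances.jsonl"
CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff below which the bot escalates

def handle_turn(utterance: str, predicted_intent: str, confidence: float) -> str:
    if confidence < CONFIDENCE_THRESHOLD:
        # Record the miss so the remaining failures can be reviewed and fixed.
        with open(REVIEW_QUEUE, "a") as f:
            f.write(json.dumps({
                "utterance": utterance,
                "predicted_intent": predicted_intent,
                "confidence": confidence,
            }) + "\n")
        return "Let me connect you with an agent."
    return f"Proceeding with intent: {predicted_intent}"

print(handle_turn("check my policy status", "policy_status", 0.92))
print(handle_turn("my claim was denied and nobody will listen", "unknown", 0.31))
```
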

That kind of customization for each use case is critical to enable accurate and usable conversational AI across a wide range of uses and industries, said Gopalan.

“Everything has to be customized to some level,” he said. “Just taking something off the shelf and trying to put two things together, that is not going to work. You are not going to anticipate everything that is going to happen, and the models that generate new sentences are not going to generate everything you will need.”

And even as conversational AI continues to mature and improve, it will never be able to entirely replace human interactions with a live customer service agent, said Gopalan.

“People are going to call in and complain, or they are going to have complicated problems to be solved,” he said. “You still are always going to have human beings doing that customer service work.”

Rob Enderle, principal analyst with Enderle Group, told EnterpriseAI that the technology has been around for some time, but that companies including Gnani.ai are working to help reduce its costs and offer it to more enterprises.

“The technology is relatively mature, but rather pricey in its mature form,” said Enderle. “IBM Watson has been used in production for insurance sales, and it was so believable some of the men who were called attempted to ask the virtual woman they were talking to out [on a date]. The issue is no longer the performance of the hardware, but the level of effort it takes to train it.”

That training has been highly labor- and cost-intensive, said Enderle, and the hope is that the next generation of neuromorphic computers will reduce the related training time and costs.

“Conversational computing can be implemented successfully today with a big enough budget, and the industry is working to get that cost down to something that is far more reasonable,” he said. “Once they do that, likely in the next three to five years, and implement it as a cloud service, this technology should become far more commonplace.”

The conversational AI market “represents the next big step in the man-machine interface,” said Enderle. “Once mature, it promises to change dramatically how we interact with computers, putting far more of the communications load on them. This capability makes conversational computing a critical step to creating the future of computing and thus one of the most critical efforts currently under development.”

Gnani.ai recently announced a partnership with global contact center and business process outsourcing vendor Transcosmos to offer its conversational AI product suite and services through Transcosmos’ call center network. The move aims to help Gnani.ai expand its North American presence, the companies said.

In July, AI-powered conversational intelligence platform Chorus.ai was acquired by ZoomInfo for $575 million. Chorus.ai uses machine learning to surface insights from traditionally untapped data and workflows inside companies, helping them boost their sales efforts.

 
