From simple robotics that speeds up a process, right through to intelligent AI that can learn for itself and make decisions, technology is reshaping our world and the way that businesses operate.
One of the frequent areas of debate is the extent to which AI will remove jobs. No one can know for sure, but my belief is that we won’t see ‘Armageddon’: while it will eradicate certain roles, it will also create new ones. In a customer service context, it will remove transactional processing roles (which are not really customer service anyway) and create more opportunities for people in roles that require higher order skills such as empathy, relationship building, problem solving and brilliant communication.
Organisations will need to integrate technology and human intervention in a blended way. Staff will therefore need to be technologically competent and have empathy for what a customer actually needs or wants. We will see a new role of customer experience and technology broker – which must be led by customer experience directors who are themselves attuned to both sides of the equation.
Research published by The Institute in July has already suggested that AI will not simply obliterate millions of jobs. A quarter (26%) of employees we surveyed said that AI had led to job losses – but slightly more (28%) said that it had also created new opportunities.
I think it is important that we don’t get too hung up on the jobs impact of AI. After all, throughout history we have seen how society reshapes, re-forms and regrows itself after disruption: the agricultural revolution, the industrial revolution, the advent of the computer, the creation of the internet – none of them simply wiped jobs out. Life went on and new jobs were created.
For me, what is equally important is that we are asking ourselves, right now while AI is still in its infancy, what it is that we actually want to achieve with it. If businesses are not very clear about their objectives for AI and how that links to the organisation’s values and purpose, it could quickly take them to places they didn’t want to go.
We have to remember that AI goes way beyond robotics. When you programme an AI application to do something, it won’t just confine itself to doing only that over and over again. The AI will learn for itself and start making its own decisions.
This means it’s essential the AI is set up in the right way from the outset – data sets that are incorrect or not relevant will simply be reinforced and magnified. So you don’t want to programme the machine with the instruction “help me sell more of these products.” You need to programme it with “help me understand how people are using my products and how we can help customers more.” The old adage “rubbish in, rubbish out” is amplified because the AI will simply learn bad habits, ceaselessly looking for and finding ways of selling all kinds of unsuitable services or products.
In other words, AI soon takes us back to the kind of moral and ethical questions that have been dominating the headlines recently around data privacy, personal security and trust.
Greater use of AI will bring with it a higher risk of scandals around ethics and authenticity. This could lead to customers becoming alienated and disengaging, or regulators becoming increasingly demanding and more prescriptive over dealings with vulnerable customers.
The question for customer experience directors and other members of the Board therefore needs to be not “can we build this?” but rather “what will we use this for and how will it improve the customer experience?”
Designed well, AI goes to the absolute heartland of customer service – helping organisations serve customers more effectively and giving them more choice. But customers are already somewhat ambivalent about AI and overall still prefer dealing with people, so trust needs to be built and sustained.
It’s only by showing customers that AI is really working in their interests that organisations will achieve the buy-in they need. The ones that get it wrong will risk losing customers and business amid a decline in trust and engagement.