The experience is often painful. Again, the internal monologue starts: “Bet they’re in a service centre on the other side of the world, that’s why they’re not understanding what I’m saying”. “They’re probably up against some serious chat targets and are managing eight chat conversations at one time”. Then some general frustration: “Argh! Why is this taking so long?” and/or “It’s not that difficult to understand, I expressed myself very clearly”. Then, finally: “I wonder if I’m talking to a bot”.
This is a screenshot of a relatively recent online chat with Apple, undertaken in an attempt to avoid the quest that is securing a Genius Bar appointment.
Hmm. “Laughing! I am certainly human” was not the most convincing reply to my question, Apple.
By the way, nice try including the word ‘totally’ in your next response in an attempt to throw me off the scent.
I have no idea whether I was talking to a bot, and I’d hope that when chatbots are fully integrated into society, they won’t ‘lie’ and pretend to be human, but this particular experience certainly felt like it. I suppose there was some good practice in the way they repeated and confirmed that they had correctly understood the nature of my problem, but it felt too much, too perfect, too machine-like. I was trying to talk to an expert, and it didn’t feel like it. It felt forced. Suddenly I needed to know whether I was talking to a machine or to a human being. Why did this matter to me? Authenticity, I suppose. As a consumer, as a human, I want to feel in control and fully cognizant of what is happening to me, particularly when I can’t see the person to whom I am talking. The minute I start to doubt what’s happening, my trust in the company, in the brand, in the ability to fix my problem, begins to erode.
This isn’t the only ‘online chat’ I’ve had where I’ve genuinely wondered whether I was dealing with an automated chatbot or a human being. Why is that? Well, I’ve seen chatbots demonstrated at service conferences and they can be VERY convincing. I am fully aware of their abilities and potential. I can also imagine the lure of an automated, humanless ability to respond quickly to customers, providing businesses with the low-cost information that they desperately crave.
I wrote a blog last year about “The Intangible Balance Scorecard”. Both of the models I referred to, the “Elements of Value” model (Almquist, Senior and Bloch, 2016) and David Rock’s “SCARF” model, explore the difficulty of quantifying and harnessing the HUMAN elements of value, such as a “sense of belonging” and “fairness”: all essential parts of our service experience, and all elements which chatbots will find very difficult to replicate.
But to be honest, should they have to? If you look at the lower levels of the “Elements of Value” pyramid, a good chatbot can fulfil many of these functional needs very well. It should “inform”, “reduce effort”, “reduce hassles” and “save time”. The mistake may come when companies seek to recruit a chatbot to fulfil our emotional, life-changing and societal needs (give it time) and also, critically, when they aren’t being authentic about the experience.
So to summarise, I think there’s a real need for a capital C of candour to loom over the beginning of any chatbot discussion. I don’t mind talking to a machine as long as I know that I’m talking to a machine. Please don’t try to fake it. Possessing that knowledge appeals to my Status, Certainty, Autonomy, Relatedness and Fairness. When I have visibility of what’s actually happening, I’ll happily discuss my service requirements with you, Mr or Ms Chatbot, and I’ll thank you for your time and any support that you give.
Author: Sarah Lethbridge
Sarah Lethbridge is the Director of Executive Education at Cardiff Business School. Her role is to work with external organisations to design programmes of learning which employ the academic expertise of the Business School.
Sarah joined the Business School’s Lean Enterprise Research Centre in 2005. Since that time, she has worked on numerous lean projects in hospitals, universities and public and private services.
She has worked with the Ministry of Justice’s Lean Academy, the Value for Money team in the Home Office, Nestle, Legal and General and Principality Building Society to ensure that organisations approach lean in a holistic, sustainable way.