Use of AI in Customer Support


Use of AI is the big thing right now, and that includes using it within customer support.

And although many are convinced that this is something we should be adopting with urgency, I’m more cautious.

First, let me give a few examples of where this caution comes from, starting with a tale from social media…

My daughter, who has had a degree in computer science for 25 years, posted this observation about ChatGPT on Facebook. It’s the best description I’ve seen:

Drew Kadel

Next, an example that was posted demonstrating how ChatGPT has gender stereotyping, well, built in…

It’s not surprising really, as it’s learning everything it needs from the very imperfect internet. Remember those past AI experiments that had to be shut down because, the more they learnt, the more racist and right wing they became?

The stark reality is that AI, right now, isn’t perfect. As well as all of the above, it has “hallucinations”, where it creates its own “facts”. Not quite sure about this?

Here’s what happened when I just asked OpenAI “tell me what you know about David Artiss”…

David Artiss is a British YouTuber and social media personality, best known for his YouTube channel DavidArtiss.com, which features tech reviews, DIY videos, and quirky challenges. He began his channel in April 2006 and has since grown to over 1.3 million subscribers. Artiss also produces and stars in his own web series, “Tech in my Life,” which examines how technology is used in everyday life. Aside from his own channel, Artiss has also collaborated with other renowned YouTubers, such as OMG Its Breadstick, FunForLouis, and RegularCarReviews. Away from YouTube, Artiss is also an author, having released two books and written for an array of publications.

Maybe it’s referring to another David Artiss, right? There’s more than one, but none that I’m aware of that this profile fits. The fact that there is no YouTube channel named “DavidArtiss.com” (and it doesn’t exist as a domain either) should be a clue. And, no, there isn’t a YouTuber named “OMG Its Breadstick”. It’s all a fantasy.

Let’s make it easier. I’ll ask about my youngest daughter, Lucie, as she has a unique name.

Lucie Artiss is a British YouTuber, best known for starring in the DavidArtiss.com channel, which is run by her father, David Artiss. She also has her own channel, LucieArtiss, which features lifestyle content such as beauty, music, and travel. In October 2018, she launched her own podcast, “Chats with Lucie.” She has also collaborated with other popular YouTubers, such as Zoe Sugg, Jim Chapman, Louise Pentland, and Dr Ali Abdelaziz.

Well, it gets it correct that I’m her father. And she has a YouTube channel, although neither of us would consider her a “YouTuber”. The rest, again, is totally made up.

So, I’m not saying we shouldn’t be using AI… but we should be cautious about where and how.

One area that would be ideal for AI is better understanding the customer’s needs. For example, if you have an FAQ system or, like many online customer support channels, try to surface existing answers before allowing the customer to contact you, then using an AI model to work out what they’re looking for is a great fit. A traditional search will only help so much, as most work by simply looking for keywords. AI could work out the customer’s intent – what do they specifically want, and which of our answers best matches that? Then you’re taken straight to the document.

Let’s try a real-world example, one close to home. If I search the VIP documentation site for “why doesn’t the plugin wp all imports work on my site”, it gives me no answers. But an answer does exist. A standard search doesn’t find it because it doesn’t understand what I’m asking. AI could improve this.
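The gap between keyword and intent matching can be sketched in a few lines of Python. This is purely illustrative – a real system would use an embedding model rather than bag-of-words similarity, and the document titles here are hypothetical, not real VIP documentation – but it shows why an exact-phrase search misses the document while even a crude similarity score surfaces it.

```python
import re
from collections import Counter
from math import sqrt

# Hypothetical support-doc titles (assumptions, not real VIP docs).
DOCS = [
    "WP All Import: known limitations",
    "Configuring caching for high-traffic sites",
    "Debugging plugin activation errors",
]

STOP = {"why", "doesn", "t", "the", "on", "my", "a", "for", "of", "to"}

def tokens(text):
    # Lowercase, split on letter runs, crudely singularise, drop stop words.
    words = [w.rstrip("s") for w in re.findall(r"[a-z]+", text.lower())]
    return [w for w in words if w not in STOP]

def cosine(a, b):
    # Cosine similarity between two bag-of-words vectors.
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

query = "why doesn't the plugin wp all imports work on my site"

# Exact-phrase keyword search finds nothing...
phrase_hits = [d for d in DOCS if query.lower() in d.lower()]

# ...but similarity over shared vocabulary surfaces the right document.
best = max(DOCS, key=lambda d: cosine(tokens(query), tokens(d)))
print(phrase_hits, best)
```

The phrase search returns an empty list, while the similarity ranking picks the “WP All Import” document – it shares “plugin”, “wp”, “all” and “import” with the query even though the exact wording differs. An embedding model does the same thing far more robustly, matching on meaning rather than on overlapping words.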

And it’s at the next stage that you shouldn’t be using AI right now: the document itself should be written by a person. Don’t serve up AI-generated answers unless you’re happy that they may, sometimes, “hallucinate” when communicating with the customer – or be horribly stereotyped.

And let’s not forget that products such as ChatGPT work by using Large Language Models – they learn from what’s already been written. You need new, original content for them to learn from. Scaling back on human-written content isn’t a solution, as that means the AI won’t learn anything new, or anything that’s changed – you still need the content regardless, but you need a better way to deliver it.

It reminds me of a UK driving school that decided to no longer take on trainee instructors, as they felt it was unfair for pupils to be given instructors who are not fully qualified. You see, as a trainee you need “real world” experience to become a full instructor – if everybody followed the lead of this driving school, we wouldn’t have any future instructors. And it’s the same here: by replacing a real person writing answers with AI, you’re eliminating the future content that the AI needs. It seems a great thing now but won’t be in the long term.

Don’t make AI the latest fashionable false panacea – something you’ll regret further down the line. As this cartoon so neatly demonstrates, when it comes to using AI for coding, be careful what we end up gaining.

