The first instalment in our new blog series, “Chatbots that don’t suck”. In this post, our team looks at indicators of a smart chatbot that tend to show up—or go missing—during the first month after deployment.

Do us a favor: open Google and type in “chatbots suck”. If those search results are any indication, value still lags far behind the hype for this popular technology. Still, the potential for successful, high-value chatbot deployments is there, as are the use cases.

In the spirit of finding examples of chatbots that don’t suck, and explaining why, we’ve pulled our experts together for a new blog series. Welcome to the first edition of “Chatbots that don’t suck”, where our team takes you behind the scenes of our chatbot deployments for some of Australia’s largest organisations, with some of its busiest contact centres.

The client: The Department for Education for South Australia

The Department for Education provides a range of integrated education, training, health and child development services to benefit children and young people in South Australia. One of the department’s key roles is to register, onboard and employ the thousands of teachers who work in the state’s schools.

The challenge: Lots and lots of “low-touch” questions  

With more than 800 public schools and preschools in South Australia, the recruitment process is a big undertaking. Each year, thousands of applicants flood the system with clicks, uploads, and—of course—questions. For example:

  • “How do I become a registered teacher?”
  • “When can I apply?”
  • “When do applications close?”
  • “Where do I find this form?”
  • “Where is my application up to?”

The same simple questions, over and over and over again. And it’s not just applicants: practising teachers constantly contact the Department with similar queries. Anecdotally, we’ve come across entire Facebook groups dedicated to helping teachers navigate the process.

To its credit, the Department for Education has acknowledged this high-volume scenario and taken steps to streamline things: to create a better experience for teachers and to maximise the productivity of the internal team fielding the enquiries.

The team set out to find a vendor capable of adding another communication channel. So we built Andie, an AI-powered chatbot who now serves as an extension of the Department for Education’s internal support team.

We could spend all day talking about Andie, but instead we thought we’d share a lesson from her development.

[Image: DfE chatbot Andie on laptop and mobile devices]

The lesson: A chatbot’s first month tells you all you need to know

There are two ways to build software: 

  1. Slow, comprehensive and exhaustive.
  2. Quick and agile.

We’re big believers in number two. That means developing a viable chatbot prototype in a short time frame. Basically, we want to deliver a proof of concept in a matter of weeks. Or, in the case of SA Health’s chatbot Zoe, six days.

The sooner we can get a chatbot onto a client’s website, the sooner we can start gauging its effectiveness, then build up and add content using a staged, data-driven approach.

And so much of that valuable data is generated in the first month of a chatbot’s lifespan.

What Andie taught us is that there’s no one better to help build a chatbot for teachers than the teachers themselves. So we embraced agile project methodologies, stood Andie up, and started iterating based on the input and feedback we were receiving during that first month.

Within the first week, for example, Andie received questions she couldn’t answer. Our solutions team made recommendations based on those questions and released new content so Andie could answer them. A new chatbot iteration, and an upskilled Andie, in a very short time.

By the end of that first month, we’d made numerous updates based on the constant influx of teacher enquiries. Andie was consistently giving teachers what they needed, especially answers to those common low-touch questions that don’t require a call to the contact centre.

Looks like Andie’s future is quite bright.

In closing: Customers tell you what they need (not the other way around)

Customers today are more insistent than ever, with higher expectations of customer service and less tolerance for dumb chatbots. We can’t waste time trying to guess what they need. Instead, we need to get them involved at the outset and then build on our solutions using an agile approach.

It’s the perfect model for chatbots, which are constantly receiving real input from real customers. Why not iterate based on what those customers are telling you, rather than dusting off your hands and declaring, “It’s perfect. It’s good to go”?

That’s exactly how we built Andie.

Andie is built for adoption because her knowledge base is built on what South Australia’s public school teachers actually need—in their own words! So whether you’re working on a chatbot, building some software, or considering launching your own chatbot to support your customer team, remember that a tremendous amount of actionable data is created during that first month of deployment.

The trick is getting to that first month with a viable chatbot solution that you can quickly iterate on.

Clevertar is an award-winning Australian company that provides chatbots for customer engagement, specialising in virtual characters as a conversational interface. The company was born out of research showing that conversational virtual characters are influential and engaging, and Clevertar has developed a software platform for the creation and delivery of chatbots utilising this innovative technology.
