The first instalment in our new blog series, “Chatbots that don’t suck”. In this post, our team looks at indicators of a smart chatbot that tend to show up—or go missing—during the first month after deployment.
Do us a favour: open Google and type in “chatbots suck”. If those search results are any indication, value still lags far behind the hype for this popular technology. Still, the potential for successful, high-value chatbot deployments is there, as are the use cases.
In the spirit of finding examples of chatbots that don’t suck, and explaining why, we’ve pulled our experts together for a new blog series. Welcome to the first edition of “Chatbots that don’t suck”, where our team takes you behind the scenes of chatbot deployments for some of Australia’s largest organisations and busiest contact centres.
The client: The Department for Education, South Australia
The Department for Education provides a range of integrated education, training, health and child development services to benefit children and young people in South Australia. One of the key roles of the department is to register, on-board and employ the thousands of teachers that work in the state’s schools.
The challenge: Lots and lots of “low-touch” questions
With more than 800 public schools and preschools in South Australia, the recruitment process is a big undertaking. Each year, thousands of applicants flood the system with clicks, uploads, and—of course—questions. For example:
- “How do I become a registered teacher?”
- “When can I apply?”
- “When do applications close?”
- “Where do I find this form?”
- “Where is my application up to?”
The same simple questions, over and over and over again, flooding the Department’s channels. Anecdotally, we’ve even come across entire Facebook groups dedicated to helping fellow teachers navigate the process.
To its credit, the Department for Education recognised this high-volume scenario and has taken steps to streamline the process: a better experience for teachers, and more productive use of the internal team fielding the enquiries.
The Department set out to find a vendor that could add another communication channel. That’s where we came in: we built Andie, an AI-powered chatbot who now serves as an extension of the Department for Education’s internal support team.
We could spend all day talking about Andie, but instead we thought we’d share a lesson from her development.
The lesson: A chatbot’s first month tells you all you need to know
There are two ways to build software:
- Slow, comprehensive and exhaustive.
- Quick and agile.
We’re big believers in number two. That means developing a viable chatbot prototype in a short time frame. Basically, we want to deliver a proof of concept in a matter of weeks. Or, in the case of SA Health’s Zoe, six days.
The sooner we can get a chatbot onto a client’s website, the sooner we can start gauging effectiveness, then build up and add content using a staged, data-driven approach.
And so much of that valuable data is generated in the first month of a chatbot’s lifespan.
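As a rough illustration of what that data-driven gauging can look like, here’s a minimal sketch of a first-month containment report over a conversation log. This isn’t Andie’s actual platform or reporting stack; every name and field in it is hypothetical.

```python
from collections import Counter

# Hypothetical conversation log: each record notes whether the chatbot
# resolved the enquiry or fell back to a human handover.
conversations = [
    {"question": "How do I become a registered teacher?", "resolved": True},
    {"question": "When do applications close?", "resolved": True},
    {"question": "Can I transfer my interstate registration?", "resolved": False},
]

def containment_rate(log):
    """Share of enquiries the chatbot answered without a handover."""
    resolved = sum(1 for c in log if c["resolved"])
    return resolved / len(log) if log else 0.0

def top_questions(log, n=5):
    """Most frequent enquiries, resolved or not, to guide staged content releases."""
    return Counter(c["question"] for c in log).most_common(n)

print(f"Containment rate: {containment_rate(conversations):.0%}")
print("Top questions:", top_questions(conversations))
```

Numbers like these, reviewed week by week, are what let us decide which content to build next rather than guessing it all up front.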
What Andie taught us is that there’s no one better to help build a chatbot for teachers than the teachers themselves. So we embraced agile project methodologies, stood Andie up, and started iterating based on the input and feedback we were receiving during that first month.
Within the first week, for example, Andie was already receiving questions she couldn’t answer. Our solutions team reviewed those questions, made recommendations, and released new content so Andie could answer them. A new chatbot iteration, and an upskilled Andie, in a very short space of time.
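In practice, that loop can be as simple as capturing every question the bot couldn’t answer and ranking the most common ones for the next content release. The snippet below is only a sketch of that idea, under our own assumptions; the function and field names are illustrative, not Andie’s actual implementation.

```python
from collections import Counter

# Hypothetical fallback log: questions the chatbot couldn't match to any answer.
unanswered = [
    "Where do I upload my working with children check?",
    "Can I apply for a country school placement?",
    "Where do I upload my working with children check?",
]

def content_recommendations(fallback_log, min_count=2):
    """Group unanswered questions and flag the ones worth writing new content for."""
    counts = Counter(q.strip().lower() for q in fallback_log)
    return [(question, n) for question, n in counts.most_common() if n >= min_count]

for question, n in content_recommendations(unanswered):
    print(f"{n}x  {question}  -> candidate answer for the next release")
```

The point isn’t the code, it’s the cadence: questions come in, the gaps get counted, and new content ships while the chatbot is still young.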
By the end of that first month, we’d made numerous updates based on the constant influx of teacher enquiries. Andie was consistently giving teachers what they needed, especially answers to those common low-touch questions that don’t require a call to the contact centre.
Looks like Andie’s future is quite bright.
In closing: Customers tell you what they need (not the other way around)
Customers today are more insistent than ever, with higher expectations of customer service and lower tolerance for dumb chatbots. We can’t waste time trying to guess what they need. Instead, we need to get them involved at the outset and then build on our solutions using an agile approach.
It’s the perfect model for chatbots, who are constantly receiving real input from real customers. Why not iterate based on what those customers are telling you, rather than dusting off your hands and declaring, “It’s perfect. It’s good to go”?
That’s exactly how we built Andie.
Andie is built for adoption because her knowledge base is grounded in what South Australia’s public school teachers actually need, in their own words! So if you’re building software, working on a chatbot, or considering launching one to support your customer team, remember: there’s a tremendous amount of actionable data created during that first month of deployment.
The trick is getting to that first month with a viable chatbot solution that you can quickly iterate on.
Explore AI with us
It’s never too soon or too late to explore the world of artificial intelligence!
We’re keen to discuss how AI could solve your business’s everyday problems, and how those solutions might positively impact your customers.
Please leave a message and we’ll get back to you soon.