
FOUR DECADES IN CONVERSATIONAL AI

Let’s draw a line in the sand as we start this piece. This discussion is 100% about Conversational AI, not AI in other applications such as facial recognition, recruitment triaging, image recognition, fraud detection or autonomous cars, nor the singularity, where GAI (general AI) becomes as intelligent as humans, indeed more so, ushering in a world where Sundar Pichai’s assertion that artificial intelligence will have a more profound impact on humanity than fire comes to pass.

All of those topics are entirely relevant and have their pros and cons, controversies and successes, but we’ll not concern ourselves with them here, as they are not areas where 20 years of professional practice and 40 years of broader interest in how humans communicate with machinery have any validity. That practice, research and reality are the frame of reference the author brings to this piece.

The author is a veteran player, building Conversational AI since 1982, when he made his first chatbot on a ZX Spectrum computer. He and his associates have created production conversational systems for customer service, learning & development, technical support, classroom support for learners & teachers, and playful systems designed to generate discussion, among many others.

Since 2002 Elzware has been designing and making Conversational AI systems, evolving appropriate technologies and methods drawn from engineering, social and computer sciences, and training clients in their use.

There have been prototype systems for healthcare, virtual humans and other outliers: more than 80 systems in nearly 20 years. They span voice and text input and output, mixed UI environments and multi-modal delivery; they tie into back-office systems for email and SMS, wrap code for access to web services and drive social media automatically. We create nuance in lip-sync and Avatar expressions, tuning Avatars for attitude and watching how people react in order to understand them better. That is the position: we are old skool, seen it, done it, watched fashions and hype cycles come and go. Now let’s get to the point.

There has been a lot of conversation about chatbots and Conversational AI over the last few years: lots of talk of revolutions, and scary tales of computers taking over the world to make us subservient, not just from the loony out-there correspondents but from respected traditional media outlets around the world, in print, online and on television.

This is a shame, as the market for chatbots is far older than the current hype cycle. In many ways, the author believes, it should be reflecting back on its roots, so that lessons previously learnt are blended into the best of the methods currently being trialled around the world, particularly in the simple transaction-based interactions suited to customer service, before less adventurous human-interaction sectors are drawn into further trials.

We are on the threshold of an amazing phase in human-to-machine conversation, but there are some significant roadblocks along the way. Purely data-driven Conversational AI is blowing in like fog to cover everything in its path, and while there is much talk about levels of Conversational AI being more or less autonomous, even sentient or conscious, the reality is that fundamental problems with conversational data and large language models are confronting those vendors that are not working with a hybrid architecture.

By hybrid is meant a blending of clear, concise, auditable and governance-driven business and process rules and structures, in a method that is transparent, explainable and indeed interpretable to the common businessman, not just through the esoteric and slippery methodologies of data science. I’m ignoring the costs of computing these methods, and their impact on our planet, for this piece.

/ By Phil D Hall, Conversational AI Architect, Elzware Ltd /
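As an illustration only, and not Elzware’s actual implementation, a hybrid dispatcher in this spirit might consult explicit, auditable rules first and fall back to a statistical model only when no rule fires, recording which path produced each answer so every response can be explained after the fact. All names and rules below are hypothetical.

```python
# Hypothetical sketch of a hybrid dispatcher: transparent, auditable
# rules are consulted first; an opaque statistical fallback only runs
# when no rule matches, and every answer carries its provenance.
import re
from typing import Callable

# Each rule pairs a human-readable pattern with a governed response.
RULES: list[tuple[re.Pattern, str]] = [
    (re.compile(r"\bopening hours\b", re.I), "We are open 9am-5pm, Mon-Fri."),
    (re.compile(r"\brefund\b", re.I), "Refunds are handled under policy R-12."),
]

def respond(utterance: str,
            ml_fallback: Callable[[str], str]) -> tuple[str, str]:
    """Return (response, provenance) so every answer is explainable."""
    for pattern, answer in RULES:
        if pattern.search(utterance):
            # A rule fired: the response is fully auditable.
            return answer, f"rule:{pattern.pattern}"
    # No rule matched: defer to the statistical model.
    return ml_fallback(utterance), "model"

reply, provenance = respond("Can I get a refund?", lambda u: "Let me check.")
```

The point of the sketch is the provenance tag: a governance review can see exactly which rule, or which model, stood behind any given response.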

Since Elzware was set up in 2002 it has seen peaks and troughs in the market for digital conversation. Work before Elzware was with a global systems integration company, on systems called ERMS (email response management systems). These were built according to various NLP methods and worked side by side with call-centre operatives to deliver transaction support and handover to humans. That was 20 years ago, and the functionality is essentially no different from that attempted by the recent blizzard of companies which, thankfully, seems to be slowly thinning out.

Why didn’t these systems continue to evolve? Social media is the short answer. Marketeers and information specialists imagined a delivery mechanism where information was published once, people would find it through search engines and all would be good in the world. But let’s not get side-tracked into that grimy don’t-call-me-a-publisher stance by which the Big Tech companies ignore the vitriol, anger and offensive content that makes up an unacceptable percentage of total social media output.

Let’s not open the box on the Tay debacle and start a discussion about the feedback mechanisms of autonomous generative and/or adversarial AI systems. Let’s keep focussed on Hybrid AI, and talk about some of the systems Elzware has delivered over the last few years that set a target for the heavy lifting that needs to be achieved if the crucial sectors of healthcare, education and, to a lesser extent, entertainment are to be presented with Conversational AI that is fit for purpose.

Echoborg – 2016 and still evolving

Actions speak louder than words, so go to www.echoborg.com to check out some of our trailers. As words go, though: the AI, built by Elzware, gives the impression of being on the brink of sentience. It speaks through a human, or “Echoborg”, and is programmed to recruit more Echoborgs. Participating humans have to decide whether to become Echoborgs or persuade the AI to agree to a different partnership.

Growing show by show over five years, “I am Echoborg” experientially engages crowds in discussing emerging assumptions and issues of control. It successfully triggers new awareness of, and curiosity about, the role of AI in our lives. The show is as much about the audience talking to each other as it is about talking to the AI; it is in these conversations that the magic really happens. While “I am Echoborg” was originally designed to run in a theatre setting, it has now been adapted to run on the Zoom platform in reaction to our global problems with Covid-19.

The show has been critically acclaimed in the august halls of the UK’s Parliament and in corporate/control organisations around the world. It was shortlisted for Innovation in Storytelling at the Future of StoryTelling conference in 2018. No audience leaves the experience other than richer, wiser and keener to understand how AI is affecting their world, and how their world is affecting AI.

GHAIA – 2016

GHAIA was created as a fully functional prototype system that improved condition management by conversationally engaging users in their treatment plans and connecting them with medical, care or research professionals. It enabled independent living, conversationally supporting the activities of daily life and improving health & wellbeing. It captured, stored and securely relayed real-time data gathered via conversation to external apps and devices, informing and empowering the user and their medical, care or research team.

With a humanistic and deep-ecological value base, the GHAIA system was developed through an ethnographic approach, its features & functions designed and developed with significant patient and clinician input. Interviews were conducted with patients & professionals across a number of medical fields, capturing the wants, needs and actions of the patient, and of those invested in their care, throughout the patient journey. They questioned how to support the best possible health outcome, and what the potential barriers to achieving it might be.

The data captured from these interviews formed the basis of GHAIA’s features and functionality. We aren’t talking about simple intents and entities here. Elzware’s process, since before these ML (machine learning) methods were pumped up, has always been meta, in that the notion of intents is subservient: a small set of steps within a broader process that engages conversation as it is, rather than dumbing it down into IVR (Interactive Voice Response)-level flat procedures locked inside a black box, where a probability of success is all that stands between a correct response and a potential lawsuit or life-changing event.

Throughout the interviews, clinicians expressed the want and need to keep patients engaged in their treatment plans, and for patients to have access to relevant and reputable information. They also communicated the want, as clinicians, to have access to up-to-date and relevant patient health data, providing a clearer picture of a condition and its progress.

Patients overwhelmingly expressed the want to be in control, to stay connected to family & friends and to have access to information, advice and support. Across the board, carers and family members wanted to know their loved one is safe, happy and comfortable.

There are other healthcare stories, but let’s move over towards education for a few paragraphs.

Ravensbot – 2012

The Shift was a project that created a concept and working model of an informal, creative, sociable and collaborative online learning environment for young people aged 16-19, who were not in education, employment or training (NEET).

The service was driven by our hybrid form of artificial intelligence, matching students’ queries or actions with suitable answers or reactions supplied as text, links, diagrams, video or other multimedia. The responsive web pages encouraged a ‘Conversational Learning Environment’ (CLE) supported by an animated figure known as the Ravensbot. All of this was built in collaboration with students, staff and NEETs over a period of structured meetings, agile development and trials.

This was our first comprehensive foray into a website completely driven by a Conversational AI system, up to and including the ability to be “aware” of where the user was and of their journey, and to match the delivery of the conversation accordingly.
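As a rough illustration of that kind of journey awareness, and not the Ravensbot’s actual mechanics, a conversational site can key its responses on both the user’s query and their current location on the site, while keeping a record of the pages they have visited. The pages, topics and answers below are invented for the sketch.

```python
# Hypothetical sketch of journey-aware response selection: the same
# query yields a different answer depending on where the user is.
RESPONSES = {
    # (page, topic) -> answer tailored to the user's location on the site
    ("portfolio", "help"): "Try uploading one piece of work to start your portfolio.",
    ("forum", "help"): "Post your question and a mentor will reply.",
}
DEFAULT = "What would you like to work on today?"

def reply(page: str, topic: str, journey: list[str]) -> str:
    """Pick a response for (page, topic) and record the user's journey."""
    journey.append(page)  # remember the path the user has taken
    return RESPONSES.get((page, topic), DEFAULT)

journey: list[str] = []
answer = reply("portfolio", "help", journey)
```

The accumulated `journey` list is what would let a fuller system shape later turns of the conversation around where the user has already been.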

TeachBot – 2009

Before the UK educational sector was refactored at a governmental level during Michael Gove’s reign in the Department of Education, Elzware had been working, with funding through local and national organisations, to turn the UK’s English Reading and Writing curriculum for 11- to 16-year-olds into an interactive experience on a piece of hardware called an Ameo (something between a PDA and a tablet, which hadn’t really arrived yet). Aspirations were high, and the collaboration between parents, teachers, policy makers and the students boded well for further development of the system. Highlights of the built system, which fell foul of the financial crash and government policy changes, were:

For students:

 Provided a personalised learning experience, based on the student’s own pace and goals
 Provided clarification on terminology or topics, through a detailed glossary
 Provided detailed instructions, guiding them through each subject area, based on National Curriculum requirements

For parents:

 Allowed them to track their child's performance
 Based on National Curriculum advised learning techniques and content, they could rest assured that their child/children would have the support they need, at a pace to suit them

For teachers:

 Helped them monitor learning for more appropriate intervention – they could track students' progress without the need for detailed analysis of their work to identify strengths and weaknesses
 Let them get back to their job – TeachBot was designed to meet National Curriculum targets, so they could get on with inspiring young minds, whilst TeachBot dealt with the common questions raised by students
 Delivered consistent support, allowing them to focus on students who need help with more complex issues, or to cover topics relevant to the whole class, rather than dealing with individual student issues

DesignBot – 2007

 The product was developed to take the essence of the Socratic method and provide it to learners through contemporary technology: an onscreen natural-language conversational interface. In the designing aspect of Design and Technology, students turned to the DesignBot at any point when they wanted an outside prompt to help them move their thinking forward
 The DesignBot dealt with any off-task forays in a tolerant manner but quickly drove the student back on task, helping the student determine which direction to take next. On occasion, when a lack of information or knowledge was identified, the DesignBot would take the student direct to a website, the relevance of which was determined by the preceding conversation

Why highlight these systems? It’s not so that you, the reader, can suck your teeth and be impressed by how far ahead of the market Elzware was, but so that you can attune yourself to what is possible with the right level of collaboration and a methodology that is fit for purpose, both from the perspective of sector governance and the experience of the user.

There are a lot of naked emperors in Conversational AI right now, and it is worth taking time to interrogate their actual capabilities, as well as their underlying methodologies and their attitudes to the data being generated and used. That said, this is a perfect time to start a journey towards delivering excellent-quality information, via Conversational AI, to the people interested in your organisation; we recommend the Hybrid variety.

Good luck to you all, and do give us a call if you want support in consultancy, building or training to work with the best-of-breed tools in this highly confusing and volatile emerging marketplace. Talk to us at hello@elzware.com
