Identify the call for help: developing a language app for emergencies
ALAN GIFFORD
Volunteer, Glamorgan Spring Bay State Emergency Service, Tasmania
Glamorgan is a popular touring destination for international travellers, creating a risk of road crash incidents involving casualties who do not speak English.

As Graeme was preparing for bed, his BART phone alert and his pager shattered the quiet of the night. Instead of its occupants turning in, the house came to life. Lights came on. The kids stirred, and Graeme’s wife anxiously wanted to know what was happening.
Hurrying for his personal protective equipment hanging in the cupboard by the front door, Graeme fumbled with his phone to see what the call-out was all about. It was as he feared. A late-night road crash rescue. At this time of night, it was bound to be serious. Two vehicles on a country road 20 minutes away. Badly injured travellers trapped in one vehicle and needing extrication. The other vehicle was down an embankment. An occupant was reported out of the car. Nothing more.
Under lights and sirens, emergency responders raced to the scene and into action. Firefighters quickly assessed the scene for fire hazards; spilt fuel across the road posed a major risk. Vehicle body panels and trim littered the road. Paramedics gathered around the vehicles attempting to assess the condition of the casualties. Police took control of the scene for the safety of oncoming motorists and responders alike. State Emergency Service (SES) crews fanned out looking for scene hazards, downed powerlines and casualties that might have been thrown from the vehicles. Equipment was laid out in preparation for an extrication.
Vehicle radios crackled with sitreps and radio traffic. The sirens of approaching emergency vehicles cut the chilled night air. Word spread that a chopper was on its way.
Finally, with the windows removed, a paramedic and the SES crew leader spoke to the casualties. They were dazed and uncomprehending. The driver was unconscious and bleeding heavily. The other passengers were clearly injured and in great pain, but their injuries were not immediately obvious.
It was only then that the crews realised: none of the casualties spoke English.
All responders face this issue sooner or later: international visitors or resident migrants with little or no English are caught up in an incident and cannot communicate with their rescuers.
After one such incident attended by a Glamorgan Spring Bay SES response crew, it was apparent that the response had been seriously impaired by the crew’s inability to recognise or identify the travellers’ spoken language. Translation would undoubtedly have helped, but no one could tell which translator was required, and mobile phone connectivity to a translation service would have been compromised by the remoteness of the incident site. Three-way interpretation or translation in medical and first aid situations can be dangerously unreliable, and the circumstances often demand rapid assessment and rapid first aid.
A different approach was needed.
Crews had made several attempts to communicate with non-English speaking casualties using signage, flip-cards, world maps, national flags – and simply smiling a lot and pressing on regardless – but all had failed. To find a solution, the unit approached the University of Tasmania (UTAS) Information and Communication Technology (ICT) Department for assistance.
In late 2019, a proposal for a research project was submitted to the UTAS ICT Department, which was enthusiastically accepted. The following year, a team of six graduate and undergraduate students, under staff supervision, began their work. Notwithstanding the growing threat of COVID-19, the project quickly took shape to create an app that could be used by emergency responders to communicate with casualties during an emergency incident.
The app, downloaded to a suitable platform, had to:
- identify the spoken language from an eight to ten-second speech sample given by a non-English speaker; the content of the sample was of no significance, but reliability had to be high: 90% accuracy or better
- use the casualty’s own identified spoken language to convey preloaded information, instructions and medical assessment questions
- display all statements and questions on the platform’s screen in English, to allow for selection by the responder as required, while providing the spoken word in the identified language
- ensure that every statement and question was carefully worded and culturally sensitive, with no ambiguities; medical questions had to be precise and appropriate
- provide a yes/no answer format for each question, shown in English and in the casualty’s language, which could be recorded by the responder for later recall; there is no need for casualties to speak in their own language
- operate without the need for casualties to read or type text; the only vocalisation required of the casualty is the initial speech sample
- provide for all questions and their answers to be recorded in English for later recall if required (for example by paramedics on scene, hospital staff, police and report writers)
- be easy to operate in high-stress environments and allow for the limited digital technology competencies of some operators
- provide for rigorous testing, regular updates and improvements.
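To make the intended workflow concrete, the following is a minimal, purely illustrative sketch in Python of the responder-facing sequence described above: capture a short speech sample, identify the language, then step through preloaded English questions while the device speaks them in the identified language and records yes/no answers. None of these function or variable names come from the actual app; the capture, identification and playback steps are stand-in placeholders only.

```python
"""Illustrative sketch only: hypothetical names, not the real app's code."""
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical preloaded content; the real app stores vetted, culturally
# reviewed questions with pre-translated audio for each supported language.
PRELOADED_QUESTIONS = [
    "Are you in pain?",
    "Can you move your arms and legs?",
    "Is anyone else travelling with you?",
]

@dataclass
class AssessmentRecord:
    language: str
    # Each entry: (question in English, "yes"/"no"/None, UTC timestamp)
    answers: list = field(default_factory=list)

def capture_speech_sample(seconds: int = 10) -> bytes:
    """Stand-in for recording an 8-10 second sample from the casualty."""
    return b""  # device-specific audio capture would go here

def identify_language(sample: bytes) -> str:
    """Stand-in for the language-identification model (target: 90%+ accuracy)."""
    return "German"  # model inference would go here

def speak_in_language(question_en: str, language: str) -> None:
    """Stand-in for playing the pre-translated audio of a question."""
    print(f"[{language}] (plays translated audio of: {question_en})")

def run_assessment() -> AssessmentRecord:
    # The initial speech sample is the casualty's only required vocalisation.
    language = identify_language(capture_speech_sample())
    record = AssessmentRecord(language=language)
    for question in PRELOADED_QUESTIONS:
        # The responder sees the question in English; the casualty hears it
        # in their own language and answers yes/no without reading or typing.
        speak_in_language(question, language)
        answer = input(f"{question} [y/n/skip]: ").strip().lower()
        record.answers.append(
            (question, {"y": "yes", "n": "no"}.get(answer),
             datetime.now(timezone.utc).isoformat())
        )
    return record  # everything is stored in English for later recall

if __name__ == "__main__":
    print(run_assessment())
```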
In emergency situations, any device supporting the app had to be rugged, capable of being dropped without damage, rain-proof, easy to operate by responders wearing safety gloves, fitted with filters to exclude extraneous sounds, and simple to navigate. A tablet device housed in a robust protective case was selected as most suitable.

PHOTO: ALAN GIFFORD. Glamorgan SES Unit and the University of Tasmania collaborated to develop a language identification app for emergency situations.
The development team chose affordable, locally available electronic hardware for installation in emergency vehicles. High-gain antennae and wi-fi were used to overcome connectivity issues.
After two years of development, trialling and demonstration, the first phase of the app is complete. It can accurately and reliably identify 12 languages, representing 80% of the languages spoken by international visitors to Tasmania. Future development is proposed to increase this number. An exciting possibility is software that identifies area-specific languages, for example in regions with high concentrations of particular language groups or where many Indigenous languages are spoken.
The app identifies languages with an impressive 93% accuracy. Currently, identification takes between 30 and 90 seconds. The second phase of development aims to reduce this to around 20 seconds.
The embedded questions and information statements are comprehensive and have been approved by emergency responders, doctors and resident migrants. The response to initial trials of the app has been positive, helping to identify areas for improvement in emergency responder operation and medical assessment and treatment. Letters of endorsement in support of the project have been received from government heads of agencies and migrant organisations.
Despite the positive feedback, the software has not been used in active emergencies to date and is not yet ‘industry ready’. It is vital that the app is totally reliable, accurate and thoroughly trialled before being deployed. Lives will depend on it.
With the collaboration between the Glamorgan Spring Bay SES and UTAS coming to an end, the project is being progressed by the not-for-profit Association for the Identification of Spoken Language. The Association’s managing committee of volunteers will oversee at least three more phases of development and is working to roll out the industry-ready app without fees or charges for an initial period to gain user feedback.
The Association is advocating for financial and technical support to continue the project. Despite its humble beginnings as a collaboration between a small Tasmanian country town’s SES unit and a team of graduate and undergraduate students, we believe this technology will save lives, reduce trauma, empower first responders and reassure those who are caught up in an emergency.
This project has an enormous future that could extend well beyond emergency responder use. Government agencies, the professions, industry and even the defence forces could make use of an app that quickly and reliably identifies unrecognised spoken languages and communicates with its speakers. It just needs a hand up.
To find out more contact Alan Gifford, President, Association for the Identification of Spoken Language. Phone 0447 250 945 or email algiff1942@gmail.com.