Accessibility Through the Ages – A Technological Journey!

The image is a cartoon graphic showing two hands typing on a keyboard, and how technology has developed so that this keyboard can connect to a variety of online platforms like shopping, banking and entertainment. The keyboard is connected to all these things through lines, like a flowchart.

Not so long ago I worked in the NDIS sector. I really loved my team there, pity about the manager. It only takes one to spoil the apple cart. I had to do many NDIS plans. I liked doing them. I liked meeting the clients. I like to think I was pretty good at them too. Of course, being deaf, communication could be problematic. For these meetings I always booked Live Remote Captioning. I could have chosen Auslan interpreters, but my reasoning for captioning was that the clients were already in a vulnerable position. They were disclosing personal information. The person doing the captioning was invisible, because they delivered it remotely and online. Having one less person in the room was my way of making it more comfortable for the client.

I am well aware that I am privileged here. I have good speech (for a DEAF GUY ANYWAY :-D). That meant that I could speak, use the captioning and carry out the meeting. The added benefit was that I would get a full transcript of the meeting. This helped me in the final writing of the plan and ensured I had no excuse for missing anything.

It is interesting because when I began my career, captioning was not available. Interpreters were barely available. I remember Barb, the accountant, was a CODA. She interpreted for me. We made hundreds of cold canvassing calls to get Deaf people into work. John was a CODA too. He managed the fledgling interpreter service. I would grab him when I could. Vanessa shared an office with me; sometimes she would interpret too. The year was 1989.

Interpreting grew over time. I was one of the first in South Australia to receive interpreting paid for by the University. I had Barb, Heather and Karin. All wonderful interpreters. I remember having to fight my social work lecturer to keep them in the class. He reckoned they had no place in his class cos in real life, I wouldn’t be able to work with interpreters. How wrong he was. I wish I could see him now and give him the bird. It would be so satisfying.

So, I graduated and started work. In truth, I worked and studied. In those early days the phone was always a problem. I would apply for jobs; they would ask me about the phone. I would have to confess that I couldn’t, and try to sell the idea of job trading. You know, I’d do extra paperwork, someone would do my calls. Often, I lost out on jobs because I couldn’t take incoming calls. It was a hard slog.

Then in 1994 the National Relay Service came. Finally, I could use the phone, albeit through a third person. They could call me too. I had to train them how, but most were accommodating. I had a TTY (this was before the days of JobAccess; the employer had to buy it). I had a flashing light too so that I knew when my phone was ringing. These were heady days. Ordering pizza, making dates with girls I had met in the pub and actually using the phone at work. WOOOOAHHHH!

Then it was the mobile phone. By golly wasn’t that brilliant. I remember at the start you could only send an SMS to people who had the same carrier, Telstra to Telstra, Optus to Optus etc. If memory serves me right, Robert Adam started a campaign to get carriers to allow text to different carriers.

Before that I had a Hutchinsons pager. I could receive messages but not send any. Now I could actually communicate with someone directly, no third person. OOOOH!! It was addictive. We sent thousands of texts. Blimey, it was expensive at 25 cents a text. I know many a deaf person who got in heaps of financial strife cos they went batty with texts. But that changed over time to unlimited texts. And rightly so, given that a text at the time cost something like .00000000000017 cents.

Independence grew every year as technology improved. Opportunities for deaf people improved with it. Email was the next big step. You could cut out the phone altogether and just email. I say could, but hearing professionals love talking on the phone. When they could email, they would call. When they could text, they would call. Getting them to change a habit of a lifetime is harder than taking a bone from a determined dog, and almost as dangerous.

In 2006 we had another breakthrough. The Howard Government introduced Auslan for Employment. It was a shabby policy at the time: just $5000, once off, and then the employer was expected to pay. So we campaigned hard to change this. Luckily for us, Howard got kicked out on his sorry arse (I’ve never forgiven him for Tampa). Rudd came in and Shorten was his Parliamentary Secretary for Disability. Soon enough Auslan for Employment was an annual $6000 for everyone.

It has not increased since. It’s still shabby policy because it doesn’t consider different needs, demands, regions, duties etc – BUT – it opened doors and opportunities for many Deaf professionals in that it allowed them to participate more in the workplace, be involved actively with meetings, training and so on. For all of its faults, Auslan for Employment has been a life saver for many Deaf people.

Then there was real-time captioning. Then Live Remote Captioning. I remember attending the University of Melbourne, where Matthew Brett demonstrated how it could work for students. I was on the Deafness Forum Board and we had a conference. AI Media demonstrated how it worked. Way back in 2004 or 2005, the University of the Sunshine Coast was flogging an IBM system where lectures were all recorded and converted to text by voice recognition. An admin person would correct any errors and within 24 hours they would place the recordings of lectures, with the corrected captioning, online.

It was a great time to be hard of hearing because suddenly the access needs of this group were being considered. Whether we like it or not, nearly all people with any kind of deafness do not sign. Only a very small percentage of the vast population of people who have a “hearing loss” actually use Auslan. But somehow, until that time, Auslan had been the primary focus of access. Not that this is wrong; rather, there were many people who were missing out and very little was being said about their needs.

So, we went from a time when the Australian Caption Centre was really the only organisation focusing on captioning to a time when there were multiple providers of real-time captioning, both live and remote. We have Bradley Reporting, Captioning Studio, Red Bee Media and AI Media. In time Auslan for Employment changed to allow payment for captioning too. The playing field began to even out.

As Australia’s internet improved, these services began to go online. Video Relay Interpreting started. Captioning was delivered through different platforms: GoToMeeting, Skype and so on. You could basically get access anywhere and anytime. Deaf and hard of hearing people live in Utopia compared with when I began my career.

For many of us it’s always been a far-fetched dream for there to be voice recognition technology available whenever it was needed. You could meet friends in the pub, turn it on, and it would allow you to follow conversations. You could receive a phone call, turn it on and know what the person at the other end was saying. This was the dream of voice-to-text technology.

I used to scoff at this. I used to think that there was no way it would become common. I mean, how was it going to distinguish accents, homophones and so on? I was a sceptic and felt that it was always going to be limited. Indeed, this was always a weakness of such technology. If you had a cold, for example, the technology would struggle to understand you.

Then one day in 2018, while I was working for the NDIA, Sarah sent an email. Deaf people in the NDIA had a sort of information-sharing thing going on. Sarah encouraged people to try Live Transcribe, an Android-based voice-to-text app. She said it was surprisingly accurate and that she had been able to participate reasonably well in a meeting when she was unable to book interpreters. So, try it I did, and I was gobsmacked. Sure, it made errors, but it could pick up everyone and it was surprisingly accurate. I began to use it regularly for impromptu meetings, one-on-ones and so on. I have not stopped since.

The rapid development of this sort of technology has been outstanding. It has allowed me to communicate more independently, without the need for a third person. It makes errors that are really quite funny: calling a USEP Partner a New Sex Partner, for example. When the pandemic started, I would set the technology up on a tablet near the computer and it would transcribe Zoom meetings really accurately. My colleagues and I began to develop protocols so that I could participate better: no one talking over each other, using the hand-raising feature and so on. Teams and Google Meet have really accurate auto-captioning built into their systems too.

To give the reader an idea of the change this has made to my work life, consider this. In 2017, while I was working as a Senior Local Area Coordinator, the bill for two deaf people to use interpreters and captioning, just for nine months, was $84,000. Last year I didn’t even finish my $6000 Auslan for Employment money, so good is this technology now. I am well aware I am privileged because I speak well, but the change and the independence it has provided me has been life changing.

Live Transcribe is a free app on Android. Currently, my technology of choice is Microsoft Group Transcribe. This is an Apple-based app, and it’s free too. It’s even better than Live Transcribe. I used to try Live Transcribe to see if it would transcribe the TV or someone talking on my phone. It didn’t work well. But Microsoft Group Transcribe does. I use it when a TV show doesn’t have captions, which is rare now. I also use it to make voice phone calls: I put my phone on speaker, turn on the app on my iPad and away I go. Just today I arranged my home insurance and booked a restaurant doing just that.

And now we have Convo Australia. Interpreters at the ready, anywhere and anytime. You pay a monthly subscription and if you need an interpreter, Convo Australia are there at the ready on your phone. What that will do in terms of the supply of interpreters for those who need them face to face I do not know. I do wonder if it will place stress on a system that already cannot meet demand. Who knows? Time will tell.

It’s incredible, isn’t it?  From those days when I had to hope that Barb was available to make a call, I can now make calls independently. From those days where only the lucky could get interpreters for their work, they are now commonplace. From the days when we could communicate independently through the NRS, we can now do away with the third person. From the days when we had to book interpreters well in advance, they are now available at the ready on our mobiles.

Life’s good! – Enjoy it until the next amazing development in technology comes along.

3 thoughts on “Accessibility Through the Ages – A Technological Journey!”

  1. You have opened the eyes of someone who hasn’t spent a lot of time with the hearing impaired. The journey from the ’80s to where we are now with technology, how long it took to get there, your frustrations etc. It was good to sit back and reflect that all this new stuff to help people communicate is really quite new. Thanks Gary!

  2. Great article Gary! We were lucky to have Barb, Heather and John in early years.

    I often used Google’s "Live Transcribe" app, installed on my mobile phone, during ad-hoc team and one-on-one meetings when I was working in the NDIA office for two years. I felt the "Live Transcribe" app’s accuracy rate was around 70%, which is still very good, far better than lipreading or even writing on a notepad to communicate with hearing people. I approached you recently and you gave me advice to use Microsoft Group Transcribe (Teams), installed on my iPad (obtained via JobAccess). That stuff is so bloody great; I use it almost every day in my new role in the office. On the accuracy of Teams’ captioning technology, I would say Teams is one of the best captioning apps I have ever seen. It also helps to reduce my assumptions when the Teams app (iPad) sits on my desk listening to everyone talking in the office.

    Just last week, Microsoft announced it was joining the "Metaverse" race with an app called "Mesh for Teams", which allows people to meet in virtual meetings. I have seen video of this; oh man, just google it! I have a dream that I could set the "communication language mode" to Auslan in virtual meetings, or even when meeting people in the VR world via "Mesh for Teams". In my dream, this could mean that if a (hearing) person cannot sign in Auslan mode, a holographic interpreter appears above people’s heads in the virtual meeting or VR world automatically, for better readability and communication engagement. I think this could happen very soon, you never know 🙂 We are so lucky to live in the Golden Age of information.
