
Who remembers WALL-E? That lovable robot from the 2008 movie of the same name. WALL-E had been designed as a sort of waste disposal unit, but he developed feelings and emotions and fell in love with EVE, the Extraterrestrial Vegetation Evaluator. They had an electric kiss and more or less saved the world. Artificial intelligence (AI) at its best, just what the world needs. Or does it?
Over in America, Waymo offers fully autonomous, driver-less taxis. Now, driver-less cars offer heaps of potential to the disability community. If you are blind or have a physical disability, especially in Australia, you will know the frustrations of trying to get a cab, particularly an accessible cab. Driver-less cars offer the potential of enormous independence for many people with a disability. But on the other hand, if driver-less taxis become the norm in Australia, how many people are going to lose their jobs? https://waymo.com/
No taxi drivers. In the gig economy, what’s going to happen to Uber drivers? Potentially, the ever-popular food delivery could be done by driver-less units. You get a text: “Your meal has arrived, please come and collect your delivery.” Thousands of university students are gonna miss out on income. Worse, imagine those people who can’t come out and get their delivery because of physical issues. If experience tells me anything, no one will think about disability until one day someone pipes up with a question – WHAT ABOUT ME??? By then, undoing what has already been developed will cost millions, if not billions, of dollars.
I love artificial intelligence. It gives me access every day. Years ago, we used to laugh at the possibility of automatic captioning. It could never be done, we said. Too many words sound the same; it will never know which one to use. Or accents – come on now, no software is gonna be able to decipher the millions of accents. Is it? How is it gonna recognise all the languages of the world? Who remembers the hilarious early YouTube automatic captioning? So many of us, including me, dismissed the possibility of there ever being a usable form of automatic captioning, EVER.
And here we are today. I no longer book Auslan interpreters or captioners because the technology is so reliable. Sure, it makes some errors, but it’s easy to work out the phonetics most times. At worst, I just have to ask people to occasionally repeat what they have said. I use it for the doctor. Hell, when I had my hip replaced, I carried my iPad all the way to the operating theatre and communicated with everyone until they knocked me out.
I meet clients, have meetings with colleagues, make phone calls and even listen to podcasts. It’s been a boon for me. Yes, I know I have the privilege of understandable speech and it is not for everyone. But the independence it has given me has been enormous. I’m only sad that I had to wait until I was 57 years old to get access to it.
Now people are trying to program sign language into AI as well. We all poo-poo this too. Can’t be done, we say. What about the right facial expressions, space, nuance and so on? I’m sceptical too, but given what they have done with captioning, it wouldn’t surprise me if they develop something more than usable with sign language in the not-too-distant future.
Imagine going to the bank with a virtual interpreter on your iPad, ready to discuss a home loan. And when you sign, a sensor picks it up and voices back what you say. Farfetched? Maybe, but the way artificial intelligence is developing today, I wouldn’t totally dismiss it.
I probably sound more enthusiastic than I actually am. Yes, I love my automatic captioning, but it has come at a cost. Captioners and people in the captioning industry are losing their jobs. People who are friends of mine have had to change careers. Companies that I once relied on are struggling. The technology is great, but it sometimes comes at a great human cost.
Imagine the same scenario with sign language. Interpreters out of work. No longer will there be a need to get the community to learn sign language for inclusion; the trusty iPad will do it all. TAFE courses close. The fewer people who use sign language, the more difficult it will be to keep sign language alive. As native signers die out, who will be looking after the language when it becomes fully automated? It is scary and more than a little bit dangerous.
Even now, artificial intelligence is leading to exclusion. I’ve written before of how companies now use it for recruitment. Interviews are done virtually, and often at short notice. You have to hear, you have to speak, you have to see. Often, if you are neurodiverse, the format of virtual recruitment is inaccessible too. And many developers don’t care! It is all about rapid roll-out and earning mega bucks!
But there are some companies that do care. Microsoft and Apple are pretty good at thinking of accessibility. Teams, as an example, has really good automatic captioning. Teams also has the ability to permanently spotlight all people who sign so they are easily seen and understood. Apple, apparently, also has programs to provide support in Auslan and other countries’ sign languages. Thinking ahead, including all people and thinking about the groups that will be excluded when developing anything new is always the way to go. Sadly, it doesn’t happen enough, and consequently people with a disability are always playing catch-up.
The Government is worried too. For all its benefits artificial intelligence has the potential to be abused and exploited. There are concerns about privacy and misinformation. Already copyright is being breached as developers “steal” the work of others to develop the data for their programs.
Recently, Victorian MP Georgie Purcell had an image of her digitally altered by AI. The picture was altered so that a dress she was wearing exposed her midriff and enlarged her breasts. She quite rightly complained, and Channel 9 was forced to apologise. Even now, anyone with online access to AI can ask it to generate an image for them. It is scary, and it is for these reasons that governments the world over are scrambling to regulate the use of AI.
Nevertheless, used properly and ethically, AI has the potential to open up the world and make it more inclusive for people with a disability. The problem is that too few developers outside the huge multinationals are thinking about people with a disability as they develop AI, and as a result they are excluding them in a big way.
Here’s hoping that the push to regulate AI development considers at length the needs of people with a disability. Used properly, AI could make the world a better place for people with a disability. We just have to make sure we get in early and that our voices are heard.
