How Chatbots and Conversational AI are Improving Accessibility
When we think of Siri, we might think of playing music in the car or asking for the weather so we can dress appropriately. For some people, though, it serves a far more essential function than those everyday conveniences.
Jamison Hill battles chronic fatigue syndrome in his everyday life. Before becoming severely ill, he used Siri to simplify everyday tasks such as playing music or calling a family member. He is just one example of the many people with disabilities whose lives have improved through the advent of conversational artificial intelligence, or conversational AI for short.
Conversational AI and its Benefits
Conversational AI refers to technologies, such as virtual assistants and chatbots, with which users can engage in human-like dialogue. Many of these systems use natural language processing to analyze what a user says and generate a response, drawing on models trained on large datasets of human conversation. As more users interact with them, their responses and feedback can be analyzed to improve the system’s accuracy and capability.
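To make that pipeline concrete, here is a minimal sketch of the core loop behind a simple conversational system: classify the user’s intent from their words, then map that intent to a reply. The utterances, intent labels, and canned responses below are toy assumptions; a production assistant would learn from far larger conversation datasets.

```python
# Minimal sketch of a conversational loop: classify intent, then respond.
# The training data and responses here are tiny, illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_utterances = [
    "play some music", "put on my playlist",
    "what's the weather like", "will it rain today",
    "set the thermostat to 70", "make it warmer in here",
]
intents = ["play_music", "play_music", "weather", "weather",
           "thermostat", "thermostat"]

# TF-IDF features plus logistic regression stand in for the NLP model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(training_utterances, intents)

responses = {
    "play_music": "Playing your music.",
    "weather": "Here's today's forecast.",
    "thermostat": "Adjusting the temperature.",
}

user_input = "could you play a song for me"
predicted_intent = model.predict([user_input])[0]
print(responses[predicted_intent])
```

In a real assistant, the same idea scales up: the classifier is replaced by a large language model, and user feedback on its answers becomes new training signal.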
Virtual assistants and chatbots have brought numerous benefits to people with disabilities.
In a research study into the benefits of voice-controlled assistants, one source of independence was the ability to control smart appliances, such as the thermostat, through Amazon’s Alexa. One user described how, with Alexa’s help, his wife could complete household chores without assistance. Another participant explained that his elderly father, who has poor vision, could interact with these virtual assistants more easily because there were fewer commands to memorize. The benefits of these assistants also extend beyond people with physical disabilities.
Others noted that the conversational nature of these assistants could also help with speech therapy. One participant described how her daughter, who has a speech impediment, learned to speak more slowly and enunciate more clearly in order to communicate with Alexa. Another unexpected use the researchers found is that voice-controlled assistants can serve as a memory aid for people who face memory difficulties. For example, Alexa could tell a participant the time or read out their to-do list.
Even More Assistive Capabilities
Conversational AI also has far-reaching implications beyond improving everyday accessibility.
At Drexel University, researchers looked into how ChatGPT could support early detection of Alzheimer’s disease. They saw potential in ChatGPT because it can provide information without external knowledge, thanks to the extensive datasets it was trained on. After feeding it a set of transcripts to produce a profile of Alzheimer’s speech, the researchers found that ChatGPT identified Alzheimer’s speech samples better than leading natural language processing programs. They now hope to develop an easily accessible web-based screening tool from this work.
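The Drexel team’s exact pipeline isn’t reproduced here, but as a rough, hypothetical sketch of the general idea, one could prompt a large language model to compare a speech transcript against linguistic markers that research associates with Alzheimer’s speech. The model name, prompt wording, and transcript below are illustrative assumptions, and a tool like this would be a screening aid, not a diagnosis.

```python
# Rough sketch (not the Drexel researchers' actual pipeline): ask an LLM to
# rate a transcript against linguistic markers associated with Alzheimer's
# speech. Model choice, prompt, and transcript are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript = "Well... I was, uh, going to the... the thing you cook on. The, um..."

prompt = (
    "You are screening speech transcripts for linguistic markers that research "
    "associates with Alzheimer's disease, such as word-finding pauses, vague "
    "nouns, and repetition. Rate the following transcript as LOW, MEDIUM, or "
    "HIGH likelihood of showing such markers, and explain briefly.\n\n"
    f"Transcript: {transcript}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```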
Chatbots also have the potential to support people filing legal complaints by breaking down forms in a step-by-step, comprehensible way and ensuring that complaints comply with what commissions or government agencies look for when reviewing them. One user with a speech disorder who tested the Tech4Justice chatbot said he had previously struggled to formulate complaints, but the chatbot helped him articulate the most important components of his complaint.
Microsoft has also contributed to the conversation, no pun intended, by funding projects at the intersection of AI and accessibility. One large application area for chatbots is helping people with disabilities find employment and removing obstacles to their success in the workplace. One such Microsoft-funded project is Our Ability, which aims to connect people with disabilities with employment opportunities. Our Ability uses a chatbot to identify a job seeker’s skills and build a profile that matches them to openings on the market. Another project is an open-source conversational tool by Clover Technologies that supports people with Down syndrome or autism in manufacturing jobs, with applications beyond those areas.
Another large area for chatbots is increasing access to therapy by lowering costs and other barriers like location. For example, iWill CARE uses the cognitive behavioral therapy model in its online app to provide counseling to users who speak Hindi. Other applications in the field include Woebot, which runs in Facebook Messenger. Woebot provides tools like daily chat conversations and mood trackers based on the traditional model of talk therapy, and a study by its original researchers found that it reduced symptoms of depression and anxiety in a randomized trial of 70 college students.
However, this is not to say that conversational AI is a panacea for every obstacle faced by people with disabilities. Much work and further research is needed to address significant gaps.
Challenges and Opportunities
Despite the exciting advances of chatbots in mental health, many remain doubtful about the extent to which these applications can provide long-term support. Most research studies on these chatbots are either conducted on small samples, like the one for Woebot, or funded by the companies producing them. A study that pooled data from many of these mental health chatbots found potential for mental health support but could not establish a conclusive link. Furthermore, many of these chatbots operate without government oversight or rules, which has slowed their ability to provide more mental health support during the pandemic.
In the midst of an epidemic of loneliness among the elderly, AI social companions are seen as a potential solution. 81-year-old Deanna Dezem of Florida, who is both divorced and retired, found companionship in ElliQ, a robot installed on her kitchen countertop that looks more like a table lamp than a pet. ElliQ asks Deanna about her day, and Deanna can talk to ElliQ about anything from her tremors to her meditation class. However, ethicists find this attachment to an artificial companion troubling. Furthermore, users often assume these companions understand emotions better than they actually do, which can lead to inevitable disappointment.
Outside of chatbots, virtual assistants also have a long way to go in increasing accessibility. Reporter Char Adams noted that Siri had difficulty understanding her because of her stutter. For those with severe speech disorders, researchers found that word recognition rates could be anywhere from 26.2% to 81.8% lower than for the general population. Unfortunately, not much has been done to close these gaps, which excludes swaths of the population from easily using these virtual assistants.
Finally, much of the conversation so far has been limited to voice and writing, but how has conversational AI been applied to hand-gestural languages like American Sign Language (ASL)? Many research efforts are developing models trained on image datasets to recognize hand gestures. For example, researchers at Michigan State University trained their interpreter on a dataset of 7,000 sign language samples to interpret full ASL sentences. Applications like these could lead to the future inclusion of hand-gestural languages in virtual assistants.
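As a toy illustration of that general approach, and not the Michigan State system itself, the sketch below runs one training step of a small convolutional network that classifies hand-gesture images into sign labels. The image size, number of classes, and random stand-in data are all assumptions.

```python
# Toy illustration (not the MSU model): a small convolutional network that
# maps hand-gesture images to sign labels. Data and sizes are placeholders.
import torch
import torch.nn as nn

NUM_SIGNS = 26  # e.g., one class per ASL fingerspelling letter (assumption)

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, NUM_SIGNS),  # assumes 64x64 input images
)

# Stand-in batch of gesture images; a real project would load a labeled dataset.
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, NUM_SIGNS, (8,))

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

logits = model(images)          # forward pass
loss = criterion(logits, labels)
loss.backward()                 # backpropagate
optimizer.step()                # update weights
print(f"one training step done, loss = {loss.item():.3f}")
```

Interpreting full sentences, as in the MSU work, additionally requires modeling sequences of gestures over time rather than single frames.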
While virtual assistants and chatbots still have a long way to go in breaking down barriers for people with disabilities, the potential for new uses and further innovation in the field could greatly improve life for nearly everyone.