Voice assistants are notorious for misinterpreting speakers with regional accents (looking at you, exasperated Glaswegians). But because voice recognition algorithms are trained on libraries of standard pronunciations and speech patterns, people with speech or enunciation difficulties also struggle to access these technologies. And because they may have physical disabilities as well, these are often the very people voice assistants could help the most.
That includes many people with Down syndrome. “My daughter has a smaller mouth and a longer tongue. Picture talking with a marshmallow in your mouth,” explains Ed Casagrande, chair of the board of directors at the Canadian Down Syndrome Society. Out of the box, Google’s voice assistant misunderstands about every third word from an average speaker with Down syndrome.
“Project Understood,” a new initiative from CDSS, aims to improve Google’s algorithms by expanding its database of voices. Spots from FCB Canada follow Matt MacNeil, a Canadian with Down syndrome who works with CDSS, as he travels to Google headquarters in Mountain View, California, to work with Google engineers and product managers on refining the voice recognition tools.
Other people with Down syndrome can participate by “donating” their voices—recording test phrases and tongue twisters like “I owe you a yo-yo today” on the Project Understood website.
Previous efforts from FCB Canada for CDSS include the award-winning “Down Syndrome Answers” campaign, which let people with Down syndrome field questions online from worried parents, and “Anything But Sorry,” a PSA against using the “s-word.”