Slide to unlock
Today’s touch-screen world is leaving behind an entire segment of the population: those with motor impairments. iSchool Ph.D. candidate Martez Mott wants to change that.
We live in a touch-screen world. One where tiny computers full of information live in our pockets. Where boarding passes are printed at touch-screen kiosks and groceries are paid for at touch-screen card readers. A world where extending a finger and making contact with a particular point gets us what we want.
Well, except for the millions of us across the world who experience motor impairments — the millions of us who are being left behind.
“I’m pretty sure if you went around and asked, ‘Hey, what do you think a touch is?’, you might get different answers, but it would probably converge around something that looks like this,” says Martez Mott, raising his pointer finger before tapping it on the table in front of him. “For the participants in my studies, that was not the case at all.”
Mott is a Ph.D. candidate in the UW’s Information School and an active member of the philanthropy-funded MAD (Mobile + Accessible Design) Lab and the DUB (Design. Use. Build.) Group, where his research centers on making touch screens accessible for people who live with motor impairments, from cerebral palsy to muscular dystrophy to Parkinson’s disease.
“Touch screens were developed for a certain type of use,” says Mott, who received financial support through the University of Washington Graduate Opportunity Award. It’s simple: If you want to interact with something on your device, you have to be able to extend a finger and touch it cleanly and accurately. “That’s it,” he says. “Everything else is relegated to some other type of functionality — if you touch your phone with two fingers, it thinks you’re trying to zoom.”
For those who live with cerebral palsy, however, touch might look like patting the screen with the back of the hand, or dragging a palm or a fist. “When that happens, the screen picks up a big blobby area,” says Mott, “and the system has no idea what to do with it.”
Yet.
For those who live with cerebral palsy, touch might look like patting the screen with the back of the hand, or dragging a palm or a fist.
What happens if you can’t touch your device with a single finger?
When a single finger makes contact with the screen, the device registers the center of that point as the user’s intended touch point.
When the back or side of the hand makes multiple points of contact with the screen, the device can’t register which point is the user’s intended touch point.
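The single-point behavior described above can be sketched in a few lines. This is a minimal illustration, not the Smart Touch implementation: a standard touch system averages all contact coordinates into one reported point, which works for a fingertip but becomes ambiguous when a palm or the side of a hand creates many contacts at once. The function name and coordinate values here are hypothetical.

```python
def registered_touch_point(contacts):
    """Reduce a set of (x, y) contact coordinates to one reported point
    by averaging them -- the naive behavior a touch screen assumes."""
    if not contacts:
        raise ValueError("no contact detected")
    xs = [x for x, _ in contacts]
    ys = [y for _, y in contacts]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A single fingertip: one contact, so the reported point is the target.
print(registered_touch_point([(120, 340)]))

# The side of a hand: many contacts collapse to one centroid, which may
# land nowhere near the point the user actually intended to press.
print(registered_touch_point([(80, 300), (95, 320), (140, 360), (160, 390)]))
```

The second call returns (118.75, 342.5), a point somewhere in the middle of the "big blobby area" Mott describes; nothing in the averaged result says which contact, if any, reflects the user's intent.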
Calibration software for touch-screen accessibility
Imagine a system that adapts to touch — whatever that might look like. Enter: Smart Touch. The project is a work in progress, constantly being tested and tuned by Mott and a small team of researchers, including his advisor, Associate Professor Jacob Wobbrock.
The objective? To get touch-screen access to a level where anybody can pick up a device and, with a simple calibration procedure, be able to use it right out of the box without needing additional specialized technologies.
Think of it as the counterpart to automated speech recognition, where the device asks you to repeat a few phrases so it can pick up on your unique speech patterns. In Mott’s model, the device asks you to touch a couple of targets — a balloon or a bull’s-eye, for example — so it can pick up on your unique touch patterns and adjust how it responds.
And it’s all guided by Mott’s participants, who give him real-life, real-time feedback.
Meet Martez Mott
The device asks you to touch a couple targets — a balloon or a bull’s-eye, for example — so it can pick up on your unique touch patterns and adjust how it responds.
With real-life users
Two years ago, at the start of the project, Mott lugged a huge, table-sized Microsoft Surface into Provail: a local center that helps people with disabilities pursue the lives they want to live. Among the nonprofit’s many offerings? Computer classes — and lots of motivated learners willing to participate in his study.
There, Mott worked with eight volunteers to make Smart Touch a reality, beginning by asking them to hit targets on the screen. “I had a wide range of participants, so I saw a wide range of strategies used to complete the tasks,” he says. The first person would prop one hand on the tablet for support while dragging the other hand across the screen. The next person would interact with the device in an entirely different way. This, says Mott, is why inclusive innovation — actually looping the software’s target population into the design process — is so important.
“When people are motivated to use a technology, they’ll employ a lot of different strategies in order for that to become possible. It was difficult to get an understanding of what ‘touch’ looks like,” he says. “People used the software in ways I couldn’t have predicted, which resulted in a lot of failure — and fixing — for me.”
Martez Mott works on Smart Touch with Provail participant Ken Frye
Ken Frye, a Provail participant in Mott’s study who has cerebral palsy, has been in the radio industry for over 40 years, with an impressive career that includes gigs at Seattle’s very own KEXP and Centralia’s Live 95.1. Today, he hosts his own radio show "Ken’s Ten," which features music from the ’20s to today “because all music can and should live together” — but that doesn’t come without its challenges. Watch to learn how Smart Touch could transform the way Frye navigates the world.
Failure is part of the learning process, and now — “after a lot of learning,” says Mott, laughing — he has a system up and running. And as he creates newer versions of Smart Touch over the next few years, retooling algorithms to improve accuracy and creating better designs for performance gains, the iterations will only get better.
“The hope is that you could buy a device, turn on an accessibility setting, and then go through a calibration procedure for better touch performance so you can use your phone or tablet however you like,” says Mott, who’s slated to wrap up his dissertation by 2019.
“This research isn’t just a theoretical contribution; it’s a practical one,” he says. “Hearing my participants tell me how important and impactful this work can be in their daily lives is the most rewarding thing. It’s what drives me.”
Ken Frye interacts with a touch-screen device
And as far as rolling it into the market? That’s a few years away, but Mott has a vision — starting with buy-in from the tech industry. “I would love to see every touch-enabled device — public or private — be accessible, with this technology integrated into every operating system, quietly running in the background. I want to see a check-out kiosk at the grocery store embedded with intelligence that can infer people’s behaviors based on their unique touch,” he says. “We can do it. We just haven’t done it yet.”
Ken Frye chats with KEXP DJ Cheryl Waters
“My hope for technology is that somebody like me can be clearly understood. Having a device I could use would mean freedom.”
Ken Frye
Originally published April 2017
Sign up for our monthly newsletter to discover the latest on new innovations, useful tips and information, and eye-opening stories coming out of the UW community.