
Students develop assistive technologies

Undergraduate teams create helpful phone apps and devices for people with disabilities.
Undergraduates Marcus Lowe and Victoria Sun developed a vision-based touch screen interpretation system for a blind woman who previously had trouble using her coffee machine’s controls. (Photo: Marcus Lowe and Victoria Sun)

Lindsay, a 23-year-old Somerville resident and researcher at the Massachusetts Eye and Ear Infirmary, is a coffee lover. She has a Keurig brewer, which allows her to make a wide variety of hot and cold beverages at the touch of a button. However, Lindsay, who is blind, has trouble using the machine’s touch-screen controls.

Last fall, as part of 6.S196 (Principles and Practice of Assistive Technology), undergraduates Marcus Lowe and Victoria Sun teamed up with Lindsay in an effort to develop a piece of assistive technology that would help her use the many different features of her coffee maker.

The goal of PPAT is for small teams of students to work with clients in the Cambridge area to develop assistive technology — a device, piece of equipment, mobile application or other solution — that helps the client live more independently. The course is led by Electrical Engineering and Computer Science (EECS) Professor Seth Teller and co-taught by EECS Associate Professor Rob Miller, both principal investigators in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

“The class evolved naturally from my lab's research efforts to develop various types of assistive technology for people with disabilities, such as self-driving wheelchairs for people with MS, ALS, brain or spinal cord injuries, and wearable machine vision systems for blind and visually impaired people,” Teller says. “Given Professor Miller’s focus on human-computer interaction, and the fact that both of our labs attract UROPs (students in MIT’s Undergraduate Research Opportunities Program) interested in the human side of technology, it was a natural step to create an undergraduate subject focusing on this area.”

At the close of the fall term, PPAT students presented the culmination of their efforts to classmates and others, demonstrating a variety of new assistive technologies: accessible touch- and speech-based nurse calls for a client with MS; augmented caregiver access and E911 capability for a client with ALS; accessible tablet control of an adjustable bed for a client with MS; and a vibrating bracelet to notify a blind and hearing-impaired client of incoming calls on her mobile phone.

During the final presentations on Dec. 5, Lowe and Sun demonstrated the technology they had built to help Lindsay take full advantage of her coffee machine: a vision-based touch screen interpretation system for her iPhone. As a frequent user of assistive technologies like VoiceOver, a screen reader built into various Apple devices, Lindsay was already comfortable using the iPhone, so the students designed their system to work in tandem with the technology she already uses on her phone.

The system Lowe and Sun developed features a stationary stand, where Lindsay can secure her iPhone above the touch screen that controls her coffee maker. To make a cup of coffee, her phone takes a picture of the touch screen and asks Lindsay what she would like to drink. The application then gives Lindsay oral guidance about how to use the touch screen to make exactly the type of drink she wants. Using a grid made with thin strips of laminate to guide her fingers, Lindsay can navigate the screen, pressing the correct buttons. The application then reviews and confirms her selection.
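The article does not describe the app's implementation, but the interaction it outlines (photograph the screen, identify its state, speak step-by-step guidance, confirm the result) can be sketched as a simple loop. Below is a minimal sketch in Python; every function name, screen state, and button position in it is a hypothetical stand-in, not the students' actual design.

```python
# Hypothetical sketch of the guidance loop described above; the students'
# actual app, its vision pipeline, and its VoiceOver integration are not
# detailed in the article. All names and screen states are illustrative.

SCREEN_GUIDES = {
    # recognized screen state -> spoken instruction, keyed by drink choice
    "home": {
        "hot coffee": "Press the top-left button to open the brew menu.",
        "iced tea": "Press the top-right button to open the cold-drinks menu.",
    },
    "brew_menu": {
        "hot coffee": "Press the center button to start brewing.",
    },
}

def recognize_screen(photo: bytes) -> str:
    """Stand-in for the vision step: classify which menu the coffee
    maker's touch screen is currently showing from a photo."""
    return "home"  # placeholder classification

def speak(text: str) -> None:
    """Stand-in for speech output; the real app works alongside VoiceOver."""
    print("[spoken]", text)

def guide(drink: str, photo: bytes) -> None:
    state = recognize_screen(photo)
    instruction = SCREEN_GUIDES.get(state, {}).get(drink)
    if instruction is None:
        speak("I don't recognize this screen. Please reposition the phone.")
        return
    speak(f"The screen shows the {state.replace('_', ' ')} menu.")
    speak(instruction)
    # Repeating instructions on request lets the user verify each step,
    # a feature the article notes below.
    speak("Say 'repeat' to hear that instruction again.")

guide("hot coffee", b"")  # simulated photo capture
```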

“They did a great job,” Lindsay says of Lowe and Sun’s work. “One of the upsides to their solution is the ability to confirm the current state of the coffee maker screen. Now I know and can confirm what kind of drink I am going to make.”

Lowe and Sun put particular effort into enabling Lindsay to monitor the system’s operation by designing the program to repeat relevant commands as needed. Eventually, if she wishes, Lindsay can memorize the sequences and forgo the assistive technology altogether.

For their project, undergraduates Priya Saha and Veronica Newlin worked with Haben, a third-year student at Harvard Law School who is blind and partially hearing impaired. Like Lindsay, Haben uses a range of assistive technologies, including VoiceOver. Haben asked Saha and Newlin to develop a way of notifying her of incoming calls and text messages through a discreet vibrating bracelet. Because she can neither hear her iPhone ring nor see its screen, Haben’s strategy had been to check the phone every hour for calls and texts, which meant she missed about 90 percent of all incoming calls.

Saha and Newlin built a custom metal bracelet that resembles a piece of jewelry and houses a small motor, which vibrates for four seconds to notify Haben of an incoming call. The team used a Bluetooth connection to link the iPhone to the bracelet, and designed a motor driver specifically to serve as the bracelet’s notification system.
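The article specifies only the bracelet's behavior, a four-second vibration per incoming call, not its firmware. A minimal simulation of that notification logic, with the Bluetooth listener and motor interface stubbed out as assumptions, might look like this:

```python
import time

# Illustrative sketch of the bracelet's notification behavior: vibrate for
# four seconds whenever an incoming-call event arrives over Bluetooth. The
# event source and motor interface below are simulated stand-ins, not the
# students' actual custom hardware design.

VIBRATION_SECONDS = 4

class MotorDriver:
    """Stand-in for the custom motor driver housed in the bracelet."""
    def on(self) -> None:
        print("motor: vibrating")
    def off(self) -> None:
        print("motor: stopped")

def call_events():
    """Stand-in for the Bluetooth link to the iPhone; yields one item
    per incoming-call notification."""
    yield "incoming call"  # a single simulated event

motor = MotorDriver()
for _event in call_events():
    motor.on()                     # start the vibration motor
    time.sleep(VIBRATION_SECONDS)  # vibrate for four seconds
    motor.off()                    # stop until the next notification
```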

While the team had trouble notifying Haben of incoming text messages, owing to restrictions on accessing Apple’s messaging center, they estimate that their system successfully signals between 90 and 100 percent of incoming calls.

Teller and Miller say they were impressed with Saha and Newlin’s work, in particular their resourcefulness in solving technical aspects of Haben’s problem, and how well they got to know their user, which resulted in a specialized product that suited Haben’s needs.
