
Smart glove teaches new physical skills

An adaptive smart glove from MIT CSAIL researchers can send tactile feedback to teach users new skills, guide robots toward more precise manipulation, and help train surgeons and pilots.
Caption: A digitally embroidered smart glove developed at MIT can assist with piano lessons and human-robot teleoperation with the help of a machine-learning agent that adapts to how different users react to touch.
Credits: Image: Alex Shipps/MIT CSAIL

You’ve likely met someone who identifies as a visual or auditory learner, but others absorb knowledge through a different modality: touch. Being able to understand tactile interactions is especially important for tasks such as learning delicate surgeries and playing musical instruments, but unlike video and audio, touch is difficult to record and transfer.

To tackle this challenge, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and elsewhere developed an embroidered smart glove that can capture, reproduce, and relay touch-based instructions. To complement the wearable device, the team also developed a simple machine-learning agent that adapts to how different users react to tactile feedback, optimizing their experience. The new system could help teach people physical skills, improve responsive robot teleoperation, and assist with training in virtual reality.

An open-access paper describing the work was published in Nature Communications on Jan. 29.

Will I be able to play the piano?

To create their smart glove, the researchers used a digital embroidery machine to seamlessly embed tactile sensors and haptic actuators (devices that provide touch-based feedback) into textiles. This technology is present in smartphones, where haptic responses are triggered by tapping on the touch screen. For example, if you press down on an iPhone app, you’ll feel a slight vibration coming from that specific part of your screen. In the same way, the new adaptive wearable sends feedback to different parts of your hand to indicate the optimal motions for executing different skills.

The smart glove could teach users how to play the piano, for instance. In a demonstration, an expert recorded a simple tune over a section of keys, using the smart glove to capture the sequence in which their fingers pressed the keyboard. A machine-learning agent then converted that sequence into haptic feedback, which was fed into students’ gloves as instructions to follow. With their hands hovering over that same section of keys, the students felt actuators vibrate on the fingers corresponding to the keys below. The pipeline optimizes these cues for each user, accounting for the subjective nature of touch interactions.
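To make the idea concrete, here is a minimal Python sketch of how a recorded key-press sequence might be replayed as timed, per-finger vibration cues. The KeyPress record, the GloveDriver interface, and the fixed key-to-finger mapping are illustrative assumptions, not the glove's actual software.

```python
# Illustrative sketch (not the authors' code): replaying an expert's recorded
# key presses as timed vibration cues on a student's glove, assuming a simple
# glove driver that can vibrate one finger at a time.
from dataclasses import dataclass
import time

@dataclass
class KeyPress:
    key: int          # index of the piano key within the practiced section
    finger: int       # 0 = thumb ... 4 = pinky, as recorded from the expert
    timestamp: float  # seconds from the start of the recording

class GloveDriver:
    """Hypothetical interface to the glove's haptic actuators."""
    def vibrate(self, finger: int, intensity: float, duration: float) -> None:
        print(f"finger {finger}: intensity={intensity:.2f}, duration={duration:.2f}s")

def replay_as_haptics(recording: list[KeyPress], glove: GloveDriver,
                      intensity: float = 0.8, cue_duration: float = 0.15) -> None:
    """Replay the expert's recording as vibration cues, preserving its timing."""
    start = recording[0].timestamp
    t0 = time.monotonic()
    for press in recording:
        # Wait until this cue is due relative to the start of playback.
        delay = (press.timestamp - start) - (time.monotonic() - t0)
        if delay > 0:
            time.sleep(delay)
        glove.vibrate(press.finger, intensity, cue_duration)

# Example: a three-note phrase recorded from the expert
demo = [KeyPress(0, 0, 0.0), KeyPress(2, 2, 0.5), KeyPress(4, 4, 1.0)]
replay_as_haptics(demo, GloveDriver())
```

In the actual system, the cue intensities would additionally be tuned per user by the machine-learning agent described below, rather than fixed as in this sketch.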

“Humans engage in a wide variety of tasks by constantly interacting with the world around them,” says Yiyue Luo MS ’20, lead author of the paper, PhD student in MIT’s Department of Electrical Engineering and Computer Science (EECS), and CSAIL affiliate. “We don’t usually share these physical interactions with others. Instead, we often learn by observing their movements, like with piano-playing and dance routines.

“The main challenge in relaying tactile interactions is that everyone perceives haptic feedback differently,” adds Luo. “This roadblock inspired us to develop a machine-learning agent that learns to generate adaptive haptics for individuals’ gloves, introducing them to a more hands-on approach to learning optimal motion.”

The wearable system is customized to fit the specifications of a user’s hand via a digital fabrication method. A computer produces a cutout based on an individual’s hand measurements, then an embroidery machine stitches in the sensors and haptic actuators. Within 10 minutes, the soft, fabric-based wearable is ready to wear. Initially trained on 12 users’ haptic responses, its adaptive machine-learning model needs only 15 seconds of new user data to personalize feedback.
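As a rough illustration of what a quick personalization pass could look like, the sketch below adjusts a per-user haptic gain from a handful of calibration responses. The calibrate_gain function and its thresholds are hypothetical stand-ins, far simpler than the paper's learning agent, which is pre-trained on prior users' haptic responses.

```python
# Illustrative sketch (an assumption, not the paper's model): nudging a per-user
# intensity gain up or down based on whether the user reacted to each cue during
# a brief (~15-second) calibration pass.
def calibrate_gain(responses: list[tuple[float, bool]],
                   base_gain: float = 1.0,
                   step: float = 0.05,
                   min_gain: float = 0.5,
                   max_gain: float = 2.0) -> float:
    """responses: (cue intensity, whether the user reacted) pairs from calibration."""
    gain = base_gain
    for intensity, reacted in responses:
        if reacted:
            gain -= step * intensity   # cues are clearly felt: soften them slightly
        else:
            gain += step               # cues are being missed: strengthen them
        gain = max(min_gain, min(max_gain, gain))
    return gain

# Example: the new user missed two of five calibration cues
print(calibrate_gain([(0.8, True), (0.6, False), (0.8, True), (0.5, False), (0.7, True)]))
```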

In two other experiments, tactile directions with time-sensitive feedback were transferred to users wearing the gloves while they played laptop games. In a rhythm game, players learned to follow a narrow, winding path to bump into a goal area, and in a racing game, drivers collected coins and kept their vehicle balanced on the way to the finish line. Luo’s team found that participants earned the highest game scores with optimized haptics, compared with no haptics or unoptimized haptics.

“This work is the first step to building personalized AI agents that continuously capture data about the user and the environment,” says senior author Wojciech Matusik, MIT professor of electrical engineering and computer science and head of the Computational Design and Fabrication Group within CSAIL. “These agents then assist them in performing complex tasks, learning new skills, and promoting better behaviors.”

Bringing a lifelike experience to electronic settings

In robotic teleoperation, the researchers found that their gloves could transfer force sensations to robotic arms, helping them complete more delicate grasping tasks. “It’s kind of like trying to teach a robot to behave like a human,” says Luo. In one instance, the MIT team used human teleoperators to teach a robot how to secure different types of bread without deforming them. By teaching optimal grasping, humans could precisely control the robotic systems in environments like manufacturing, where these machines could collaborate more safely and effectively with their operators.
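As an illustration of the kind of mapping involved, the sketch below converts normalized fingertip pressure readings from a glove into a clamped gripper force command. The function, its scaling, and its force limit are hypothetical and much simpler than the team's teleoperation pipeline.

```python
# Illustrative sketch (hypothetical interface, not the team's stack): mapping the
# glove's fingertip pressure readings to a gripper force command, with a ceiling
# so delicate objects such as bread are not crushed during teleoperation.
def pressure_to_grip_force(fingertip_pressures: list[float],
                           force_scale: float = 5.0,
                           max_force: float = 2.5) -> float:
    """fingertip_pressures: normalized sensor readings in [0, 1], one per finger.
    Returns a clamped force command, in newtons, for a parallel-jaw gripper."""
    avg_pressure = sum(fingertip_pressures) / len(fingertip_pressures)
    return min(force_scale * avg_pressure, max_force)

# Example: a light pinch recorded from the teleoperator's glove
print(pressure_to_grip_force([0.2, 0.25, 0.1, 0.0, 0.0]))  # ~0.55 N, well under the cap
```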

“The technology powering the embroidered smart glove is an important innovation for robots,” says Daniela Rus, the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT, CSAIL director, and author on the paper. “With its ability to capture tactile interactions at high resolution, akin to human skin, this sensor enables robots to perceive the world through touch. The seamless integration of tactile sensors into textiles bridges the divide between physical actions and digital feedback, offering vast potential in responsive robot teleoperation and immersive virtual reality training.”

Likewise, the interface could create more immersive experiences in virtual reality. Wearing smart gloves would add tactile sensations to digital environments in video games, where gamers could feel around their surroundings to avoid obstacles. Additionally, the interface would provide a more personalized and touch-based experience in virtual training courses used by surgeons, firefighters, and pilots, where precision is paramount.

While these wearables could provide a more hands-on experience for users, Luo and her group believe they could extend their wearable technology beyond fingers. With stronger haptic feedback, the interfaces could guide feet, hips, and other body parts less sensitive than hands.

Luo also noted that with a more complex artificial intelligence agent, her team's technology could assist with more involved tasks, like manipulating clay or driving an airplane. Currently, the interface can only assist with simple motions like pressing a key or gripping an object. In the future, the MIT system could incorporate more user data and fabricate more conformal and tight wearables to better account for how hand movements impact haptic perceptions.

Luo, Matusik, and Rus authored the paper with EECS Microsystems Technology Laboratories Director and Professor Tomás Palacios; CSAIL members Chao Liu, Young Joong Lee, Joseph DelPreto, Michael Foshey, and professor and principal investigator Antonio Torralba; Kiu Wu of LightSpeed Studios; and Yunzhu Li of the University of Illinois at Urbana-Champaign.

The work was supported, in part, by an MIT Schwarzman College of Computing Fellowship via Google and a GIST-MIT Research Collaboration grant, with additional help from Wistron, Toyota Research Institute, and Ericsson.

Press Mentions

Scientific American

Scientific American reporter Riis Williams explores how MIT researchers created “smart gloves” that have tactile sensors woven into the fabric to help teach piano and make other hands-on activities easier. “Hand-based movements like piano playing are normally really subjective and difficult to record and transfer,” explains graduate student Yiyue Luo. “But with these gloves we are actually able to track one person’s touch experience and share it with another person to improve their tactile learning process.”
