Hand research spans basic science, virtual reality


The human hand is capable of moving in some 20 independent ways; it contains about 20 muscles and 20 nerve types and is operated by another 20 muscles in the forearm. Indeed, the instrument is so complex that only recently have engineers developed affordable machines with enough sensors, actuators and computing power to study it in detail.

As a result, researchers still know relatively little about such basic functions as how a fingerpad receives tactile information and routes those data to the brain. A better understanding of how the hand works, however, could aid the diagnosis and treatment of conditions like carpal tunnel syndrome. More data are also key to robotic hands, and to software that allows us to touch and feel virtual objects generated by a computer.

Introducing MIT's Touch Lab. In addition to exploring the basic science behind the hand, its scientists are applying their results to high-tech systems like virtual reality for surgical simulations in which the user can "see, touch and feel" the tissues involved. Another application: a microscope for imaging what happens at the fingertip when we touch something. Such systems, in turn, allow further experiments for gleaning that much more about how the hand works. For example, robotic devices developed in the lab are used to apply controlled stimuli to the hands of human subjects in experiments to study perception.

"We want to understand the overall perceptual capabilities of the human hand and sense of touch, as well as the underlying mechanisms," said Mandayam A. Srinivasan, director of the Laboratory for Human and Machine Haptics (the facility's formal name). Dr. Srinivasan holds appointments through the Research Laboratory of Electronics (RLE) and the Department of Mechanical Engineering.

Touch Lab research falls into three broad categories: human haptics (the term haptics describes all aspects of hands and how they work), machine haptics (robotic devices) and computer haptics (the generation of virtual "objects" that the user can touch and feel). "Think of these areas as roughly analogous to human vision, machine vision and computer graphics," Dr. Srinivasan said.

Projects in human haptics range from developing computer models of the fingerpad to recording the neural signals sent from receptors in the finger to the brain (a collaboration with Yale University). "Touch begins in the fingertips but ends in the brain," Dr. Srinivasan explained. The research spans the fields of biomechanics (the study of properties such as fingerpad stiffness and friction), neurophysiology and psychophysics (a branch of psychology that relates physical quantities such as force applied on the fingerpad to perception).
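Weber's law gives a feel for the kind of relation psychophysics studies: the smallest change in a stimulus a subject can reliably detect grows in proportion to the stimulus itself. The sketch below is purely illustrative; the Weber fraction used is a made-up example value, not a figure measured by the Touch Lab.

```python
# Illustrative psychophysics calculation based on Weber's law: the
# just-noticeable difference (JND) in a stimulus is a constant fraction
# of its magnitude. The 10% fraction here is a hypothetical example.

def jnd(force_newtons: float, weber_fraction: float = 0.1) -> float:
    """Smallest change in an applied force that a subject can
    reliably detect, under Weber's law."""
    return weber_fraction * force_newtons

# With a 10% Weber fraction, a 0.5 N press must change by 0.05 N to be
# noticed, while a 2.0 N press must change by 0.2 N.
```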

Basic research on human haptics could lead to better ways of evaluating hand impairments. "Audio and visual tests are very sophisticated," Dr. Srinivasan said, "but the tests clinicians have for the sense of touch, for example, are very primitive." This is primarily because the human haptic system is so complex and because "there's been no dominant clinical need," he noted. It's easier to live with a damaged hand than with a loss of sight or hearing.

Nevertheless, as Touch Lab scientists and others learn more about the hand, Dr. Srinivasan foresees "a suite of devices that could be common for hand evaluations." (He doubts that in the near future a single device could do everything because the hand is too complex.) Such machines might, for example, "measure how quickly functionality is returning to a hand transplant," and if problems arise they "could give doctors hints as to what's responsible."

ULTRASOUND MICROSCOPE

Last year Touch Lab scientists completed an ultrasound microscope that takes images of the layers of skin at the fingertip, much like diagnostic ultrasound creates images of fetuses. The microscope is helping the scientists learn more about how fingertip layers deform and change their geometry when we touch something.

Developed by Dr. Srinivasan and Balasundar Raju, a graduate student in electrical engineering and computer science, the instrument could also be applied to the detection of skin cancer. "Sunburn and skin cancer result in changes in the mechanical properties of the skin. This microscope could potentially detect those properties," Dr. Srinivasan said.

In computer haptics work at the Touch Lab, "we're focusing on how to convey the texture, shape and stiffness of computer-generated objects such as a cube or heart," he said.
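A standard way stiffness is conveyed in haptic rendering (a common textbook technique, not necessarily the Touch Lab's own implementation) is to model the virtual surface as a spring: when the device's tip penetrates the object, a restoring force proportional to the penetration depth is sent back to the user. The sketch below assumes a flat surface at height zero; all values are illustrative.

```python
# Minimal sketch of spring-model haptic rendering for a flat virtual
# surface at height z = 0. Names and constants are illustrative.

def contact_force(z_pos: float, stiffness: float = 500.0) -> float:
    """Return the upward force (newtons) the haptic device should
    render when the stylus tip is at height z_pos (meters).

    Above the surface (z_pos >= 0) there is no contact, so no force.
    Below it, a linear spring pushes the tip back out: F = -k * z.
    A larger stiffness makes the virtual object feel harder.
    """
    if z_pos >= 0.0:
        return 0.0              # free motion: no contact
    return -stiffness * z_pos   # penetration depth is -z_pos

# At 2 mm of penetration with k = 500 N/m the device renders 1 N:
print(contact_force(-0.002))    # 1.0
```

In a real system this computation runs in a tight loop (typically around 1 kHz) so the rendered surface feels solid rather than spongy.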

Other projects explore human-machine interactions. For example, Touch Lab researchers are studying the effects of vision and sound on the touch perception of subjects who interact with a computer through a haptic interface (a robotic device that translates the touch and feel of virtual objects to the user). They are also developing virtual environments in which two or more people in different locations use haptic interfaces to explore and manipulate the same object.

VIRTUAL SURGERY

Software in development simulates a surgical procedure -- a laparoscopy -- on the computer. "The goal is to develop a system that will help train doctors," said Cagatay Basdogan, an RLE research scientist who is doing the work with Dr. Srinivasan.

Laparoscopy involves making small incisions in the body. Instruments including a scope for seeing the tissues involved and various surgical tools are inserted through these incisions and manipulated to complete an operation such as removing the gall bladder.

In the simulation, the user sees an on-screen image of the organs and tissues in question. By manipulating a stylus that's analogous to a computer mouse, the user can "place a 'tool' over, say, a lung, press down on it, see the deformation of the lung and 'feel' how soft and pliable it is," Dr. Basdogan said. (The hardware for the stylus system, known as the Phantom, was developed by other MIT researchers a few years ago.)
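The visible "give" of such a virtual organ can be approximated with a simple spring model: the pressed point sinks in proportion to the applied force, and nearby points sink progressively less, forming a dimple. This toy sketch is purely illustrative and much cruder than the simulator's actual tissue models; every name and constant in it is hypothetical.

```python
# Toy sketch of a soft-tissue surface deforming under a point load,
# using a linear spring at the contact point and a geometric falloff
# for its neighbors. Illustrative only, not the lab's actual model.

def deform(rest_heights, press_index, force, stiffness=200.0, spread=0.5):
    """Return new node heights after pressing node `press_index` down
    with `force` newtons. The pressed node sinks by force/stiffness;
    each step away from it, the dip shrinks by the factor `spread`,
    so the dimple fades smoothly with distance from the contact."""
    dip = force / stiffness
    return [h - dip * (spread ** abs(i - press_index))
            for i, h in enumerate(rest_heights)]

nodes = [1.0] * 5                         # flat tissue surface
pressed = deform(nodes, press_index=2, force=2.0)
# center node sinks 1 cm (2 N / 200 N/m); each neighbor sinks half
# as much as the node beside it
```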

To date, the researchers have completed a virtual probe and forceps for the simulation. The forceps are used to pinch and pull back folds of virtual tissue. "Now we're working on cutting tools and on computer models for bleeding. Tissues should bleed if you cut them," Dr. Basdogan said. The MIT researchers and collaborators at Massachusetts General Hospital (MGH) and Harvard Medical School are also preparing to make measurements of the properties of real tissues. Those properties will be incorporated into the virtual system.

Touch Lab research is funded by the National Institutes of Health, the Defense Advanced Research Projects Agency, the Office of Naval Research, Harvard University and MGH.

A version of this article appeared in the March 17, 1999 issue of MIT Tech Talk (Volume 43, Number 23).


