The National Science Foundation (NSF), along with the journal Science, honored a team of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) for their video "Revealing Invisible Changes In The World," in which they used an algorithm to reveal subtle motions unseen by the naked eye.
The CSAIL team — graduate students Michael Rubinstein and Neal Wadhwa, alumni Eugene Shih SM '01, PhD '10 and Hao-Yu Wu MNG '12, associate professor Frédo Durand, and professors William T. Freeman and John Guttag — earned honorable mention for their video at the 10th annual International Science & Technology Visualization Challenge, a highly competitive international contest.
In the video, the researchers demonstrated an algorithm they developed that amplifies and allows for analysis of subtle movements and variations in color in ordinary videos. “Imagine you had special glasses that allowed you to see subtle changes that cannot be seen with the naked eye,” says Freeman, the video’s narrator.
By applying this algorithm to seemingly static videos, researchers were able to reveal previously hidden motions, such as a baby's breathing, a construction crane swaying, and blood pulsing through arteries in a human wrist.
The algorithm amplifies movement by analyzing the color variation of each pixel in a video over time, then exaggerating that variation to produce a video in which subtle color changes and motions become visible. It also enables further analysis: by examining the frequency of the variation, researchers can, in a medical evaluation, for instance, determine a patient’s heart rate without using sensors, with results comparable to those of hospital-grade monitors.
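The published method is more sophisticated (it combines spatial decomposition with temporal filtering), but the core idea described above can be sketched with a simple per-pixel temporal band-pass filter. The function names below (`magnify_changes`, `dominant_frequency`) and all parameter values are illustrative assumptions, not the team's actual code:

```python
import numpy as np

def magnify_changes(frames, fps, low_hz, high_hz, alpha):
    """Minimal sketch: band-pass each pixel's intensity over time,
    amplify that variation, and add it back to the original video.

    frames: array of shape (T, H, W) -- pixel intensities over time.
    """
    frames = np.asarray(frames, dtype=float)
    spectrum = np.fft.rfft(frames, axis=0)                # per-pixel temporal FFT
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)  # frequency of each bin
    band = (freqs >= low_hz) & (freqs <= high_hz)          # keep only the band of interest
    filtered = np.fft.irfft(spectrum * band[:, None, None],
                            n=frames.shape[0], axis=0)
    return frames + alpha * filtered                       # amplified variation added back

def dominant_frequency(signal, fps):
    """Strongest temporal frequency of a pixel's time series,
    e.g. a pulse rate recovered from skin color variation."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Tiny synthetic example: a one-pixel "video" with a barely visible
# 1.0 Hz pulse (60 beats per minute) on top of a constant intensity.
fps = 30
t = np.arange(90) / fps
video = 100 + 0.2 * np.sin(2 * np.pi * 1.0 * t)
out = magnify_changes(video[:, None, None], fps, low_hz=0.8, high_hz=2.0, alpha=50)
rate_hz = dominant_frequency(video, fps)   # recovers the 1.0 Hz pulse
```

After magnification, the 0.2-unit oscillation is roughly fifty times larger, and the dominant frequency of the unprocessed signal directly gives the simulated pulse rate, mirroring the sensor-free heart-rate measurement described above.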
“Our goal was to develop a tool which will allow you to see changes that you cannot normally see. In the video, we tried to show that there is essentially a whole world of small changes and motion that we cannot see with our own eyes, and we gave examples of such phenomena that our algorithm can make visible,” Rubinstein says.
Code is available online for individuals interested in exploring the subtleties of motion. The team entered the challenge because they felt their work had broad appeal to a general audience and the video would be a visually engaging way to share it.
One user of the team’s code has already posted magnified videos of his wife’s pregnancy, showing enhanced fetal movement, according to Rubinstein. Researchers hope to find additional applications for this work in fields such as medicine, civil engineering and architecture.
“Our team is still actively working on this direction, so people can expect more to come,” Rubinstein says. “We hope that it will motivate people to look deeper into this type of processing and different applications it can support.”
The annual challenge invites researchers, illustrators, photographers, computer programmers, videographers and graphics specialists from around the world to submit creative illustrations, information graphics, interactive visualizations and videos that intrigue, explain and educate others about science. The Feb. 1 issue of Science will feature the winning submissions, which will be available online at the NSF website.