
Special effects leap forward

To add backgrounds in post-production, technicians need to know the exact camera positions and lens optical parameters used during filming. Currently, visual tracking relies on 3,000-pound cranes fitted with rotary measuring devices that lack complete accuracy. Mack's invention mounts Intersense optical-inertial and Airtrack inertial sensors on the camera itself, recording the necessary data precisely and eliminating the need to work out tracking information by hand.

Members of the MIT community have a history of transforming visual effects; present work is helping to advance green screen technology.


Members of the MIT community have a history of transforming visual effects. Herbert Kalmus (1903) and MIT Physics Professor Daniel Comstock co-developed Technicolor (yes, it was named after MIT) in 1915, and Bill Warner ’80 created the Avid digital editing system.

And now Eliot Mack SM ’96 hopes to add his name to the list. His portable Previzion system accurately matches live-action foregrounds with computer-generated backgrounds, so directors can look past the green screen and see how the final shot will appear. Watch video of the technology in action.
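
For the technically curious, the kind of per-frame record such a system has to capture can be sketched roughly as follows. The structure and field names below are hypothetical, an illustration of the idea in Python rather than Previzion's actual data format.

    from dataclasses import dataclass

    @dataclass
    class CameraSample:
        """One hypothetical per-frame tracking record: where the camera is,
        how it is oriented, and how the lens is set."""
        frame: int                # frame number within the take
        position: tuple           # (x, y, z) position of the camera, in meters
        rotation: tuple           # (pan, tilt, roll) orientation, in degrees
        focal_length_mm: float    # current zoom setting
        focus_distance_m: float   # where the lens is focused
        f_stop: float             # aperture, which shapes depth of field

    def virtual_camera(sample: CameraSample) -> dict:
        """Hand the live sample to a CG renderer so the synthetic background
        is drawn from the same viewpoint, zoom, and focus as the real camera."""
        return {
            "eye": sample.position,
            "orientation": sample.rotation,
            "focal_length": sample.focal_length_mm,
            "focus_distance": sample.focus_distance_m,
            "f_stop": sample.f_stop,
        }

Without a record like this for every frame, the background has to be lined up by hand in post-production, which is exactly the step the on-camera sensors are meant to eliminate.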

While green-screen technology is not new (meteorologists swear by it), creating live photorealistic images with it is. Current technology has trouble with motion tracking, image resolution, focusing and defocusing background shots, and capturing lens adjustment calibrations, all of which are crucial for post-production work. Mack has refined his technology to generate camera tracking data automatically and to avoid missing a single strand of hair against the backdrop. “Essentially, we’re recreating the world on the fly,” he says. So far, it’s been used on the television show V, the upcoming Tim Burton movie Alice in Wonderland, and the Knight Rider made-for-TV movie.
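
To see why not losing a single strand of hair is hard, here is a deliberately naive green-screen keyer: it computes a soft alpha matte from how green each pixel is and blends in the background. It is only a sketch, written in Python with NumPy and assuming float RGB images, and bears no relation to the keyer Previzion actually uses.

    import numpy as np

    def naive_green_key(foreground, background, strength=2.0):
        """Composite a float RGB foreground (values in [0, 1]) shot on a green
        screen over a background of the same shape, using a soft alpha matte."""
        r, g, b = foreground[..., 0], foreground[..., 1], foreground[..., 2]
        # How much greener a pixel is than its red/blue average: large values
        # mean "screen", values near zero mean "subject".
        greenness = g - 0.5 * (r + b)
        # A soft, continuous matte; hard 0/1 masks are what chop off fine hair.
        alpha = np.clip(1.0 - strength * greenness, 0.0, 1.0)[..., None]
        # Crude spill suppression so green bounce does not fringe the edges.
        despilled = foreground.copy()
        despilled[..., 1] = np.minimum(g, np.maximum(r, b))
        return alpha * despilled + (1.0 - alpha) * background

Even this toy version shows the trade-off: pull the matte hard enough to remove the screen and you start eating into wispy edges, which is why doing it live, at production resolution, with a moving camera is the difficult part.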

Read the Slice of MIT blog post to learn how Mack's invention works.


Topics: Alumni/ae, Innovation and Entrepreneurship (I&E), TV, Visual arts
