Scientists at MIT and the University of Texas at Arlington (UTA) have developed a new type of microscopy that can image cells through a silicon wafer, precisely measuring the cells' size and mechanical behavior.
The new technology, which relies on near-infrared light, could help scientists learn more about diseased or infected cells as they flow through silicon microfluidic devices.
“This has the potential to merge research in cellular visualization with all the exciting things you can do on a silicon wafer,” says Ishan Barman, a former postdoc in MIT’s Laser Biomedical Research Center (LBRC) and one of the lead authors of a paper describing the technology in the Oct. 2 issue of the journal Scientific Reports.
Other lead authors of the paper are former MIT postdoc Narahara Chari Dingari and UTA graduate students Bipin Joshi and Nelson Cardenas. The senior author is Samarendra Mohanty, an assistant professor of physics at UTA. Other authors are former MIT postdoc Jaqueline Soares, currently an assistant professor at Federal University of Ouro Preto, Brazil, and Ramachandra Rao Dasari, associate director of the LBRC.
Silicon is commonly used to build microelectronics as well as “lab-on-a-chip” microfluidic devices, which can sort and analyze cells based on their molecular properties. Such devices have many potential applications in research and diagnostics, but they could be even more useful if scientists could image the cells inside the devices, says Barman, who is now an assistant professor of mechanical engineering at Johns Hopkins University.
To achieve that, Barman and colleagues took advantage of the fact that silicon is transparent to infrared and near-infrared wavelengths of light. They adapted a microscopy technique known as quantitative phase imaging, which works by sending a laser beam through a sample, then splitting the beam into two. By recombining those two beams and comparing the information carried by each one, the researchers can determine the sample’s height and its refractive index — a measure of how much the material slows light passing through it.
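The relationship the recombined beams encode can be sketched in a few lines. In quantitative phase imaging, the measured phase delay at each pixel is proportional to the cell's height times the refractive-index difference between the cell and its surrounding medium, so height follows by rearrangement. The numbers below are illustrative assumptions for a typical cell, not values from the paper:

```python
import math

def height_from_phase(phase_rad, wavelength_m, n_cell, n_medium):
    """Recover sample height from a measured phase delay.

    Quantitative phase imaging measures the extra optical path the light
    accumulates through the cell relative to the surrounding medium:
        phi = (2 * pi / wavelength) * (n_cell - n_medium) * height
    Solving for height gives the cell's thickness at that pixel.
    """
    return phase_rad * wavelength_m / (2 * math.pi * (n_cell - n_medium))

# Assumed example values: a 2.0 rad phase delay at the study's 980 nm
# wavelength, with typical cell (1.40) and medium (1.34) indices.
h = height_from_phase(2.0, 980e-9, 1.40, 1.34)
print(f"{h * 1e6:.2f} um")  # prints "5.20 um"
```

Because a single phase map mixes height and refractive index into one number, separating the two (as the researchers do when they report both) requires additional measurements or assumptions about one of the quantities.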
Traditional quantitative phase imaging uses a helium neon laser, which produces visible light, but for the new system the researchers used a titanium sapphire laser that can be tuned to infrared and near-infrared wavelengths. For this study, the researchers found that light with a wavelength of 980 nanometers worked best.
Using this system, the researchers measured changes in the height of red blood cells, with nanoscale sensitivity, through a silicon wafer similar to those used in most electronics labs.
As red blood cells flow through the body, they often have to squeeze through very narrow vessels. When these cells are infected with malaria, they lose their ability to deform and can clog tiny vessels. The new microscopy technique could help scientists study how this happens, Dingari says; it could also be used to study the dynamics of the malformed blood cells that cause sickle cell anemia.
The researchers also used their new system to monitor human embryonic kidney cells as pure water was added to their environment — a shock that forces the cells to absorb water and swell up. The researchers were able to measure how much the cells distended and calculate the change in their index of refraction.
“Nobody has shown this kind of microscopy of cellular structures before through a silicon substrate,” Mohanty says.
“This is an exciting new direction that is likely to open up enormous opportunities for quantitative phase imaging,” says Gabriel Popescu, an associate professor of electrical engineering and computer science at the University of Illinois at Urbana-Champaign who was not part of the research team.
“The possibilities are endless: From micro- and nanofluidic devices to structured substrates, the devices could target applications ranging from molecular sensing to whole-cell characterization and drug screening in cell populations,” Popescu says.
Mohanty’s lab at UTA is now using the system to study how neurons grown on a silicon wafer communicate with each other.
In the Scientific Reports paper, the researchers used silicon wafers that were about 150 to 200 microns thick, but they have since shown that thicker silicon can be used if the wavelength of light is increased into the infrared range. The researchers are also working on modifying the system so that it can image in three dimensions, similar to a CT scan.
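The tradeoff between wafer thickness and wavelength comes down to silicon's band gap: photons with energy below the gap (about 1.12 eV at room temperature, a standard textbook value) cannot drive band-to-band absorption, so the wafer becomes effectively transparent, while photons just above the gap, such as those at 980 nm, are only weakly absorbed and can still cross a thin wafer. This back-of-the-envelope check is my own illustration, not a calculation from the paper:

```python
# Photon energy in eV from wavelength in nm: E = h*c / lambda.
HC_EV_NM = 1239.84       # Planck constant times speed of light, in eV*nm
SI_BANDGAP_EV = 1.12     # silicon band gap at room temperature

def photon_energy_ev(wavelength_nm):
    """Energy of a photon of the given wavelength, in electron-volts."""
    return HC_EV_NM / wavelength_nm

for wavelength_nm in (980, 1310):
    energy = photon_energy_ev(wavelength_nm)
    regime = ("below band gap: silicon essentially transparent"
              if energy < SI_BANDGAP_EV
              else "above band gap: weakly absorbed, thin wafers only")
    print(f"{wavelength_nm} nm -> {energy:.3f} eV ({regime})")
```

At 980 nm the photon energy (about 1.27 eV) sits just above the gap, which is why the 150-to-200-micron wafers in the study still transmit enough light; pushing further into the infrared drops the photon energy below the gap and lets much thicker silicon be used.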
The research was funded by the National Institute of Biomedical Imaging and Bioengineering and Nanoscope Technologies, LLC.