Thursday, April 18, 2024

World’s smallest LED can convert existing phone cameras into microscopes

Lensless imaging is a promising technique to realize low-cost, high-resolution, and large field-of-view (FoV) microscopes without requiring bulky and expensive optical components such as objectives and lenses. It has been widely used in coherent diffraction imaging, ptychography, and phase tomography.

Now, a team of researchers from the Singapore-MIT Alliance for Research and Technology (SMART) has developed the world’s smallest LED (light-emitting diode), which enables the conversion of existing mobile phone cameras into high-resolution microscopes.

Smaller than the wavelength of light, the new LED was used to build the world’s smallest holographic microscope. This paves the way for cameras in everyday devices such as mobile phones to be converted into microscopes through modifications to the silicon chip and software alone. The technology also represents a significant step forward in the miniaturization of diagnostics for indoor farmers and sustainable agriculture.

The research also paves the way for a major advancement in photonics – building a powerful on-chip emitter smaller than a micrometer, which has long been a challenge in the field.

The light in most photonic chips originates from off-chip sources, which leads to low overall energy efficiency and fundamentally limits the scalability of these chips. To address this issue, researchers have developed on-chip emitters using various materials such as rare-earth-doped glass, Ge-on-Si, and heterogeneously integrated III-V materials.

The novel LED developed by the SMART researchers is a CMOS-integrated, sub-wavelength-scale LED that operates at room temperature, exhibits high spatial intensity, and has the smallest emission area of any silicon emitter reported in the scientific literature. To demonstrate a practical application, the researchers integrated this LED into an in-line, centimeter-scale, all-silicon holographic microscope that requires no lens or pinhole, a configuration known as lensless holography.
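In an in-line holographic microscope of this kind, a bare sensor records the interference between light scattered by the sample and the unscattered background; the object is then recovered computationally by propagating the recorded field back to the sample plane. The sketch below shows that back-propagation step using the standard angular spectrum method; it is a minimal illustration, and the function name and parameter values are ours, not taken from the paper's code.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z (angular spectrum method).

    field: 2-D complex array sampled on a grid with pixel pitch dx (meters).
    A negative z back-propagates, e.g. from the sensor to the sample plane.
    """
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=dx)          # spatial frequencies along x
    fy = np.fft.fftfreq(n, d=dx)          # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    # Squared z-component of the wave vector (in units of 1/length^2);
    # negative values correspond to evanescent waves, which are discarded.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Hedged usage sketch: treat the square root of the recorded hologram
# intensity as the field at the sensor and back-propagate to the sample.
# hologram = ...  (recorded intensity)
# recon = angular_spectrum_propagate(np.sqrt(hologram), 550e-9, 1.0e-6, -100e-6)
```

Back-propagating the raw hologram this way yields the object plus a "twin image" artifact, which is one reason the SMART team turned to a neural-network reconstruction.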

The research team also developed a deep neural network architecture to improve image reconstruction quality. This novel, untrained deep neural network incorporates total variation regularization to increase contrast and accounts for the wide spectral bandwidth of the source.

The untrained neural network demonstrated in this study allows researchers to use new light sources without prior knowledge of the source spectrum or beam profile. This includes the Si LED described above, the smallest known, fabricated in a fully commercial, unmodified bulk CMOS microelectronics process.

The neural network algorithm is able to reconstruct objects measured by the holographic microscope. This enables enhanced examination of microscopic objects such as cells and bacteria without the need for bulky conventional microscopes or additional optics.
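The paper's reconstruction pairs an untrained deep neural network with a total-variation (TV) penalty. As a simplified illustration of the TV ingredient alone, the sketch below swaps the network for direct gradient descent on the image pixels and applies it to denoising; the function names, step sizes, and test image are illustrative assumptions, not the authors' method or code.

```python
import numpy as np

def tv_grad(x, eps=1e-8):
    """Gradient of a smoothed isotropic total-variation penalty."""
    dx = np.diff(x, axis=1, append=x[:, -1:])   # forward differences, zero at edge
    dy = np.diff(x, axis=0, append=x[-1:, :])
    mag = np.sqrt(dx**2 + dy**2 + eps)          # eps smooths the non-differentiable point
    px, py = dx / mag, dy / mag
    zx = np.zeros((x.shape[0], 1))
    zy = np.zeros((1, x.shape[1]))
    # The TV gradient is the negative divergence of the normalized gradient field.
    return -(np.diff(px, axis=1, prepend=zx) + np.diff(py, axis=0, prepend=zy))

def tv_denoise(y, lam=0.2, lr=0.2, steps=300):
    """Minimize 0.5*||x - y||^2 + lam*TV(x) by plain gradient descent."""
    x = y.copy()
    for _ in range(steps):
        x -= lr * ((x - y) + lam * tv_grad(x))
    return x
```

TV regularization favors piecewise-smooth images with sharp edges, which is why it boosts contrast in hologram reconstructions; in the paper this penalty acts alongside the untrained network rather than on raw pixels as shown here.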

The researchers envision that this synergetic combination of CMOS micro-LEDs and the neural network can be used in other computational imaging applications, such as a compact microscope for live-cell tracking or spectroscopic imaging of biological tissues such as living plants.

“On top of its immense potential in lensless holography, our new LED has a wide range of other possible applications. Because its wavelength is within the minimum absorption window of biological tissues, together with its high intensity and nanoscale emission area, our LED could be ideal for bio-imaging and bio-sensing applications, including near-field microscopy and implantable CMOS devices,” said Rajeev Ram, Principal Investigator at SMART CAMP and DiSTAP, and co-author of the paper. “Also, it is possible to integrate this LED with on-chip photodetectors, and it could then find further applications in on-chip communication, NIR proximity sensing, and on-wafer testing of photonics.”

Journal reference:

  1. Iksung Kang, Marc de Cea, Jin Xue, Zheng Li, George Barbastathis, and Rajeev J. Ram. Simultaneous spectral recovery and CMOS micro-LED holography with an untrained deep neural network. Optica, 2022; DOI: 10.1364/OPTICA.470712