Carnegie Mellon University

Image: Colorful light bending around corners

September 27, 2023

Steering Light at the Speed of Sound

By Stacey Federoff


Autonomous cars could see their surroundings 100 times faster with a system developed by researchers at Carnegie Mellon University.

Using ultrafast optics, the system could make self-driving cars and other robots that rely on LiDAR (Light Detection and Ranging) devices much safer as they navigate the world.

"The ability to get a measurement 100 times faster can mean life or death," said Ioannis Gkioulekas, assistant professor at the School of Computer Science’s Robotics Institute. "It can mean that a car can stop before crashing into a pedestrian crossing the street, hitting a bike or causing a collision."

The system uses ultrasound to control how a beam of light travels through a transparent material, such as water or plastic. Light follows the path of high-pressure regions through a medium, and ultrasound can sculpt those pressure patterns quickly and dynamically, allowing the researchers to form complex 3D patterns of light and steer them at high speed.

"We hit light with sound," said Adithya Pediredla, who was the lead author on the project as a project scientist in the Robotics Institute and is now an assistant professor at Dartmouth College. "The sound acts as a pressure wave, compressing and rarefying the medium, and turning it into a lens that steers the light."

Traditional LiDAR devices create 3D maps by using rotating mirrors to steer a laser beam and scan multiple points in sequence. The mechanical movement of the mirrors is slow, creating a bottleneck for how fast LiDAR devices can scan their surroundings.
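
To put the quoted speedup in perspective, here is a back-of-envelope comparison in which both rates are assumed, illustrative numbers rather than figures from the paper: a mechanical mirror redirecting the beam on the order of ten thousand times per second versus acousto-optic steering around a million times per second.

    # Back-of-envelope comparison; both rates are assumed, illustrative values.
    mirror_redirects_per_second = 10_000       # ~10 kHz mechanical mirror (assumed)
    acoustic_redirects_per_second = 1_000_000  # ~1 MHz acousto-optic steering (assumed)

    speedup = acoustic_redirects_per_second / mirror_redirects_per_second
    print(f"approximate speedup: {speedup:.0f}x")  # prints 100x, the order of magnitude cited above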

"Our technology replaces the rotating mirrors. The laser beam can now scan at the speed of sound, making the LiDAR scanning process 100 times faster," said Srinivasa Narasimhan, professor at the Robotics Institute."Additionally, unlike the circular patterns scanned by existing LiDAR devices, our technology can scan arbitrary patterns, making it possible to focus on small but important parts of the scene that LiDAR may normally miss, like distant vehicles or pedestrians."

Think of a pen in a glass of water. The water's refractive index, which depends on its density, bends the light coming off the pen, making it appear broken. The team's technology works similarly.

"The refractive index, a property of transparent materials, controls how fast or slow light travels through them," Gkioulekas said. "The technology we have developed changes the refractive index everywhere inside a material like water in a way that we control. As a beam of light travels through the material, it bends not just once, like with the pen, but continuously, tracing a curve through the material. By using ultrasound to control the refractive index of the material, we can program these curves and route light through the material."

The idea for this research was inspired by prior work from Maysam Chamanzar, associate professor in the College of Engineering's Electrical and Computer Engineering Department, on better ways to steer light inside tissue for applications such as brain imaging and brain stimulation. This type of brain stimulation, called optogenetics, has previously been studied as a way to understand brain function and discover new therapeutics for conditions such as Parkinson's disease, using techniques based on surgically implanted optical fibers.

The new light-steering system adapts this approach beyond brain imaging to any application requiring programmable optics. Besides LiDAR, imaging systems such as laser-scanning projectors and fluorescence microscopes could benefit from this innovation. The team hopes that others can build upon this work and use it as a platform technology for different types of applications.

"Ultrafast, programmable scanning of light can open many new doors in the future, some of which maybe we have not even thought about yet," Chamanzar said.

The project combines ideas from acousto-optics (the interaction of sound and light with each other and with different materials) with ultrafast optics and electronics, signal processing theory, and post-processing algorithms. Bringing together these areas of engineering and computer science expertise was integral to making the new light-steering technology possible.

The team presented their paper, "Megahertz Light Steering without Moving Parts," at the 2023 Conference on Computer Vision and Pattern Recognition (CVPR). The research was supported by awards from the National Science Foundation, AWS Cloud Credits for Research, the Sybiel Berkman Foundation and a Sloan Research Fellowship.