Thursday, April 18, 2024

The Photoacoustic Airborne Sonar System for aerial underwater surveys

High-resolution imaging and mapping of the ocean and its floor have so far covered less than 5% of global waters because of technological barriers. Radar and LiDAR are fast, effective tools for mapping and surveying Earth’s landscapes from aircraft and satellites, but neither can penetrate deep into water.

To help flying drones see underwater, engineers at Stanford University have developed an airborne method for imaging underwater objects that combines light and sound to break through the seemingly impassable barrier of the air-water interface. The system, called the Photoacoustic Airborne Sonar System, or PASS, could be mounted beneath drones to enable aerial underwater surveys and high-resolution mapping of the deep ocean.

Oceans cover about 70% of the Earth’s surface, yet only a small fraction of their depths has been imaged and mapped at high resolution. Air- and space-based radar and LiDAR signals lose most of their energy on entering water, so until now high-resolution underwater surveys have had to be carried out from within the water itself. Sound fares little better at the boundary: sound waves lose more than 99.9% of their energy through reflection at the transition between air and water. The remaining 0.1% does produce a sonar signal, but that signal loses a further 99.9% of its energy on passing back up from the water into the air.
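
The scale of that loss follows from basic acoustics: air and water have very different acoustic impedances, so almost all of the sound energy arriving at the boundary is reflected. The sketch below shows that calculation using typical textbook impedance values and assuming an idealized flat interface at normal incidence; it is illustrative arithmetic, not figures taken from the study.

```python
# Illustrative estimate of the air-water interface loss (not from the paper).
# Normal-incidence power transmission between media with acoustic impedances
# Z1 and Z2:  T = 4*Z1*Z2 / (Z1 + Z2)**2

Z_AIR = 1.2 * 343         # ~4.1e2 kg/(m^2*s): density * speed of sound in air
Z_WATER = 1000 * 1480     # ~1.5e6 kg/(m^2*s): density * speed of sound in water

T = 4 * Z_AIR * Z_WATER / (Z_AIR + Z_WATER) ** 2
print(f"Transmitted per crossing: {T:.2%}")     # roughly 0.1%
print(f"Reflected per crossing:  {1 - T:.2%}")  # roughly 99.9%
```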

Schematic of the proposed airborne sonar system. Credit: Kindea Labs

The Photoacoustic Airborne Sonar System (PASS) first fires a laser from the air that gets absorbed at the water surface. When the laser is absorbed, it generates ultrasound waves that propagate down through the water column and reflect off underwater objects before racing back toward the surface. The returning sound waves are still sapped of most of their energy when they breach the water surface, but by generating the sound waves underwater with lasers, the researchers can prevent the energy loss from happening twice.
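
To see why paying the interface penalty once rather than twice matters, a rough link-budget comparison helps. The sketch below assumes roughly 30 dB of loss per crossing (the ~0.1% transmission quoted above); the numbers are illustrative only, not from the study.

```python
# Rough link budget (illustrative). A purely acoustic airborne sonar would
# cross the air-water boundary twice; PASS generates the sound in the water,
# so only the returning echo has to cross it.
CROSSING_LOSS_DB = 30                    # ~0.1% of the energy transmitted per crossing

conventional_db = 2 * CROSSING_LOSS_DB   # down through the surface and back up
pass_db = 1 * CROSSING_LOSS_DB           # only the echo crosses the surface

print(f"Two crossings: -{conventional_db} dB "
      f"(~{10 ** (-conventional_db / 10):.0e} of the energy survives)")
print(f"One crossing:  -{pass_db} dB "
      f"(~{10 ** (-pass_db / 10):.0e} of the energy survives)")
```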

At its heart, PASS plays to the individual strengths of light and sound. “If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds,” explained Aidan Fitzpatrick, a Stanford graduate student in electrical engineering.

The reflected ultrasound waves are recorded by instruments called transducers. Software is then used to piece the acoustic signals back together like an invisible jigsaw puzzle and reconstruct a three-dimensional image of the submerged feature or object.
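
The article does not detail the reconstruction algorithm, but delay-and-sum beamforming is a common way to turn a set of echo recordings into an image and conveys the idea: for each candidate pixel, sum every transducer’s recording at the sample corresponding to that pixel’s round-trip travel time. The sketch below is a generic, simplified version (monostatic geometry, constant sound speed, no refraction correction at the surface); the function and variable names are placeholders, not the researchers’ code.

```python
import numpy as np

C_WATER = 1480.0  # m/s, approximate speed of sound in water

def delay_and_sum(traces, fs, sensor_x, pixels_x, pixels_z):
    """Focus echo recordings onto an image grid (simplified, monostatic).

    traces   : (n_sensors, n_samples) array of recorded echo waveforms
    fs       : sampling rate in Hz
    sensor_x : (n_sensors,) transducer positions along the surface, in metres
    pixels_x : (nx,) horizontal pixel coordinates, in metres
    pixels_z : (nz,) pixel depths below the surface, in metres
    """
    n_sensors, n_samples = traces.shape
    image = np.zeros((len(pixels_z), len(pixels_x)))
    for iz, z in enumerate(pixels_z):
        for ix, x in enumerate(pixels_x):
            dist = np.hypot(sensor_x - x, z)       # sensor-to-pixel distances
            delays = 2.0 * dist / C_WATER          # round-trip travel times
            idx = np.round(delays * fs).astype(int)
            valid = idx < n_samples                # drop echoes past the record
            image[iz, ix] = traces[np.flatnonzero(valid), idx[valid]].sum()
    return image
```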

“We have developed a system that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging,” said study leader Amin Arbabian.

To date, PASS has only been tested in the lab in a container the size of a large fish tank. “Current experiments use static water, but we are currently working toward dealing with water waves,” Fitzpatrick said. “This is challenging, but we think it is a feasible problem.”

The next step, the researchers say, will be to conduct tests in a larger setting and, eventually, an open-water environment. “Our vision for this technology is on-board a helicopter or drone,” Fitzpatrick said. “We expect the system to be able to fly at tens of meters above the water.”