Sunday, May 5, 2024

MIT’s liquid neural networks help drones navigate unseen environments

Drone technology provides enormous benefits and has a wide range of applications, including surveillance, delivery, search and rescue, wildlife monitoring, firefighting, healthcare, and agriculture. But sending drones into unfamiliar environments with precision and ease can be a challenge.

Now, researchers at the Massachusetts Institute of Technology (MIT) have developed a method for robust flight navigation agents to master vision-based fly-to-target tasks in intricate, unfamiliar environments.

Inspired by the adaptability of organic brains, MIT researchers created liquid neural networks in 2021: artificial intelligence and machine learning algorithms that can keep learning and adapting to new data in the real world.

The key to liquid networks’ robust performance under distribution shifts is their ability to dynamically capture the true cause and effect of a given task. The networks can extract the crucial aspects of a task and ignore irrelevant features, allowing the navigation skills they acquire to transfer seamlessly to new targets and environments.
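The dynamics behind this behavior come from the group’s earlier work on liquid time-constant (LTC) cells, in which each neuron’s effective time constant is modulated by its inputs rather than fixed in advance. The snippet below is a minimal, illustrative sketch of one explicit-Euler update of such a cell, not the architecture or code used in the flight experiments; the weight names, shapes, and the choice of a sigmoid gate are assumptions made for clarity.

```python
import numpy as np

def ltc_step(x, inputs, W_in, W_rec, b, A, tau, dt=0.01):
    """One explicit-Euler update of a liquid time-constant (LTC) cell.

    x: (n,) hidden state; inputs: (m,) sensory input at this step;
    W_in: (n, m) and W_rec: (n, n) weights; b, A, tau: (n,) per-neuron
    biases, reversal levels, and base time constants. All names and
    shapes here are illustrative.
    """
    # Input-dependent gate, bounded in (0, 1).
    f = 1.0 / (1.0 + np.exp(-(W_in @ inputs + W_rec @ x + b)))
    # The gate modulates both the decay rate and the drive toward A,
    # so the state's effective time constant changes with the input --
    # the "liquid" behaviour that lets the model adapt after training.
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt
```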

The liquid neural networks outperformed many state-of-the-art counterparts in navigation tasks. The algorithms showed prowess in making reliable decisions in unknown domains like forests, urban landscapes, and environments with added noise, rotation, and occlusion, the university said.

“Our experiments demonstrate that we can effectively teach a drone to locate an object in a forest during summer and then deploy the model in winter, with vastly different surroundings, or even in urban settings, with varied tasks such as seeking and following,” says Daniela Rus, MIT professor and co-author of the paper. “This adaptability is made possible by the causal underpinnings of our solutions. These flexible algorithms could one day aid in decision-making based on data streams that change over time, such as medical diagnosis and autonomous driving applications.”

The MIT team first trained the system on data collected by a human drone pilot, then tested how well it transferred the learned navigation skills to new environments under drastic changes in scenery and conditions. A liquid neural network’s parameters can change over time, making it not only interpretable but also more resilient to unexpected or noisy data.
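Learning from a human pilot in this way amounts to supervised behavior cloning: recorded camera frames are paired with the pilot’s control commands, and a recurrent vision policy is trained to reproduce them. The sketch below shows one such training step in PyTorch; the VisionPolicy model, its layer sizes, and the use of a GRU cell as a stand-in for the liquid recurrent core are assumptions for illustration, not the network from the paper.

```python
import torch
from torch import nn

class VisionPolicy(nn.Module):
    """Hypothetical camera-to-command policy: a small CNN encoder feeding a
    recurrent cell (a GRU stands in here for the liquid/LTC core)."""
    def __init__(self, hidden=32, cmd_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.rnn = nn.GRUCell(32, hidden)
        self.head = nn.Linear(hidden, cmd_dim)

    def forward(self, frames, h=None):
        # frames: (T, 3, H, W) sequence of camera images from one recorded run
        cmds = []
        for frame in frames:
            z = self.encoder(frame.unsqueeze(0))
            h = self.rnn(z, h)
            cmds.append(self.head(h))
        return torch.cat(cmds), h

def behavior_cloning_step(model, optimizer, frames, expert_cmds):
    """One supervised update: match the human pilot's recorded commands."""
    pred, _ = model(frames)
    loss = nn.functional.mse_loss(pred, expert_cmds)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Training would iterate this step over the set of recorded flights; at deployment, only the forward pass needs to run on board the drone.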

In testing the network, researchers found that drones were able to track moving targets and execute multi-step loops between objects in never-before-seen environments. The team believes that the ability to learn from limited expert data and understand a given task while generalizing to new environments could make autonomous drone deployment more efficient, cost-effective, and reliable.

“Robust learning and performance in out-of-distribution tasks and scenarios are some of the key problems that machine learning and autonomous robotic systems have to conquer to make further inroads in society-critical applications,” says Alessio Lomuscio, professor of AI safety in the Department of Computing at Imperial College London. “In this context, the performance of liquid neural networks, a novel brain-inspired paradigm developed by the authors at MIT, reported in this study is remarkable. If these results are confirmed in other experiments, the paradigm here developed will contribute to making AI and robotic systems more reliable, robust, and efficient.”

Journal reference:

  1. Makram Chahine, Ramin Hasani, Patrick Kao, Aaron Ray, Ryan Shubert, Mathias Lechner, Alexander Amini, Daniela Rus. Robust flight navigation out of distribution with liquid neural networks. Science Robotics, 2023; DOI: 10.1126/scirobotics.adc8892