One of the most powerful ways to train robots to navigate a house and accomplish useful tasks in the real world is to teach them in simulation. Exploring the virtual world allows AI agents to practice a task thousands or even millions of times faster than they could in a real physical space.
Back in 2019, Facebook rolled out the first version of its platform, ‘AI Habitat,’ an open-source simulator of photorealistic 3D home environments that could be used to train robots to navigate those spaces. Now the company is taking it to the next level.
This week, the company announced Habitat 2.0, a next-generation simulation platform that will let artificial intelligence (AI) researchers train machines not only to navigate through 3D virtual environments but also to interact with objects just as they would in an actual kitchen, dining room, or other commonly used space.
The new platform features vastly improved speeds, new benchmarks, and a reconstructed dataset. This will help researchers train robots to navigate these environments much faster and more efficiently than before, and also to complete tasks within them, such as stocking the fridge, setting the table for dinner, loading the dishwasher, and even taking out the garbage.
The second version of Facebook’s simulation platform relies on a new dataset called ReplicaCAD. ReplicaCAD mirrors the dataset used by its predecessor, but the previously static 3D scans have been recreated as individual 3D models with physical parameters and collision proxy shapes. This means robots can be trained to move around the scenes and manipulate the objects in them in a whole new way.
To ensure that robots are being taught effectively, the new dataset includes information about each object’s material composition, geometry, and texture. The interactive recreations also incorporate information about size and friction, whether an object (such as a refrigerator or door) has compartments that can open or close, and how those mechanisms work, among other considerations. ReplicaCAD features 111 unique layouts of a single living space and 92 objects that took 3D artists 900 hours to create.
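The kind of per-object metadata described above can be sketched as a simple data structure. This is an illustrative mock-up only: the field and class names below are hypothetical and do not reflect ReplicaCAD’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the per-object metadata an interactive dataset like
# ReplicaCAD attaches to each model; names are illustrative, not the real schema.
@dataclass
class ArticulatedPart:
    name: str          # e.g. "fridge_door"
    joint_type: str    # "revolute" (hinged) or "prismatic" (sliding)
    open_limit: float  # how far the part can open (radians or meters)

@dataclass
class InteractiveObject:
    model_id: str
    mass_kg: float
    friction: float       # surface friction coefficient
    collision_proxy: str  # path to a simplified collision mesh
    parts: list = field(default_factory=list)  # openable compartments

# Example: a fridge with one hinged door that can swing open ~90 degrees.
fridge = InteractiveObject(
    model_id="fridge_01",
    mass_kg=80.0,
    friction=0.5,
    collision_proxy="fridge_01_collision.glb",
    parts=[ArticulatedPart("fridge_door", "revolute", 1.57)],
)
print(fridge.parts[0].name)
```

Separating the detailed render mesh from a simplified collision proxy is a standard trick in physics simulation: the physics engine only has to test contacts against the cheap proxy shape, which is part of what makes high step rates possible.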
Habitat 2.0’s speed also shows a marked improvement over the previous version. The platform can simulate a Fetch robot interacting in ReplicaCAD scenes at 1,200 steps per second (SPS), while existing platforms typically run at 10 to 400 SPS. Such speeds significantly cut down on experimentation time, allowing researchers to complete experiments that would typically run over six months in as little as two days.
Researchers believe that their new platform will provide a research framework for training embodied AI for years to come. “We hope that the ability to perform more complex tasks in the simulation will bring us closer to the AI that can help make our everyday lives easier and better,” said Facebook research scientist Dhruv Batra.