Tuesday, March 26, 2024

Horse whisperers can help engineers build more capable robots

As horses did thousands of years before them, robots are entering our lives and workplaces as companions and teammates: they vacuum our floors and help educate and entertain our children. Studies show that social robots can be effective therapy tools for improving mental and physical health. They are already found in factories and warehouses, working collaboratively with human workers, where they are sometimes called co-bots.

Now, researchers at the University of Florida say age-old interactions between people and their horses can also teach us something about building robots designed to improve our lives.

“There are no fundamental guiding principles for how to build an effective working relationship between robots and humans,” said Eakta Jain, an associate professor of computer and information science and engineering at UF’s Herbert Wertheim College of Engineering. “As we work to improve how humans interact with autonomous vehicles and other forms of AI, it occurred to me that we’ve done this before with horses. This relationship has existed for millennia but was never leveraged to provide insights for human-robot interaction.”

The team conducted a year of fieldwork observing the special interactions between horses and humans at the UF Horse Teaching Unit in Gainesville, Florida.

As a member of the UF Transportation Institute, Jain was leading the human factors subgroup, which examines how humans should interact with autonomous vehicles, or AVs.

“For the first time, cars and trucks can observe nearby vehicles and keep an appropriate distance from them as well as monitor the driver for signs of fatigue and attentiveness,” Jain said. “However, the horse has had these capabilities for a long time. I thought, why not learn from our partnership with horses for transportation to help solve the problem of natural interaction between humans and AVs.”

Engineers have long taken inspiration from the animal world to create more capable robots, though most such studies have drawn on the relationship humans have with dogs. Jain and her colleagues are the first to bring together engineering and robotics researchers with horse experts and trainers to conduct on-the-ground field studies with the animals.

A thematic analysis of Jain’s notes resulted in findings and design guidelines that can be applied by human-robot interaction researchers and designers.

“Some of the findings are concrete and easy to visualize, while others are more abstract,” she said. “For example, we learned that a horse speaks with its body. You can see its ears pointing to where something caught its attention. We could build in similar types of nonverbal expressions in our robots, like ears that point when there is a knock on the door or something visual in the car when there’s a pedestrian on that side of the street.”
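
To make that design guideline concrete, here is a minimal illustrative sketch in Python of how a robot or vehicle might map perceived events to body-language-style cues. The event names, actuators, and mappings are hypothetical examples inspired by the quote above, not part of the study.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class NonverbalCue:
    """A body-language-style signal the robot or vehicle can display."""
    actuator: str  # hypothetical output channel, e.g. "ear_servos"
    action: str    # what the actuator does, e.g. "point_toward"
    target: str    # where the cue is directed


# Hypothetical mapping from perceived events to expressive responses,
# echoing the "ears that point" and "something visual in the car" examples.
CUE_TABLE = {
    "knock_at_door": NonverbalCue("ear_servos", "point_toward", "door"),
    "pedestrian_left": NonverbalCue("dashboard_light", "pulse", "left_side"),
    "pedestrian_right": NonverbalCue("dashboard_light", "pulse", "right_side"),
}


def express(event: str) -> Optional[NonverbalCue]:
    """Return the nonverbal cue mapped to a perceived event, if any."""
    return CUE_TABLE.get(event)


if __name__ == "__main__":
    cue = express("pedestrian_left")
    if cue is not None:
        print(f"{cue.actuator}: {cue.action} -> {cue.target}")
```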

When a trainer first works with a horse, they look for signs of respect from the horse for its human partner. “We don’t typically think about respect in the context of human-robot interactions,” Jain said. “In what ways can a robot show you that it respects you? Can we design behaviors similar to what the horse uses? Will that make the human more willing to work with the robot?”