Throughout history, cowboys have shared epic tales of horses that have taken them home on dark, foggy nights. Legendary horses have even carried wounded soldiers through battle zones.
Sensible and highly trained, these horses remind computer scientist and roboticist Eakta Jain of fully autonomous robots, specifically self-driving cars.
Vehicles today have different levels of autonomy. Level zero or one covers basic driver aids such as antilock brakes and adaptive cruise control. Level two or three adds the ability to follow lanes, park or handle limited stretches of driving on its own. That autonomy builds all the way to level five: a car that drives itself entirely.
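For readers who want those levels spelled out, here is a minimal sketch of the driving-automation taxonomy as a Python lookup table; the one-line descriptions are informal summaries for illustration, not official definitions.

```python
# Rough sketch of the driving-automation levels described above.
# Descriptions are informal summaries for illustration, not official SAE text.
AUTONOMY_LEVELS = {
    0: "No automation: the driver does everything; aids like antilock brakes only",
    1: "Driver assistance: one feature at a time, such as adaptive cruise control",
    2: "Partial automation: steering plus speed, such as lane centering with cruise control",
    3: "Conditional automation: the car drives itself in limited settings; the driver must take over on request",
    4: "High automation: no driver needed within a defined area or condition",
    5: "Full automation: the car handles every situation a human driver could",
}

def describe(level: int) -> str:
    """Return a plain-language description for a given autonomy level."""
    return AUTONOMY_LEVELS.get(level, "Unknown level")

if __name__ == "__main__":
    for level in range(6):
        print(f"Level {level}: {describe(level)}")
```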
Those horses that take a rider home are level five, Jain says. “You trust them with your life.”
Interested in how that trust develops and whether it could help improve human-robot interactions, Jain, of the University of Florida in Gainesville, immersed herself in the horse world during a year-long sabbatical that ended in 2022. The experience gave her a wealth of insights for tackling open questions about how humans should interact with robots at differing levels of autonomy.
“Only recently have researchers started focusing on the earliest stages of human-robot interaction, such as the creation of first impressions,” Jain and University of Florida colleague Christina Gardner-McCune wrote in April in the Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. “Our work seeks to uncover principles for the stage after the first meeting.”
During her sabbatical, Jain watched equine classes at her university, where she took notes based on her observations and conversations with students, teaching assistants and the instructor. She also spoke with other trainers and horse owners. And she learned to ride.
“I had never interacted with a horse before,” Jain says. “I was a new rider learning to interact with a horse who was not a new horse [to riding], but it was still the early stages of relationship building.”
Horses that are new to interacting with humans must learn how to interpret and respond appropriately to their handlers’ cues during basic handling, such as being led and brushed. That training readies them for more advanced handling, such as accepting a saddle and bridle and carrying a human on their first few rides. New riders, in turn, must learn the intricacies of how horse behavior communicates information, which cues direct a horse to perform a desired action and how to correct it when it doesn’t behave as expected.
Analogously, people must learn how to command robots to perform specific tasks. They also must learn what to do when interactions with robots don’t go according to plan. Robots need to respond to inputs from humans in a predictable way. But more autonomous robots, like horses, must also dynamically respond to changing conditions, such as a self-driving car stopping to avoid hitting something, even when humans command it to continue forward.
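One way to picture that kind of override is a simple arbitration rule: the robot carries out the human’s command unless its own sensing says the command is unsafe. The sketch below is a hypothetical illustration of the idea, not code from Jain’s work; the command format, sensor reading and stopping distance are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical illustration of a safety-override rule: obey the human's
# command unless the robot's own sensing says doing so is unsafe.

@dataclass
class Command:
    action: str    # e.g. "forward", "stop", "turn_left"
    speed: float   # requested speed in meters per second

def arbitrate(human_command: Command, obstacle_distance_m: float,
              stopping_distance_m: float = 5.0) -> Command:
    """Return the command the robot actually executes.

    If moving forward would bring the robot within its stopping distance
    of an obstacle, override the human and stop instead.
    """
    if human_command.action == "forward" and obstacle_distance_m < stopping_distance_m:
        return Command(action="stop", speed=0.0)
    return human_command

# Example: the human says "go forward", but something sits 2 meters ahead.
executed = arbitrate(Command(action="forward", speed=3.0), obstacle_distance_m=2.0)
print(executed)  # Command(action='stop', speed=0.0)
```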
In human-horse partnerships, horses primarily communicate nonvocally. For instance, their ears tend to point toward whatever they’re paying attention to. This behavior could inspire the creation of robots with electronic “ears” that can swivel in the direction of a ringing doorbell or people talking near an autonomous vehicle, thereby alerting humans to these sounds, Jain says. “Instead of the robot saying, ‘Knock on the door. Beep, beep,’ if its ears were to point toward the door, it’s a much less jarring, much more subtle way of informing” people that the robot hears something it’s paying attention to.
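As a toy illustration of that kind of signaling, the sketch below swivels a hypothetical ear-like actuator toward the estimated bearing of a sound; the EarServo class and the bearing value are made up, and the sound localization itself is assumed to come from something like a microphone array.

```python
# Toy sketch of ear-like signaling: swivel an "ear" toward a detected sound
# instead of beeping. The sound's bearing is assumed to come from a
# microphone array; the EarServo class here is purely hypothetical.

class EarServo:
    def __init__(self, max_turn_deg: float = 30.0):
        self.angle_deg = 0.0            # 0 degrees = facing straight ahead
        self.max_turn_deg = max_turn_deg

    def point_toward(self, bearing_deg: float) -> None:
        """Rotate the ear toward a sound, moving at most max_turn_deg per step."""
        error = bearing_deg - self.angle_deg
        step = max(-self.max_turn_deg, min(self.max_turn_deg, error))
        self.angle_deg += step

# A doorbell rings roughly 75 degrees to the robot's right.
ear = EarServo()
for _ in range(3):
    ear.point_toward(75.0)
    print(f"ear now at {ear.angle_deg:.0f} degrees")
```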
Such subtlety could also be used as a sign of respect. Trainers and riders work with horses intensively to build respect. Horses nonvocally communicate their respect for their trainers and riders by matching their pace or giving them safe personal space when they walk. The horses’ ears even point toward trainers as a show of respect when they’re paying attention to them. Trainers require that horses show signs of respect in basic interactions before “they will move on to the next, more complicated interaction,” Jain says. Respect can grow into trust. But even with experienced horses and riders, trust is not a given.
There may be a similar limitation with robots. “Perhaps the same types of nonverbal expressiveness that’s used in the human-horse context can be used to communicate that the robot respects the human,” Jain says, which “might make humans feel safer around robots. It might make them more open to working with a robot.”
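As one concrete, and entirely hypothetical, way a robot might express that kind of nonverbal respect, the sketch below has a mobile robot match a nearby person’s walking pace and keep a polite buffer of personal space; all of the numbers are arbitrary.

```python
# Entirely hypothetical sketch: a mobile robot signals "respect" nonverbally
# by matching a person's walking pace and keeping a comfortable distance,
# much as a well-mannered horse walks alongside its handler.

PERSONAL_SPACE_M = 1.5  # arbitrary comfort buffer, in meters

def choose_speed(person_speed_mps: float, gap_m: float) -> float:
    """Pick the robot's speed so it neither crowds nor lags its human partner."""
    if gap_m < PERSONAL_SPACE_M:
        # Too close: ease off so the person gets their space back.
        return max(0.0, person_speed_mps - 0.3)
    if gap_m > 2 * PERSONAL_SPACE_M:
        # Falling behind: speed up a little to stay with the person.
        return person_speed_mps + 0.3
    # Comfortable range: simply match the person's pace.
    return person_speed_mps

for gap in (1.0, 1.6, 3.5):
    print(f"gap {gap} m -> robot speed {choose_speed(1.4, gap):.1f} m/s")
```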
Probing what it means for robots and humans to respect and trust each other is largely uncharted territory, Jain says. Understanding how respect and trust grow between horse and human could be a game-changer.