Human technology

Humans move unconsciously in sync with robots

Humans, like most social animals, mirror each other’s mannerisms and facial expressions, a behavior psychologists call mimicry. Most often, mimicry helps people feel more positive about the person they are with.

“When humans interact, they adopt each other’s rhythms in terms of breathing, speech and movement – and that’s unconscious,” says Ghiles Mostafaoui, a researcher at the University of Cergy-Pontoise in France. It is, he says, “a kind of social cement”.

In some cases, mimicry can facilitate bonds across species, as between humans and apes. But can this kind of imitation be observed between a human and a robot? “If we had this kind of interaction with machines, robots or computers, we could get more intuitive interactions,” says Mostafaoui.

In a small study published this week in PLOS ONE, he and his colleagues found that humans do mirror the movements of the humanoid robots they interact with, and that the rhythmic coordination between human and robot resembles that between two humans.

“We had one or two publications with the proof of concept,” says Mostafaoui. “This is the first to prove that humans can unwittingly coordinate with a robot, if the robot moves in a manner similar to humans.”

Bop like the bot bops

The setup was simple. Fifteen human subjects were placed facing a humanoid robot called NAO, which was seated on a table in front of them. Each subject had to hold their right arm out in front of them and move it up and down; the robot did the same. The robot’s movement was controlled by an external computer running an algorithm that could either synchronize its arm with the human’s or move it at a fixed rate, like a metronome. The humans weren’t told in advance what the robot would do – they didn’t know whether it was moving on its own or reacting to the way they were moving.
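The paper attributes the reactive mode to a neural model developed by Mostafaoui; the sketch below is only a rough stand-in for that general idea, using a generic coupled-oscillator rule. Every name, rate and coupling value here is an assumption made for illustration, not a parameter from the study.

```python
import numpy as np

# Toy illustration of the robot's two control modes described above:
# "metronome" (fixed rhythm) and "reactive" (phase pulled toward the
# human's rhythm). A generic Kuramoto-style coupling stands in for the
# study's actual neural model.

DT = 0.02            # control timestep in seconds (assumed)
ROBOT_FREQ = 0.5     # robot's preferred arm frequency in Hz (assumed)
COUPLING = 1.5       # strength of adaptation to the human (assumed)

def robot_phase_step(robot_phase, human_phase=None):
    """Advance the robot's movement phase by one timestep.

    With human_phase=None the robot behaves like a metronome. In
    reactive mode, a sinusoidal coupling term nudges the robot's phase
    toward the human's, which is enough for the two rhythms to lock.
    """
    dphase = 2 * np.pi * ROBOT_FREQ
    if human_phase is not None:
        dphase += COUPLING * np.sin(human_phase - robot_phase)
    return robot_phase + dphase * DT

def arm_angle(phase, amplitude_deg=30.0):
    """Map an oscillator phase to an up-and-down arm angle in degrees."""
    return amplitude_deg * np.sin(phase)

# Toy run: a human moving at 0.6 Hz; the reactive robot locks onto it.
robot_phase, human_phase = 0.0, 0.0
for _ in range(int(30 / DT)):              # 30 simulated seconds
    human_phase += 2 * np.pi * 0.6 * DT    # the human's (assumed) tempo
    robot_phase = robot_phase_step(robot_phase, human_phase)
print("final phase difference (rad):",
      round((human_phase - robot_phase) % (2 * np.pi), 2))
```

Calling the same step function with human_phase=None would reproduce the metronome condition, where the robot ignores the person entirely.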

The researchers asked each subject to move their arm as they wished, regardless of what the robot was doing. They then asked the subject to try to synchronize with the robot. There was also a control condition in which subjects moved their arm freely while wearing a blindfold and headphones.

[Related: MIT scientists taught robots how to sabotage each other]

They observed that nearly all of the subjects eventually matched the robot’s rhythm. The lone outlier was a dancer, who consistently moved off the robot’s beat in a syncopated or staggered way. “We never managed to synchronize her with the robot, even when the robot was controlled by the neural model that I developed to synchronize it with humans,” says Mostafaoui. “We think she was intentionally avoiding the robot’s beat.”
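One standard way to quantify that kind of rhythm matching is to extract each movement trace’s instantaneous phase and check how stable the phase relation stays over time. The sketch below computes a Hilbert-transform phase-locking value on synthetic data; the study’s own analysis may well differ.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(human_signal, robot_signal):
    """Measure rhythmic coordination between two movement traces.

    Both inputs are 1-D arrays of arm positions sampled at the same
    rate. The Hilbert transform gives each signal's instantaneous
    phase; the result is 1.0 when the two rhythms keep a constant
    phase relation and drops toward 0 when they drift independently.
    """
    human_phase = np.angle(hilbert(human_signal - human_signal.mean()))
    robot_phase = np.angle(hilbert(robot_signal - robot_signal.mean()))
    return np.abs(np.mean(np.exp(1j * (human_phase - robot_phase))))

# Synthetic check: two noisy 0.5 Hz traces with a fixed offset lock tightly.
t = np.arange(0, 30, 0.02)
human = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)
robot = np.sin(2 * np.pi * 0.5 * t + 0.4) + 0.1 * np.random.randn(t.size)
print(round(phase_locking_value(human, robot), 2))  # close to 1.0
```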

Most participants noticed that they had synchronized with the robot, but they were unsure whether it was they or the robot who had adjusted the tempo. Measuring that intentionality with EEG or fMRI could be a next step toward mapping this pattern of behavior onto its neurobiology, Mostafaoui suggests.

There could be practical applications for this research. Mostafaoui was part of a team that used the NAO robot with patients with schizophrenia, with the idea of helping them recalibrate their movement coordination. Many of these patients do not take part in sports and may have social deficits that make it difficult for them to coordinate their movements, Mostafaoui says, further impairing their ability to carry out normal social interactions.

Recently, he was asked to study the robot’s effect on patients with catatonia, a condition in which people uncontrollably freeze mid-movement. What is special about this condition, Mostafaoui notes, is that if you move in front of these patients, they will unconsciously imitate you. He thinks a robot might be able to help here.

Should engineers teach robots simple social skills?

Unintentional coordination matters in human social contexts, perhaps because it is tied to attention and learning. And if it is important for human-human interaction, it is likely to be important for human-machine interaction as well.

[Related: Do we trust robots enough to put them in charge?]

So what does this mean for the design of future robots? Engineers have begun to move away from hard, mechanical robots toward softer ones that share characteristics of human or animal physiology, which could make their movements and gestures more natural. But to make their presence in our world feel natural, Mostafaoui believes these robotic systems also need to react to us. That means building new systems, robots and algorithms with natural motor and sensory reflexes – and they don’t need to be especially complex to be effective. In their experiment, it mattered little that the robot looked human; what mattered more was that it moved like one.

“If I try to predict everything you do, our interaction won’t be very natural,” he says. “We don’t need to control or predict the position of every joint.” What makes two-way interactions seamless are simple cues, like nodding when another person does to show that you’re listening.
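A minimal sketch of that idea, under stated assumptions: the agent below never models or predicts the other party’s joints. It watches a single coarse signal (head pitch) and answers a detected nod with a nod of its own after a short, human-like delay. The helpers `read_head_pitch` and `send_nod` are hypothetical placeholders for a sensor reading and a gesture command, not part of any real robot API.

```python
import random
import time

def detect_nod(pitch_trace, threshold_deg=10.0):
    """Report whether the head pitch dipped by more than threshold_deg."""
    if not pitch_trace:
        return False
    return (max(pitch_trace) - min(pitch_trace)) > threshold_deg

def reflex_loop(read_head_pitch, send_nod, window_s=0.5, poll_s=0.05):
    """Mirror the other party's nods without tracking their full pose.

    read_head_pitch() should return one head-pitch angle in degrees;
    send_nod() should trigger the robot's own nod gesture. Both are
    assumed, injected callables, not a real robot SDK.
    """
    trace = []
    while True:
        trace.append(read_head_pitch())          # one scalar, not a whole pose
        trace = trace[-int(window_s / poll_s):]  # keep a short sliding window
        if detect_nod(trace):
            time.sleep(random.uniform(0.2, 0.5)) # human-like response delay
            send_nod()                           # answer the gesture in kind
            trace = []
        time.sleep(poll_s)
```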