In order to understand how a gait looks, though, the robot needs objective data, not subjective judgments. So the researchers used algorithms that analyzed videos of the people walking, overlaying each person’s image with a skeleton of 16 joints, including at the neck, shoulders, and knees. Then they used deep learning to teach the system to associate particular skeletal gaits with the emotions that the human volunteers had ascribed to those walkers.
They ended up with the new ProxEmo algorithm, which they loaded into a cute little yellow four-wheeled robot (the Jackal, from the robotics company Clearpath) with a camera mounted on it. As the robot rolls along, the camera watches passing pedestrians while ProxEmo overlays each person with that 16-joint skeleton: the objective measurement of gait that the algorithm has learned to associate with certain emotions. That’s how ProxEmo can take a guess at your emotional state and direct the robot to respect your personal bubble, which may grow larger if the system thinks you’re angry, and shrink if it thinks you’re happy.
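As a rough illustration of that last idea, here is a minimal sketch (not the authors’ code; the base radius and per-emotion scale factors are invented for illustration) of mapping a predicted emotion to a comfort distance the robot’s planner should respect:

```python
# Hypothetical base radius (meters) and per-emotion scaling factors.
BASE_RADIUS_M = 1.0
EMOTION_SCALE = {
    "angry": 1.5,    # give an angry pedestrian a wider berth
    "sad": 1.2,
    "neutral": 1.0,
    "happy": 0.8,    # a happy pedestrian gets a smaller bubble
}

def personal_space_radius(emotion: str) -> float:
    """Return the minimum distance (meters) the robot should keep
    from a pedestrian, given the emotion predicted from their gait."""
    return BASE_RADIUS_M * EMOTION_SCALE.get(emotion, 1.0)
```

A planner could then treat this radius as a no-go zone around each tracked pedestrian, falling back to the neutral distance when the classifier is unsure.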
But this bubble isn’t a perfectly rounded sphere—it’s oblong, a kind of ellipsoid. “So there’s more space ahead of you, there’s some space at the sides, and there’s less personal space behind you,” says Bera. When the robot is approaching a human head-on, it needs to give a lot of space up front, but less so as it passes on the side or when it resumes its original course once it’s behind the human.
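That asymmetric bubble can be sketched as a simple geometric test, assuming (hypothetically) fixed axis lengths with the longest extent ahead of the pedestrian and the shortest behind:

```python
import math

# Hypothetical semi-axis lengths in meters: most space in front of the
# pedestrian, some at the sides, least behind.
FRONT_M, SIDE_M, BACK_M = 1.5, 0.8, 0.5

def inside_bubble(dx: float, dy: float, heading_rad: float) -> bool:
    """True if a robot at offset (dx, dy) from the pedestrian intrudes
    on the asymmetric personal-space ellipse. The pedestrian faces
    along heading_rad (in world coordinates)."""
    # Rotate the offset into the pedestrian's frame: x forward, y left.
    fx = dx * math.cos(-heading_rad) - dy * math.sin(-heading_rad)
    fy = dx * math.sin(-heading_rad) + dy * math.cos(-heading_rad)
    a = FRONT_M if fx >= 0 else BACK_M  # longer semi-axis in front
    return (fx / a) ** 2 + (fy / SIDE_M) ** 2 < 1.0
```

With this shape, a point one meter directly ahead of the pedestrian intrudes on the bubble, while the same point directly behind does not, matching the front-heavy personal space Bera describes.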
The tricky bit, though, is that the robot trained on clean data extracted from people walking in a lab environment—that is, their limbs were all visible at all times. But that isn’t always the case in the real world. So to prepare the robot for a chaotic world, the researchers had to incorporate a little bit of chaos in the form of “noise.” In this context that doesn’t mean sound, but variables like clothing, or a carried object that might hide a hand from the camera. “Adding noise won’t change the emotion, but we’ve added different kinds of noises, and then trained the system to understand even if one hand is missing, you’ve still been looking at that person for a while,” says Bera.
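This kind of augmentation can be sketched very simply; the sketch below (an assumption about how such noise might be injected, not the paper’s actual method, with a made-up drop probability) zeroes out random joints of a skeleton while leaving its emotion label untouched:

```python
import random

NUM_JOINTS = 16  # the 16-joint skeleton described above

def occlude_joints(skeleton, drop_prob=0.1, rng=random):
    """Return a copy of a gait skeleton (a list of (x, y) joint
    positions) with some joints zeroed out, simulating occlusion by
    clothing or a carried object. The emotion label stays the same."""
    return [(0.0, 0.0) if rng.random() < drop_prob else joint
            for joint in skeleton]
```

Training on both the clean and the occluded versions of each gait sequence would push the classifier to rely on the joints it can still see.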
One day, ProxEmo might be paired with another system that reads facial cues, building an even more complex emotional intelligence and making the robot better able to judge whether to offer assistance or give the person a wide berth. Robots will have to get much, much better at picking up such nuances if we want them to walk and roll among us without hurting people or themselves.
“When robots behave in an unnatural way, not only can people become uncomfortable, but also it can lead potentially to collisions, because the person cannot guess how this robot is moving,” says MIT computer scientist Dina Katabi, who wasn’t involved in this work. “So if a robot, for example, can avoid making people uncomfortable because it’s getting too close, that would be beneficial.”
No sense, after all, in turning a fancy robot into a very expensive punching bag.