Robots Are Getting More Social to Better Understand You

Key Takeaways

  • A new MIT study shows how robots can socially interact with each other and understand the differences between those interactions.
  • Eventually, MIT researchers hope the model will work on robot and human interactions.
  • Researchers say quantifying social interactions won’t just help robotics, but also the automotive industry, healthcare, and more.

When we think of robots, we think of cold machines without much understanding of human nature, but that could soon change.

A new study published by a group of researchers from the Massachusetts Institute of Technology looks at how robots can become more social and how we define social interactions as a whole. The study’s findings point toward a future where robots are more helpful and better understand humans, which will prove crucial as robots play a larger role in our daily lives.

“Robots will increasingly become part of our lives, and although they are robots, they need to understand our language,” Boris Katz, principal research scientist and head of the InfoLab Group in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and a member of the Center for Brains, Minds, and Machines (CBMM), told Lifewire in a video call.

“But more importantly, they will also need to understand the way humans interact with each other.”

What the Study Found 

Titled “Social Interactions as Recursive MDPs,” the study evolved from the authors’ interests in quantifying social interactions.

Andrei Barbu, a research scientist at CSAIL and CBMM and co-author of the study, told Lifewire that almost no datasets and models look at social interactions within computer science.

“The categories for social interactions are unknown; the degree to which a social interaction is happening or not happening is unknown,” he said during a video call. “And so we really thought this is the kind of problem that may be amenable to sort of more modern machine learning.”


The researchers established three different types of robots with different physical and social goals and had them interact with each other. Barbu said a level zero robot had only a physical goal in mind; a level one robot had physical and social goals to help other robots but assumed that all other robots only had physical goals. Finally, a level two robot assumed all robots had both social and physical goals.
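The level hierarchy Barbu describes can be sketched as a simple recursion: a level-zero agent values only its own physical goal, while higher-level agents fold in an estimate of the other agents' rewards, modeled one level lower. The sketch below is a hypothetical illustration of that idea, not the study's actual model; the function names, reward values, and the `social_weight` parameter are all invented for this example.

```python
# Minimal sketch of leveled social reasoning, loosely inspired by the
# study's level-0/1/2 robots. Agents, rewards, and weights are
# illustrative, not taken from the paper.

def reward(agent, others, level, social_weight=0.5):
    """Reward = the agent's own physical goal, plus (for level >= 1) a
    weighted estimate of the other agents' rewards, modeled one level lower."""
    r = agent["physical_reward"]
    if level == 0:
        return r  # level 0: physical goal only, ignores everyone else
    for other in others:
        # Level 1 assumes others are level 0 (physical goals only);
        # level 2 assumes others also reason socially (level 1).
        r += social_weight * reward(other, [agent], level - 1)
    return r

a = {"physical_reward": 1.0}
b = {"physical_reward": 2.0}

print(reward(a, [b], level=0))  # 1.0: a ignores b entirely
print(reward(a, [b], level=1))  # 1.0 + 0.5 * 2.0 = 2.0
print(reward(a, [b], level=2))  # 1.0 + 0.5 * (2.0 + 0.5 * 1.0) = 2.25
```

The recursion bottoms out at level zero, which is what makes the hierarchy well-defined: each agent reasons about others using a strictly simpler model of them than it uses for itself.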

The model was tested by placing robots in a simple environment to interact with each other based on their levels. Then, human test subjects were shown video clips of these robot interactions to determine their physical and social goals.

The results showed that, in most instances, the study’s model agreed with humans on whether, and which, social interactions were occurring in the different clips. This means the technology to spot social interactions is getting better and could be applied to robots and all sorts of other applications.

A High-Tech Future That’s More Social 

Barbu said they would expand this research to test not only robot-to-robot social interactions but also how robots can interact with humans on a social level—something that is drastically needed in robotics.

“One part of the future is robots that are more understanding of us,” he said. “Right now, for the most part, robots are not particularly friendly. They’re not particularly safe in many cases to be around, and that’s because they can easily do something dangerous or unpredictable to us. So having a robot that can actually help you do something is very important.”

Think of it as actually having a conversation with Alexa or Siri and having these assistants accurately help you instead of constantly misunderstanding you. The study’s authors have also published a follow-up research paper that extends the framework to richer social interactions between robots, such as cooperation, conflict, coercion, competition, and exchange.

And while a world where robots can understand us better will be helpful, Barbu said there are lots of places where social skills for machines will play a role.

“For example, we do work with the Toyota Research Institute, and autonomous cars actually need to have a certain amount of social skills when you get to some intersection,” Barbu explained. “In that scenario, it’s not just about who has the [right-of-way]—it’s often about the social interaction between the two cars.”

However, Barbu said that even more importantly, the ability to quantify social interactions with this model would open the doors to help monitor social interactions for diseases and disorders such as autism, depression, Alzheimer’s, and more.

“This kind of thing really matters in cognitive science because social interactions are understudied—they’re kind of a big black box,” he said. “And being able to quantify them makes a huge difference.”

[via: Lifewire]

[Photo: Alex Knight/Unsplash]

About Allison Murray