We've located the part of the brain which understands social interactions
This article by Jon Walbrin, PhD Researcher in Cognitive Neuroscience at the School of Psychology, was originally published on The Conversation. Read the original article.
The ability to quickly detect and recognise the purpose of a social interaction is as indispensable today as it would have been to our ancient ancestors – but how does the brain do it?
Figuring out the meaning of an interaction between other humans enables us to respond to, and think about, others accordingly. It means we can intervene at the point where a disagreement between friends disintegrates into a bitter argument. Or recognise a nuanced power struggle between two opposing politicians.
Our research team recently identified the part of the brain that plays a crucial role in this complex, yet effortless, process. What’s more, we’ve found that this region is sensitive not only to the presence of interactive behaviour, but also to the contents of interactions (for example, whether people are helping each other or not).
In our study, we used a brain imaging technique – functional magnetic resonance imaging (fMRI) – to measure the responses of our participants to brief video clips. These depicted two human figures either interacting or not interacting with each other.
Intriguingly, a single brain region – the right posterior superior temporal sulcus (pSTS) – was the only part of the brain that could differentiate between interactive and non-interactive videos. That is, the right pSTS was the only brain region that was reliably more activated by the interactive videos than by the non-interactive ones. This strongly implicates the right pSTS in the perception of interactive behaviour.
Measuring social interactions
This isn’t the first time that the pSTS has been linked with processing social information, however. The pSTS and the broader temporal lobe region are known to be sensitive to other categories of social visual information. This includes faces and bodies, as well as theory of mind processing – that is, actively thinking about what’s going on in another person’s mind. These types of social information are often at play during social encounters, so could it be that the interaction sensitivity in the pSTS is actually due to differences in face, body, or theory of mind processing instead?
To find out whether this was the case, we ran a second fMRI experiment, in which the videos that participants watched were very carefully controlled. These new videos contained no sources of social information other than actions, so any pSTS sensitivity could be attributed to the interactive information alone.
Removing faces and bodies from videos of human social interaction seems almost impossible – or at least inherently eerie. Instead, we designed a special set of animations that showed 2D shapes moving “purposefully” around a scene.
Previous research has shown that carefully controlling how shapes like these move can create a strong impression that the shapes are “alive” and moving in an intentional way – much like the meaningful actions that humans perform, such as opening a door or pulling a lever.
Competition and cooperation
Using these videos, we looked at two aspects of how the pSTS processes visual interactive information. First, we analysed whether it could tell two interacting shapes from two non-interacting shapes, much as in our first experiment. Then we looked at whether the pSTS could tell the difference between two types of interaction: competition, where the shapes worked against each other (for example, one tried to open a door while the other tried to close it), and cooperation, where the shapes worked together (for example, both tried to open the door).
As we predicted, the right pSTS was able to reliably discriminate between interaction and non-interaction videos, a finding that complements our first experiment. Similarly, the pSTS could also reliably discriminate between competitive and cooperative interactions. Together, these findings demonstrate that the pSTS is centrally involved in processing visual social interaction information.
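For readers curious what “reliably discriminate” can mean in practice, here is a minimal, hypothetical sketch of a cross-validated pattern-decoding analysis on simulated data, written in Python with scikit-learn. This is not the pipeline used in our study; the classifier, trial counts and region size below are all assumptions made purely for illustration.

```python
# Illustrative only: simulated voxel patterns for two conditions, decoded with
# a cross-validated linear classifier. Every choice here (classifier, clip
# counts, region size) is an assumption made for the sake of the example.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_clips = 40     # video clips per condition (assumed)
n_voxels = 100   # voxels in the region of interest (assumed)

# Simulated response patterns: the "interaction" condition carries a small
# extra signal, so the two conditions are separable but noisy.
interaction = rng.normal(0.3, 1.0, size=(n_clips, n_voxels))
non_interaction = rng.normal(0.0, 1.0, size=(n_clips, n_voxels))

X = np.vstack([interaction, non_interaction])
y = np.array([1] * n_clips + [0] * n_clips)

# Five-fold cross-validated accuracy: accuracy that is consistently above
# chance is what "reliably discriminating between conditions" looks like here.
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

The same logic extends to any pair of conditions – for example, competitive versus cooperative clips – by swapping in the relevant labels.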
However, it seems unlikely that one small chunk of brain could be entirely responsible for such a complex and dynamic process. So we also compared responses in a neighbouring brain region – the temporo-parietal junction – which is associated with theory of mind processing and so may also contribute to interaction perception. We found that, like the pSTS, the junction could reliably distinguish interactions from non-interactions, and competition from cooperation, although to a weaker extent.
Taken together, these findings show the crucial role that the pSTS – and, to some extent, the temporo-parietal junction – plays in perceiving visual social interactions. However, our results also open up many interesting questions for future research. We don’t yet know which type of interactive information is most important for detecting whether two people are interacting. Nor do we know how pSTS interaction responses differ in people who show atypical social perception, such as those with autism spectrum disorders.
Publication date: 20 March 2018