
People Read Facial Expressions Differently

Posted on July 31st, 2018 by Dr. Francis Collins
Credit: Lydia Polimeni, NIH
What do you see in the faces above? We constantly make assumptions about what others are feeling based on their facial expressions, such as smiling or frowning. Many have even suggested that human facial expressions represent a universal language. But an NIH-funded research team recently uncovered evidence that different people may read common facial expressions in surprisingly different ways.
In a study published in Nature Human Behaviour, the researchers found that each individual’s past experience, beliefs, and conceptual knowledge of emotions color how he or she interprets facial expressions [1]. These findings are not only fascinating, but might also lead to new ways to help people who sometimes struggle to read social cues, including those with anxiety, depression, bipolar disorder, schizophrenia, or autism spectrum disorder.
Jon Freeman’s Social Cognitive & Neural Sciences lab at New York University, New York, studies how people make split-second judgments about others based on their appearance, including facial expressions. In the new study, Freeman and doctoral student Jeffrey Brooks explored how a person’s internal grasp of six core emotions—anger, disgust, happiness, fear, sadness, and surprise—can color their perception of quintessential facial expressions.
Studying cognition can be tricky, so Brooks and colleagues had to design their study with creativity and care. They started by asking a large group of people to rate the conceptual similarity between all 15 possible pairs of those six core emotions (for example, anger and disgust, sadness and happiness). They also asked people to rate each of the six emotions on how well they correspond to 40 feelings, thoughts, and actions, such as “crying” or “clenching fists.” They pulled together this information to index each participant’s view of the similarities among the six emotions.
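The post doesn’t say how those ratings were combined into a single index. As a rough illustration only, here is a minimal Python sketch that treats each emotion as one participant’s vector of ratings over the 40 items and scores each of the 15 emotion pairs by Pearson correlation; the function names, the simulated data, and the choice of correlation are assumptions for illustration, not the authors’ actual method.

```python
# A minimal sketch of one way to index conceptual similarity between emotions.
# Assumption: each emotion is a vector of one participant's ratings over the
# 40 feeling/thought/action items, and pairwise similarity is the Pearson
# correlation of those vectors. The study's actual scoring may differ.
from itertools import combinations

import numpy as np

EMOTIONS = ["anger", "disgust", "happiness", "fear", "sadness", "surprise"]

def similarity_index(ratings: dict[str, np.ndarray]) -> dict[tuple[str, str], float]:
    """Return a similarity score for each of the 15 emotion pairs (6 choose 2)."""
    return {
        (a, b): float(np.corrcoef(ratings[a], ratings[b])[0, 1])
        for a, b in combinations(EMOTIONS, 2)
    }

# Example with simulated data: one participant's 1-to-7 ratings of each
# emotion on the 40 items.
rng = np.random.default_rng(0)
ratings = {e: rng.uniform(1, 7, size=40) for e in EMOTIONS}
print(similarity_index(ratings)[("anger", "disgust")])
```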
Next, they dove deeper to explore how those conceptual understandings of emotions influenced the way those individuals read facial expressions. That required some innovation: the researchers used mouse-tracking technology to record the subtle, often unconscious hand movements participants made while deciding which of two emotions listed on a screen best described a stereotypical facial expression.
Why record subtle hand movements? Participants’ momentary indecision provided a window into their cognitive state. When participants had to choose between two emotion labels that they viewed as similar, such as anger and disgust, they would sometimes move the mouse toward “anger” before selecting “disgust” (or vice versa). When the two emotions were viewed as unrelated, such as sadness and happiness, participants moved the mouse from the expressive face straight to the corresponding emotion.
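In published mouse-tracking work, that pull toward the unselected option is commonly summarized with a measure such as maximum deviation: the largest perpendicular distance of the cursor’s path from an idealized straight line between its start point and the final click. The post doesn’t detail this study’s exact analysis, so the sketch below shows only that conventional measure, with made-up trajectories.

```python
# A hedged sketch of a standard mouse-tracking measure, maximum deviation (MD):
# the largest perpendicular distance of the cursor path from a straight line
# between start and endpoint. Larger MD suggests stronger attraction toward
# the competing response option.
import numpy as np

def max_deviation(path: np.ndarray) -> float:
    """path: (n, 2) array of cursor x/y samples from start to final click."""
    start, end = path[0], path[-1]
    line = end - start
    d = path - start
    # Perpendicular distance = |2-D cross product| / length of the ideal line.
    dists = np.abs(d[:, 0] * line[1] - d[:, 1] * line[0]) / np.linalg.norm(line)
    return float(dists.max())

# A straight movement scores ~0; a path that arcs toward the other
# on-screen option scores higher.
straight = np.array([[0, 0], [1, 1], [2, 2]], dtype=float)
curved = np.array([[0, 0], [2, 0.5], [2, 2]], dtype=float)
print(max_deviation(straight), max_deviation(curved))  # 0.0, ~1.06
```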
These initial studies showed the researchers that conceptual and perceptual similarities track together. That is, a person who rated anger and disgust as highly similar emotions conceptually also saw similarities in faces representing those two emotions. For those who didn’t, the connection literally didn’t enter the picture.
Next, the researchers probed a little further. They asked people to choose which of two faces reflected a particular emotion. What the participants didn’t know was that the two faces were actually the same neutral face, each with a few random photographic tweaks to suggest differences.
After a participant had clicked through many of these face pairs, the researchers averaged his or her selections for each of the six emotions. Each averaged face offered a window into that individual’s view of what the emotion looks like.
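This averaging step works like a reverse-correlation analysis: if each face is treated as an array of pixel values, a participant’s internal “template” for an emotion can be approximated by simply averaging all of the noisy faces he or she picked for it. The sketch below illustrates the idea with simulated images; the image size, noise model, and trial count are assumptions for illustration, not the study’s actual stimuli.

```python
# A minimal sketch of the averaging step, in the spirit of reverse correlation:
# the per-emotion "classification image" is the pixelwise mean of every face
# the participant selected for that emotion. Shapes and noise are made up.
import numpy as np

def classification_image(selected_faces: list[np.ndarray]) -> np.ndarray:
    """Average the noisy face images a participant picked for one emotion."""
    return np.mean(np.stack(selected_faces), axis=0)

# Simulated trials: a stand-in 64x64 "neutral face" plus random pixel noise;
# the faces chosen as looking angry are averaged into one template.
rng = np.random.default_rng(1)
base_face = rng.uniform(0, 1, size=(64, 64))
anger_picks = [base_face + rng.normal(0, 0.1, (64, 64)) for _ in range(200)]
anger_template = classification_image(anger_picks)
print(anger_template.shape)  # (64, 64)
```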
The averages yielded what Brooks and colleagues had predicted. For study participants who indicated greater conceptual overlap between two emotions, such as anger and sadness, the averaged image representing anger looked sadder. Similarly, their image of sadness looked angrier.
The findings suggest it might be worth conducting studies to explore whether people, including those with autism or other disorders, might learn to read faces more instinctively by spending more time learning and thinking about the commonly held meanings behind those emotions and their relationships to one another. Freeman and Brooks suggest the findings also may have implications for developing facial recognition technologies designed to detect emotions.
Freeman’s lab is now conducting complementary studies to explore what happens in the brain as people make such judgments about others’ emotions. As we await those results, it’s worth keeping in mind that your answer to the question, “What do you see in the faces above?” is not just about those faces. It’s also about you.
References:
[1] Conceptual knowledge predicts the representational structure of facial emotion perception. Brooks JA, Freeman JB. Nature Human Behaviour. 23 July 2018.
Links:
Autism Spectrum Disorder (National Institute of Mental Health/NIH)
Depression (NIMH)
Freeman Lab (New York University)
NIH Support: National Institute of Mental Health
