How to avoid death row...

Whether jurors decide to hand down a life sentence or the death penalty depends in part on their perception of the defendant’s appearance. That’s the finding from interviews with 80 jurors during their involvement in real-life American murder cases.

Even after taking into account the nature of the murder, defendants who were perceived by jurors to be sorry and sincere were more likely to be sentenced to life imprisonment than to be sentenced to death. On the other hand, defendants who appeared bored or who looked frightening were more likely to be given the death penalty. That’s despite jurors being instructed to make their decision based only on the legal facts of the case.

Appropriately, the jurors’ choice of punishment was also related to the nature of the crime. They were more likely to opt for the death penalty if the victim was made to suffer before being killed, or if they were maimed or mutilated after death. Murders that weren’t premeditated were more likely to be punished by life imprisonment.

“Finding that trial outcomes are not solely the result of legal facts and evidence brought out during the trial, but are attributable to extra-legal factors, including the defendant’s appearance, may be disturbing to many who believe in the integrity of our criminal justice system…” said Michael Antonio, author of the study.

Antonio, M.E. (2006). Arbitrariness and the death penalty: How the defendant’s appearance during trial influences capital jurors’ punishment decision. Behavioral Sciences & the Law, 24, 215-234.
You have read this article Forensic with the title April 2006. You can bookmark this page URL Thanks!

...And how fingerprint experts are biased by context

Itiel Dror and colleagues at Southampton University recruited five fingerprint experts with 85 years of experience between them, and asked them to analyse a pair of fingerprints that, unbeknown to them, they had previously declared as matching in a real-life criminal case five years earlier.

Crucially, the researchers misled the experts, telling them that the pair of prints – including one from the scene of the crime, and one from a suspect – were the same pair that had led to the wrongful arrest of an innocent Muslim as the Madrid bomber. In this context, and even though they were allowed to use their usual lab facilities, only one of the experts now declared the two prints as a match. Three said the prints didn’t match, and one said a definite decision couldn’t be made.

“This study shows that fingerprint identification decisions of experts are vulnerable to irrelevant and misleading contextual influences”, the researchers said. “Further research should use different and more subtle manipulations to examine in greater depth when such factors affect performance and render the experts vulnerable to misjudgements”.

Dror, I.E., Charlton, D. & Peron, A.E. (2006). Contextual information renders experts vulnerable to making erroneous identifications. Forensic Science International, 156, 74-78.

Link to lead researcher Dr. Dror talking on BBC's Newsnight programme.

Why eye movement therapy works

It involves recalling your horrific experience while following your therapist’s moving finger with your eyes. That may sound a bit wacky, but as a treatment for post-traumatic stress, eye movement desensitisation and reprocessing therapy (EMDR) is endorsed by National Institute for Clinical Excellence guidelines. However, the treatment continues to attract controversy, not least because it’s unclear how it works. But now Christopher Lee and colleagues report that EMDR’s critical ingredient is that it allows traumatised people to relive their trauma ‘at a distance’, as a detached observer.

Lee’s team followed 44 traumatised patients – some were car crash survivors, others had been sexually assaulted – through their first session of EMDR. Those patients whose statements during therapy suggested they were recalling their trauma at a distance (e.g. “The faces seem all blurred”; “It doesn’t seem so real”) showed the most improvement in their symptoms a week later. By contrast, there was no association between the number of statements made by patients that related to reliving the trauma first hand (e.g. “I am in the ambulance”; “I see her crawling away from me”) and their improvement a week later.

The researchers said this undermines the notion that EMDR works like traditional exposure therapy, in which patients are encouraged to relive their trauma first hand. “A distancing process…was associated with more improvement than when participants relived the trauma experiences,” they said.

Although critics of EMDR have doubted the importance of the eye movement aspect of the therapy, Lee’s team concluded “The distancing may be partly facilitated by the distraction of the eye movement task…[or] facilitated by the therapist encouraging a dual focus of attention, that is, simultaneously being aware of the trauma material and of being in the therapist’s office”.
Lee, C.W., Taylor, G. & Drummond, P.D. (2006). The active ingredient in EMDR: Is it traditional exposure or dual focus of attention? Clinical Psychology and Psychotherapy, 13, 97-107.

Articles in The Psychologist magazine for and against EMDR.

Stop looking at my feet!

There’s something about feet that allows us to detect the movement of other people or animals even in the most difficult viewing conditions.

That’s the implication of a study by Nikolaus Troje and Cord Westhoff, who were investigating the remarkable human sensitivity to biological movement. Previous research has shown that in the pitch dark, all it takes for us to recognise the shape of a walking human is for someone to wear a spot of light on each of the main joints of their body.

However, if the image is turned upside down, we’re useless at recognising human movement in this way, leading researchers to suggest it’s the signature configuration of the joints (disrupted when inverted) that allows us to recognise biological motion.

However, Troje and Westhoff’s experiments suggest this can’t be the sole explanation. They presented participants with a display that looked like a person, cat or pigeon walking in the dark, with a light on each of its main joints (see here), but they distorted the positioning of the lights, thus removing the configural information previously thought to be so vital. Crucially, even with the configural information removed, the participants were still better at recognising which direction the person or animal was walking when they viewed the image the right way up compared with upside down. That is, inverting the images must have disrupted some other source of information the participants were using besides the spatial arrangement of the joints.

So they tried inverting some parts of the image but not others, and found the secret lay in the feet. Inverting the feet alone disrupted performance. Conversely, even with the positioning of the lights distorted and the rest of the body upside down, so long as the joints of the feet were shown moving the right way up, the participants were able to tell which direction the person or animal was walking.

The researchers said the movement of feet may serve as a kind of ‘life detector’, providing “…a reliable cue for the presence and the location of an animal in the visual environment”. They added: “The observation that it is relatively easy to get close to wild animals in a car, a canoe, or similar vehicle might be due to the absence of the typical movement of feet”.
Troje, N.F. & Westhoff, C. (2006). The inversion effect in biological motion perception: Evidence for a ‘life detector’? Current Biology, 16, 821-824.

Link to host lab.
Link to interactive demo of the stimuli.

Goth subculture linked with history of suicide and self harm

A Scottish study that collected information from 1,258 teenagers when they were aged 11, 13, 15 and 19 has found particularly high rates of attempted suicide and self harm (cutting, scratching, or scoring) among those who said they identified with the Goth subculture.

Of the 15 teenagers who described themselves as heavily into Goth culture at age 19, 53 per cent said they’d self harmed at some stage in their lives, and 47 per cent said they had tried to kill themselves. By contrast, of the 1,165 teenagers who said they didn’t identify at all with the Goth culture, only 6 per cent reported they had previously self harmed, and just 5 per cent reported ever having tried to kill themselves.

Other factors associated with self harm and suicide were being female, having divorced or separated parents, smoking and drug taking, and prior depression. But even after controlling for these factors, identification with the Goth subculture remained the strongest predictor of self harm and suicide.

Among the other 14 common youth subcultures that the teenagers were asked about, several others, including Punk and Mosher, were also associated with an unusually high prevalence of self harm and suicide, although to a lesser extent than Goth.

Whether the Goth culture plays a causal role in people’s self harm and/or suicide, or whether people who self harm are instead drawn to the Goth culture, remains unclear. Of the 25 participants who said they had at some point in their lifetime identified with Goth culture, five had harmed themselves before identifying with Goth, two afterwards and four at about the same time.

Robert Young, lead researcher on the study, said: “Since our study found that more reported self-harm before, rather than after, becoming a Goth, this suggests that young people with a tendency to self harm are attracted to the Goth subculture. Rather than posing a risk, it's also possible that by belonging to this subculture young people are gaining valuable social and emotional support from their peers.”
Young, R., Sweeting, H. & West, P. (2006). Prevalence of deliberate self harm and attempted suicide within contemporary Goth youth subculture: longitudinal cohort study. BMJ, Online First. DOI:10.1136/bmj.38790.495544.7C.

Link to responses to this research on the BMJ site.
Link to research centre.
Link to blog article on how the media report on Goths.

Localising 'Oops!' in the brain

Researchers have identified the part of the brain that is activated when we make a costly mistake, and they think the same region may be implicated in conditions like obsessive compulsive disorder (OCD) that are associated with disproportionate anxiety in everyday life.

Stephan Taylor and colleagues at the University of Michigan scanned the brains of 12 healthy participants while they performed a task that required them to press a particular button as fast as possible when they saw certain letters embedded among a string of distracters (e.g. the letter ‘S’ embedded like this: ‘HHHSHHH’). Participants started out ten dollars in credit, and if they didn’t react fast enough, or they pressed the wrong button in response, they either missed out on a cash reward or incurred a cash penalty.

The researchers found that a part of the brain called the rostral anterior cingulate cortex (rACC), a region of the frontal lobe associated with emotions, was activated far more when participants incurred a cash penalty than when they just missed a reward.

“In general, the response to a mistake that cost them money was greater than the response to other mistakes, and the involvement of the rACC suggests the importance of emotions in decision and performance-monitoring processes” said lead researcher Stephan Taylor.

An earlier study with OCD sufferers found mistakes triggered activity in this brain region even when no penalty was incurred. “It appears to us so far that OCD patients may have a hyperactive response to making errors, with increased worry and concern about having done something wrong” Taylor said. His team now hope to test OCD sufferers on this task, and to study the impact of cognitive behavioural therapy on their response to errors.
Taylor, S.F., Martis, B., Fitzgerald, K.D., Welsh, R.C., Abelson, J.L., Liberzon, I., Himle, J.A. & Gehring, W.J. (2006). Medial frontal cortex activity and loss-related responses to errors. The Journal of Neuroscience, 26, 4063-4070.

Does the way mothers think about their difficult children matter?

Early findings have shown that the mothers of badly behaved young children think about their child’s behaviour in a characteristic way, tending to believe that their bad behaviour is intentional and has to do with the nature of the child rather than the child’s circumstances. This has led some to propose that the way such mothers think about their children’s behaviour may actually be contributing to the children’s conduct problems.

To test this idea Charlotte Wilson and colleagues recruited 80 mothers whose three-year-old children had been identified by community nurses as being particularly naughty and oppositional.

Consistent with past research, the researchers found that the mothers of the worst-behaved children (in terms of temper tantrums and disobedience) had more negative thoughts about their child’s behaviour – they tended to think their children behaved badly on purpose and would do so regardless of the circumstances.

But crucially, when the researchers followed the mothers up a year later, they found it was the children’s behaviour that seemed to be affecting their mothers’ thoughts, rather than the other way around. That is, mothers with more difficult children at the first testing point were more likely to have developed negative thoughts about their child’s behaviour a year later. In contrast, children at age three whose mothers had more negative thoughts tended not to have become more poorly behaved a year later.

The researchers said “…it is becoming clearer that early hard-to-manage behaviour in children has an impact on maternal thoughts and beliefs. In contrast, this study throws further doubt on the hypothesis that parental attributions have a direct effect on children’s conduct problems”.
Wilson, C., Gardner, F., Burton, J. & Leung, S. (2006). Maternal attributions and young children’s conduct problems: A longitudinal study. Infant and Child Development, 15, 109-121.

Left overs

Studies that didn't make the final cut this fortnight:

Evidence-based guidelines have little influence on the clinical practice of psychotherapists and clinical psychologists.

Straight men were more likely to accept an unfair cash offer in a game of Ultimatum after viewing pictures of sexy women or lingerie, especially if they had high testosterone (as determined by the ratio of their second and fourth fingers).

A brain scanning study shows how learning is consolidated in the brain after training, even when we're busy with an unrelated task.

The dark side of the American dream?

"From the Archives", first published in the Digest 24.11.03.

Do you dream of untold wealth, fast cars and plasma screen TVs? Did you know that research suggests people who strive for financial success tend to be less satisfied with their lives?

Nobel prize-winner Daniel Kahneman and colleagues wondered whether this negative effect would disappear if people's income were taken into account. Like any other dream, might the goal of financial success be harmful only for those who fail? Humanist psychologists would disagree, arguing that seeking happiness through wealth is doomed to failure.

Kahneman and his team had access to the financial aspirations of 12,894 American students when they began university in 1976, together with information on their financial status and levels of life-satisfaction collected between 1995 and 1997.

Contrary to the predictions of humanists, Kahneman found that, overall, the richer people were, the higher their life-satisfaction. And although, overall, dreams of wealth at university predicted subsequent reduced life-satisfaction, this relationship disappeared with financial success. Furthermore, the enhanced life-satisfaction that came from financial success was unaffected by whether or not individuals had dreamt of wealth when they were younger.

The message, it seems, is that striving for wealth and failing will make you miserable. Financial success, meanwhile, is likely to make you happier whether you dreamt of it or not.

Nickerson, C., Schwarz, N., Diener, E. & Kahneman, D. (2003). Zeroing in on the dark side of the American Dream: A closer look at the negative consequences of the goal for financial success. Psychological Science, 14, 531-536.

Magazine reports on eating disorders are superficial and misleading

The propagation of the ‘thin ideal’ by glossy magazines is held by many to be partly responsible for the prevalence of eating disorders among women in western society. Ironically, for many people, those same magazines have become their main source of information about eating disorders. Now Rebecca Inch and Noorfarah Merali report that coverage of eating disorders by these magazines is inappropriate, with overemphasis on weight loss strategies, thinness, and a widespread failure to accurately convey the grave health consequences associated with eating disorders.

Inch and Merali analysed eating disorder coverage from 1998 to 2003 in ten magazines: Cosmopolitan, Glamour, Mademoiselle, Self, Seventeen, Vogue, Young and Modern, People Weekly and Teen People.

Although research suggests that bulimia (binge eating followed by purging) is up to three times more common than anorexia (deliberate starvation), Inch and Merali found that 75 per cent of the 42 articles they identified were features on anorexia.

They also found 97 per cent of the articles mentioned at least one disordered eating behaviour, with many highlighting common weight loss strategies such as the consumption of non-nutritive substances, and yet scarcely more than half mentioned the fact that eating disorders are potentially fatal (in fact according to a 2000 study, eating disorders have the highest mortality rate of all psychiatric disorders).

Furthermore, whereas most articles mentioned the exact menu used by eating disorder sufferers when they were ill, fewer than 15 per cent gave a similar description of what sufferers ate after they had recovered. Similarly, a sufferer’s weight when they were ill was mentioned more often than their healthy weight.

The researchers made a number of suggestions to improve magazine coverage of eating disorders, including a call for more pieces on bulimia, a greater emphasis on the serious health consequences of eating disorders, and on healthy eating plans. “These recommendations could be presented to the magazine industry through the dissemination of this study to magazine editors in lay language”, they said.

Inch, R. & Merali, N. (2006). A content analysis of popular magazine articles on eating disorders. Eating Disorders, 14, 109-120.

Link to the Eating Disorders Association for help and advice.

Who will behave violently in the next two years?

Psychologists and psychiatrists often get the blame on the rare occasions that a former mental health patient goes on to commit a violent act (see note below). They are expected to be able to predict which patients are a risk. But a new American study has found that a sample of 67 psychologists, psychiatrists, nurses and social workers were unable to use archived hospital admission evaluations and clinical notes to predict which of 52 patients went on to behave violently over the following two years. The finding comes from an investigation by Michael Odeh and colleagues into the cues used by mental health professionals to predict whether a patient is likely to be violent.

The 67 clinicians were asked to use the patient reports to predict how likely it was that each patient would go on to be violent in the two years following their hospital admission. Afterwards, the clinicians were asked what information they had used to reach that judgment.

The thirteen most commonly used cues for predicting dangerousness were: past assaults, non-compliance with medication, history of substance abuse, presence of psychosis, violent thoughts, previous admission to a psychiatric hospital, paranoid delusions, a diagnosis of mental illness, uncooperativeness, a history of poor impulse control, prior use of a weapon, hostility, and family problems. However, these cues did not accurately predict which patients went on to behave violently in the next two years.

The research also revealed professional differences. For example, nurses and social workers cited ‘hostility’ three times as frequently as psychologists and psychiatrists. They were also twice as likely to cite delusions, medication compliance and family problems as relevant cues predicting dangerousness – perhaps, the authors suggested, because of their more direct involvement in patient care.

While noting that the clinicians in this study never actually had the opportunity to meet the patients they were assessing, the researchers concluded that “the findings in this study may be further justification for a more structured approach to help clinicians evaluate risk factors when making clinical predictions”.

An earlier study by the same researchers suggested that although individual clinicians’ predictions of violence were inaccurate, accuracy was achieved by aggregating the judgments of multiple clinicians.

Odeh, M.S., Zeiss, R.A. & Huss, M.T. (2006). Cues they use: Clinicians’ endorsement of risk cues in predictions of dangerousness. Behavioral Sciences & the Law, 24, 147-156.

Note: 95 per cent of murders are not committed by psychiatric patients, and most psychiatric patients are not dangerous.

Clever children's brains develop differently

An investigation into the link between intelligence and brain development, rare in its use of a longitudinal methodology and large sample size, has found superior intelligence is associated with particularly dynamic developmental changes to the cortex – rapid cortical thickening during childhood, followed by a period of marked pruning during adolescence.

“‘Brainy children’ are not cleverer solely by virtue of having more or less grey matter at any one age," the researchers said. "Rather, IQ is related to the dynamic properties of cortical maturation”.

Philip Shaw and colleagues at the National Institute of Mental Health in America divided 307 participants aged between 3 and 25 years into three groups – average, high and superior intelligence – based on their scores on age-appropriate IQ tests. Over half the sample had at least two brain scans, and 30 per cent had three or more.

They found the frontal cortex of participants with ‘superior’ intelligence started off thinner than in ‘high’ and ‘average’ intelligence participants, but thickened rapidly until the age of 11 years, at which stage rapid thinning occurred. By contrast, the cortex of the high and average intelligence participants thickened more slowly until the age of about 7 or 8 years, followed by a period of less marked thinning than in the superior intelligence participants.

“The prolonged phase of prefrontal cortical gain in the most intelligent might afford an even more extended ‘critical’ period for the development of high-level cognitive cortical circuits”, the researchers said.
Shaw, P., Greenstein, D., Lerch, J., Clasen, L., Lenroot, R., Gogtay, N., Evans, A., Rapoport, J. & Giedd, J. (2006). Intellectual ability and cortical development in children and adolescents. Nature, 440, 676-679.

Don't worry, anxiety has its benefits

A quick internet search for book titles hints at the scale of the market in helping people become less anxious: ‘Overcome anxiety’…; ‘Calming your anxious mind’…; ‘What to do when you worry too much’…; ‘Power over panic’ – and on and on they go. But hang on a minute, it’s a dangerous world out there – what if being anxious is actually a sensible approach for staying alive?

Enter William Lee at the Institute of Psychiatry and colleagues, who used data from the Medical Research Council National Survey of Health and Development (MRC NSHD) to find out whether anxious people have fewer fatal accidents.

Using the survey to follow the fortunes of 5,362 people born in 1946, the researchers found that those individuals who had higher anxiety – as determined by the opinion of their school teacher when they were 13 – were significantly less likely to die in accidental circumstances before they were 25 (only 0.1 per cent of them did) than were non-anxious people (0.72 per cent of them did). Similar trends were observed when anxiety was measured using the teachers’ anxiety judgments when the sample were 15 years old, or using the sample’s own completion of a neuroticism questionnaire when they were 16. By contrast, anxiety had no association with the number of non-accidental (e.g. illness-related) deaths before 25. “Our findings show, for the first time in a representative sample of humans, a relatively strong protective effect of trait anxiety”, the researchers said.

It’s not all good news for anxious people though. After the age of 25 they started to show higher mortality rates than calmer types thanks to increased illness-related deaths.

“Our results suggest there are survival benefits of increased trait anxiety in early adult life, but these may be balanced by corresponding survival deficits in later life associated with medical problems”, the researchers said.
Lee, W.E., Wadsworth, M.E.J. & Hotopf, M. (2006). The protective role of trait anxiety: a longitudinal cohort study. Psychological Medicine, 36, 345-351.

Change your personality, learn a new language

The personality of people who are bilingual changes depending on which language they use, lending credence to the Czech proverb “Learn a new language and get a new soul”.

That’s according to Nairan Ramirez-Esparza and colleagues who assessed the personality of dozens of people in America and Mexico who were fluent, current users of both English and Spanish.

Participants twice completed a questionnaire gauging the ‘Big Five’ personality dimensions of extraversion, agreeableness, conscientiousness, openness and neuroticism – once in English and once in Spanish. Across three separate samples, the researchers observed the same pattern – when the participants completed an English version of the questionnaire, they tended to score higher on extraversion, agreeableness and conscientiousness, and slightly lower on neuroticism, compared with when they completed a Spanish version.

However, it’s important to note that the overall shape of participants’ personalities did not change profoundly depending on which language they used. The researchers explained: “Thus, an extravert does not suddenly become an introvert as she switches languages; instead a bilingual becomes more extraverted when she speaks English rather than Spanish but retains her rank ordering within each of the groups”.

The research with bilinguals was consistent with a study of thousands of monolingual participants who spoke only Spanish, or only English, that showed English speakers tended to score higher on extraversion, agreeableness and conscientiousness.

Careful analysis confirmed none of these reported effects were due to the way the questionnaires were translated. Instead, the researchers explained the effect of using different languages on personality as a kind of ‘Cultural Frame Switching’ – “the tendency of bicultural individuals (i.e. people who have internalised two cultures, such as bilinguals) to change their interpretations of the world… in response to cues in their environment (e.g. language, cultural icons)”.
Ramirez-Esparza, N., Gosling, S.D., Benet-Martinez, V., Potter, J.P. & Pennebaker, J.W. (2006). Do bilinguals have two personalities? A special case of cultural frame switching. Journal of Research in Personality, 40, 99-120.

Link to a site where you can test your own personality (the same site used by the researchers to test monolinguals).

Student drinking

There’s a new approach to reducing student drinking that’s based on the finding that most students think all their mates drink loads more than they do, which encourages them to drink more themselves.

‘Social norms’ campaigns aim to reduce student drinking by spreading the word that, actually, the majority of students don’t drink that much. With the jury still out on how effective these campaigns are, Kelly Broadwater and colleagues wanted to investigate the premise behind the approach – the idea that students want to drink more when they believe their peers drink more than they do.

The researchers asked 171 first-year students to report how much they had drunk over the past month; how much they wished they had drunk over that month; and to estimate how much their peers drank.

Consistent with the social norms approach, 91 per cent of the sample thought their peers drank more than they did. But contrary to the approach, when it came to how much the students wished they had drunk, there was no difference between the students who thought their peers drank more than them, and the students who thought their peers drank less than them.

Moreover, when they were followed up a month later, those students who had earlier said they wished they drank more actually reported drinking no more than usual over the following month. Only the students who said they wanted to drink less reported a change, saying that they had indeed drunk less than usual over the ensuing month.

The researchers said more work was needed to clarify the mechanisms behind social norms campaigns, and acknowledged the limitations of their own findings: “Although we found no evidence that our college student participants who perceived their peers were heavier drinkers than themselves desired to increase their drinking, the fact that 91 per cent of participants believed their close friends were heavier drinkers than themselves left us with limited power to detect differences”.
Broadwater, K., Curtin, L., Martz, D.M. & Zrull, M.C. (2006). College student drinking: Perception of the norm and behavioral intentions. Addictive Behaviors, 31, 632-640.


Left overs

Studies that didn't make the final cut this fortnight:

Girls are better at recognising faces.

What kind of teenager carries a weapon?

The children of fathers with stressful jobs are at increased risk of suicide.

Awake rats learn new routes by mentally replaying them in reverse, whereas sleeping rats replay new routes forwards.

The 'pain' of rejection

"From the Archives", first published in the Digest 10.11.03.

When rock group REM sing "everybody hurts sometimes" they are, of course, referring to the inevitability of emotional anguish rather than our shared need to occasionally reach for the paracetamol. But forget poetic licence: new research suggests this tendency to express emotional angst in terms of physical pain is scientifically justified.

Naomi Eisenberger and colleagues scanned the brains of 13 undergraduates while they each played 'virtual' catch with two other people. Participants believed cartoon images represented the other players but in fact the whole procedure was run by a computer program. In the "explicit rejection" condition, the imaginary players appeared to deliberately ignore the participant, passing the ball only between themselves.

Eisenberger found activity in the anterior cingulate cortex - previously linked to the experience of physical pain - was increased during the "rejection condition" and correlated with participants' self-reported feelings of exclusion. Meanwhile, activity in the right ventral prefrontal cortex - previously associated with the regulation of physical pain - was correlated with reduced distress following rejection. The authors concluded that "social pain is analogous to physical pain, alerting us when we have sustained injury to our social connections".

Eisenberger, N.I., Lieberman, M.D. & Williams, K.D. (2003). Does rejection hurt? An fMRI study of social exclusion. Science, 302, 290-292.

Link to full-text.