For infants, walking is more than just another step in motor development

When an infant starts walking, the achievement is more than just a milestone in motor control. According to Melissa Clearfield, the child's newfound locomotor skill arrives hand-in-hand with a raft of other changes in social behaviour and maturity. This unfolding, interactive developmental process has until now been little explored by psychologists.

Clearfield first had 17 non-walking infants (aged between 9 and 11 months) twice spend ten minutes exploring a 3m by 3m floor area dotted with toys, with their mother and three other people positioned one in each corner. The infants explored the area first by crawling and then in a baby walker (a piece of equipment that allows infants who can't yet walk to move around in an upright position as if walking).

The infants spent the same amount of time interacting with toys and people, gesturing and vocalising, whether they were crawling or in the baby walker. In other words, there wasn't anything about being in an upright position per se that changed the social behaviour of these children.

Next, Clearfield had a new group of 16 infants (also aged nine to eleven months) perform the same task, except these children were all walkers. These walking infants, even though they were age-matched to the first group, spent considerably more time vocalising and making socially-directed gestures, such as pointing at or waving a toy whilst looking at their mothers. Overall, the walkers spent three times as much time interacting with their mothers, and twice as much time interacting with the toys, compared with crawlers of the same age.

A final study tested another set of fourteen 9-month-old infants on the same exploratory task, once a month for six months, to see how their behaviour changed, not by virtue of their age, but rather according to whether they had yet learned to walk (onset of walking ability varied across the group, but all were walking by 15 months).

Irrespective of age, Clearfield found that infants gestured far more during their first walk session compared with their last crawl session, and that they interacted with their mothers more, and their toys less, during their first walk session compared with both their last crawl session and their second walk session.

By twelve months of age, eight members of this final infant group were walking, whilst six were still crawling. Comparing the walkers and crawlers revealed once again that the walkers interacted more with their mothers and performed more social gestures. 'This more mature mode of interaction did not come about through age or more experience in the world,' Clearfield said, 'but rather, the transition to independent walking itself changed how infants interact with others.'

The message is that the same developmental processes that lead an infant to take its first steps also seem to drive changes in its social behaviour. Importantly, the baby walker study showed this isn't simply because of different opportunities afforded by being in an upright position. 'Under this explanation,' Clearfield concluded, 'processes such as perception, attention, memory, cognition, and social behaviours all shift to accommodate infants' new mode of moving through the world, and each process affects and is affected by the changes in the other processes. From this dynamic view, learning to walk becomes much more than simply a motor milestone; instead, it becomes the core of system-wide changes across many developing domains.'

Clearfield, M. (2011). Learning to walk changes infants' social interactions. Infant Behavior and Development, 34(1), 15-25. DOI: 10.1016/j.infbeh.2010.04.008
You have read this article Developmental with the title February 2011. You can bookmark this page URL Thanks!

A psychology lesson in a box?

Marc Smith of Boroughbridge High School, North Yorkshire, reviews the AS/A2 Biopsychology PsyKit from Uniview Worldwide. Price £49 (excl. VAT)

"While I have been aware of products offered by Uniview for some time and have purchased a number of their DVDs, I have tended to produce my own resources and lesson activities. The idea of a ‘Psychology lesson in a box’, therefore, was immediately appealing, promising to reduce a teacher's workload by providing everything needed for an entire lesson. The kit is certainly packed with ‘stuff’ (some more useful than others) and briefly comprises: an animated neuroscience DVD illustrating the brain's response to nicotine, cocaine and marijuana; a brain jelly mould (presumably for making jelly in the shape of a brain); a shower cap activity kit (more later); two bags of jelly brain sweets; 12 badges displaying a brain with the caption ‘are you using yours?’; 6 brain function magnets; 3 mini neuron soft toys; 12 metal puzzles; a stopwatch; a ruler and finally a ‘scrap sack’ for an echo-location activity.

The twenty-five minute DVD (individual price £40) is very well produced - although the American narrator may confuse some students with his pronunciation of some key terms - and it includes some effective animations illustrating the synapse and neurotransmitter release. The brain jelly mould is a rather curious thing and is perhaps best filled with plaster or silicone rather than jelly. The shower cap activity kit is another curiosity. In the past I have used a baseball cap and post-it notes to carry out this activity rather than a bright pink shower cap with magnetic laminated cards. The premise is quite simple and involves students placing the cards on the appropriate part of the cap to represent the likes of localisation and brain damage. I suspect that getting one of my male students to sit with a bright pink shower cap on his head may prove problematic - but would at least liven up the lesson.

The remainder of the kit, for me, has limited appeal. Some schools have rules about handing out sweets to pupils, so the jelly brain sweets could be a non-starter, and I know that many of my own sixth formers would recoil at the prospect of having to wear a badge. The neuron soft toys are certainly fun and informative, but students could probably do with a more detailed three-dimensional representation of a brain cell – a colleague of mine would get students to make their own three-dimensional microbes which, I suspect, had a bigger impact on them.

The kit certainly represents value in a monetary sense, but perhaps only because teaching resources remain so expensive. Unfortunately, much of the kit would go unused in my classroom due to the uncertainty surrounding exactly what my students would learn from them. Nevertheless, the kit is fun and would certainly appeal to teachers wishing to wind down on a Friday afternoon."


How well can we communicate emotions purely through touch?

Romantic couples outperformed pairs of strangers
Whether it's a raised eyebrow or curl of the lip, we usually think of emotions as conveyed through facial expressions and body language. Science too has focused on these forms of emotional communication, finding that there's a high degree of consistency across cultures. It's only in the last few years that psychologists have looked at whether and how the emotions can be communicated purely through touch.

A 2006 study by Matthew Hertenstein demonstrated that strangers could accurately communicate the 'universal' emotions of anger, fear and disgust, and the 'prosocial' emotions of love, gratitude and sympathy, purely through touches to the forearm - but not the universal emotions of surprise, happiness and sadness, nor the 'self-focused' emotions of embarrassment, envy and pride. Now Erin Thompson and James Hampton have added to this nascent literature by comparing the accuracy of touch-based emotional communication between strangers and between those who are romantically involved.

Thirty romantic couples (the vast majority were heterosexual) based in London took part. One partner in each romantic pair attempted to communicate 12 different emotions, one at a time, to their partner. They sat at opposite sides of a table divided by a curtained screen. The emotional 'decoder' slid their forearm through the curtain for the 'encoder' to touch, after which the 'decoder' attempted to identify which of the 12 emotions had been communicated. The participants were filmed throughout.

After this, the romantic couples were split up and participants paired up with a stranger to repeat the exercise (encoders and decoders kept whichever role they'd had first time around). Strangers were usually formed into same-sex pairs, to avoid the social awkwardness of touching an opposite-sex partner. This created an unfortunate confound, acknowledged by the researchers, which is that most romantic couples were opposite-sex whereas most stranger pairs were same-sex. However, focusing only on results from same-sex pairs versus opposite-sex pairs suggested gender was not an important factor.

The key finding is that although strangers performed well for most emotions, romantic couples tended to be superior, especially for the self-focused emotions of embarrassment, envy and pride. Thompson and Hampton calculated that chance performance (i.e. merely guessing) would produce an accuracy rate of 25 per cent. Although there were 12 emotions to select from, the rationale here is that some are far more similar to each other than others, so even a guesser would perform better than 1/12 accuracy. Romantic partners communicated universal, prosocial and self-focused emotions with an accuracy of 53 per cent, 60 per cent and 39 per cent, respectively - in each case, far better than chance. In contrast, strangers achieved accuracy rates of 39 per cent, 56 per cent and 17 per cent, for universal, prosocial, and self-focused emotions respectively, with the last no better than chance.
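The 25 per cent chance baseline amounts to assuming a guesser can narrow each touch down to roughly four confusable emotions (1/4 = 25 per cent, against a naive 1/12, about 8 per cent). A minimal simulation of that reasoning; note the cluster groupings below are invented for illustration and are not taken from the paper:

```python
import random

# Hypothetical confusability clusters of four emotions each. A 25 per cent
# guessing baseline falls out of any grouping of this size; the paper's own
# confusability structure is not reproduced here.
CLUSTERS = [
    ["anger", "disgust", "envy", "pride"],
    ["fear", "surprise", "sadness", "embarrassment"],
    ["love", "gratitude", "sympathy", "happiness"],
]

def guess(true_emotion, rng):
    """A guesser who spots the right cluster but picks randomly within it."""
    for cluster in CLUSTERS:
        if true_emotion in cluster:
            return rng.choice(cluster)
    raise ValueError(true_emotion)

def chance_accuracy(trials=100_000, seed=1):
    """Simulated accuracy of cluster-level guessing across the 12 emotions."""
    rng = random.Random(seed)
    emotions = [e for cluster in CLUSTERS for e in cluster]
    hits = 0
    for _ in range(trials):
        true_emotion = rng.choice(emotions)
        hits += guess(true_emotion, rng) == true_emotion
    return hits / trials
```

With four-emotion clusters the simulated rate sits near 25 per cent, which is why even strangers' 17 per cent on self-focused emotions counts as no better than chance.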

How did the romantic couples achieve their greater accuracy? They touched for longer, but duration wasn't correlated with accuracy. Using footage of the experiment, the researchers coded the types of touch used (a wide range of discrete touch types were identified, from trembling and scratching to slapping and squeezing), and for each emotion it was clear that strangers were using similar kinds of touch to romantic couples. This means that either there were subtle differences in the touching used by romantic couples, which the experimenters had failed to detect, or the 'decoders' were interpreting the same touch cues differently when they were delivered by an intimate partner.

This topic is ripe for further investigation - for example, does the touch advantage shown by romantic couples extend to non-emotional communication? Would other long-term, but non-sexual, relationship partners such as siblings, show a similar advantage? And would romantic partners still display an advantage if they didn't know who was doing the touching? 'Our findings extend the literature on the communication of emotion,' the researchers said. 'The nature of particular relationships appears to have the ability to diminish the ambiguity of emotional expression via touch.'

Thompson, E., and Hampton, J. (2011). The effect of relationship status on communicating emotions through touch. Cognition and Emotion, 25(2), 295-306. DOI: 10.1080/02699931.2010.492957

The Special Issue Spotter

We trawl the world's journals so you don't have to:

The senses in language and culture (The Senses and Society).

Cognitive-Behavioral Therapy in the Schools (Psychology in the Schools).

Clinical Speech and Language Studies in Honour of Susan Edwards (Journal of Neurolinguistics).

Attachment Processes in Early Head Start Families (Attachment and Human Development).

Psychology and Law (Current Directions in Psychological Science).

Memory Services (Aging and Mental Health).

Eating Disorders and Mindfulness (Eating Disorders).

Psychiatrists' Views on the Place of Religion in Psychiatry (Mental Health, Religion and Culture).

Forensic Research in Offenders with Intellectual and Developmental Disabilities, Part 1 and Part 2 (Psychology, Crime and Law).

Stroke cures man of life-long stammer

The cerebellum is coloured green in this model
Thanks to the success of the film The King's Speech, most of us are familiar with the 'developmental' kind of stammering that begins in childhood. However, more rarely, stammering can also have a sudden onset, triggered by illness or injury to the brain. Far rarer still are cases where a person with a pre-existing, developmental stammer suffers brain injury or disease and is subsequently cured. In fact, a team led by Magid Bakheit at Moseley Hall Hospital in Birmingham, who have newly reported such a patient, are aware of just two prior adult cases in the literature.

Bakheit's patient, a 54-year-old bilingual man, suffered a stroke that caused damage to the left side of his brain stem and both hemispheres of his cerebellum - that's the cauliflower-shaped structure, associated with motor control and other functions, which hangs off the back of the brain. The man's brain damage left him unsteady on his feet, gave him difficulty with swallowing and his speech was slightly slurred. But remarkably, his life-long stammer, characterised by repetitions of sounds, and which caused him social anxiety and avoidance, was entirely gone - an account corroborated by his wife. By the time of his discharge from hospital, the slowing of his speech was much improved and yet thankfully his stammer remained absent.

The researchers can't be sure, but they think the remission of the man's stammer is likely related to his cerebellum damage, which may have had the effect of inhibiting excessive neural activation in that structure. This would be consistent with previous research showing that people who stammer have exaggerated activation in the cerebellum compared with controls, and with the finding that successful speech therapy is associated with reductions to cerebellum activation compared with pre-treatment. A second, related possibility is that, pre-stroke, the man's cerebellum was somehow having a detrimental effect on his basal ganglia (a group of sub-cortical structures involved in motor control and other functions) and that this adverse effect was ameliorated by the stroke-induced damage. This would be consistent with reports of stammers developing in patients with diseases, such as Parkinson's, that affect the basal ganglia.

A third and final possibility, the researchers said, is simply that the slowing of the man's speech somehow aided his stammer. Indeed, reducing the rate of speech is a therapeutic approach. However, this certainly wasn't a conscious strategy employed by the patient, and as we've seen, his stammer remained in remission even as his speech rate improved.

'The complete remission of stammering following a posterior circulation stroke in our patient suggests that the cerebellum and/or its connections with brain structures has an important role in maintaining developmental stammering,' the researchers concluded.

Bakheit, A.M., Frost, J., and Ackroyd, E. (2011). Remission of life-long stammering after posterior circulation stroke. Neurocase, 17(1), 41-45. PMID: 20799135

How to cheat a brain-scan-based lie detector

Cheating the scanner is relatively easy
Sure, it's possible to differentiate patterns of truth-telling brain activity from patterns of lying-related activity. But contrary to media hype, experts have been quick to point out that the accuracy of brain-scan-based lie detection is often no better than with traditional approaches, such as the polygraph. Furthermore, these experts warn, brain-scan methods could, in theory, be easily thwarted by liars with even modest levels of guile. That claim is no longer purely theoretical, for in a new study, Giorgio Ganis and his colleagues have used a popular paradigm to show just how easy it is for lying participants to trick the brain-scanner.

Twenty-six participants had their brains scanned whilst they looked at the same six dates appearing for half a second each, one at a time, on a screen. For each date they had to indicate with a button press whether it was their date of birth - yes or no. This was repeated several times. In the truth-telling condition, none of the dates was their birth date and the participants simply told the truth, saying 'no' to each date. In the lying condition, one of the dates was their birth date and their task was to lie and indicate 'no' whenever it appeared. An equivalent set-up in a real-life criminal case might involve a suspect repeatedly looking at the same selection of knives and indicating whether they owned any of them. One further twist was that participants also had to watch out for a specific, meaningless target date - this was just to make sure they stayed engaged with the task.

When the participants lied and pressed 'no' whenever their birth date was visible then, consistent with past research, their mendacity was revealed in their brain activity. In particular, because of its salience, seeing one's birth date was associated with raised activity in the front-most part of the brain, on both sides. Focusing on activity in these regions, a simple algorithm could tell whether a participant was lying or not with 100 per cent accuracy (the extremely high accuracy may be due to the greater salience of one's own birth date compared with the material used in previous research).

Most importantly, in a repeat of the experiment, the participants were taught a simple cheating technique. All they had to do was ever so slightly move their left index finger, left middle finger, and left toe, respectively, whenever three of the irrelevant dates appeared on the screen (they were instructed to ensure the movements would be too slight to be seen by someone watching). This cheating strategy lent salience to these irrelevant dates, thereby reducing the stand-out salience of the birth date, and it cut the accuracy of the lie-detection algorithm to 33 per cent. In other words, most instances of lying were misidentified as honest responses.
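Both the original 100 per cent hit rate and the countermeasure's effect can be pictured as a simple salience contrast: flag lying when the probe (birth-date) trials evoke noticeably more frontal activation than the irrelevant-date trials do. A hypothetical sketch; the function, threshold and activation values below are invented for illustration, not the classifier Ganis and colleagues actually used:

```python
from statistics import mean

def classify_lying(probe_activations, irrelevant_activations, threshold=0.5):
    """Flag a participant as lying when the probe (birth-date) trials evoke
    reliably more frontal activation than the irrelevant-date trials.

    The threshold and the simple mean contrast are illustrative choices,
    not the method reported in the study."""
    contrast = mean(probe_activations) - mean(irrelevant_activations)
    return contrast > threshold

# Hypothetical activation values in arbitrary units (invented, not study data).
# A liar's own birth date stands out, so probe trials show raised activation:
liar = classify_lying([2.1, 1.8, 2.4], [0.9, 1.1, 1.0])            # flagged
# A truth-teller sees no specially salient date, so there is no contrast:
truthful = classify_lying([1.0, 1.2, 0.9], [1.1, 0.8, 1.0])        # not flagged
# The countermeasure (covert movements on irrelevant dates) raises the
# irrelevant baseline, shrinking the contrast and hiding the lie:
countermeasure = classify_lying([2.1, 1.8, 2.4], [1.9, 2.0, 1.8])  # not flagged
```

The third call shows why the trick works: the lie is missed not because the birth date stops being salient, but because the irrelevant dates become salient too, so the contrast the classifier relies on disappears.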

The researchers think that the moving part of this cheating strategy probably isn't necessary. For example, deliberately recalling a certain memory when certain 'irrelevant' stimuli are shown would have the same effect of reducing the stand-out salience of a target stimulus, be that a murder weapon or a date of birth.

'Although these results apply directly only to the specific [and commonly used] laboratory paradigm used here ...,' the researchers said, 'they support the more general point that the vulnerability of the neuroimaging paradigms for deception detection to various countermeasures should be assessed and documented explicitly before they can be used in applied settings.'

Ganis, G., Rosenfeld, J.P., Meixner, J., Kievit, R.A., and Schendan, H.E. (2011). Lying in the scanner: Covert countermeasures disrupt deception detection by functional magnetic resonance imaging. NeuroImage, 55(1), 312-319. PMID: 21111834

When a client confesses to murder

Dr. Jennifer Melfi: What line of work are you in?
Tony Soprano: Waste management consultant.
Client confidentiality in psychotherapy only goes so far. If a client threatens the therapist, another person, or themselves, and the threat is perceived as serious, then most jurisdictions and professional codes (including the BPS ethics code) recognise this as a valid reason to breach the client's privacy and go to the authorities. But what about the situation in which the client confesses to a past violent act for which they were never prosecuted? What if they tell their therapist that they've previously murdered someone?

Steven Walfish and his colleagues have investigated this issue in a survey of 162 US psychologist psychotherapists recruited randomly via the National Register of Health Service Providers. Astoundingly, 21 of the psychologists said that on at least one occasion they'd had a client disclose in therapy that they'd murdered someone but had never been found out (one unlucky psychologist said they'd encountered this scenario six times!).

One hundred and three of the psychologists said they'd had a client disclose having committed an act of previously unreported sexual assault, and 111 of them had had a client disclose a previously unreported act of physical assault. The majority of psychologists said disclosure of past physical assault had happened on three or more occasions; one of them said it had happened more than 200 times!

From an ethical point of view, these disclosures of past violent acts are trickier to resolve than threats of future violence, especially if there's no other reason to believe that the client remains a threat. Among the psychologists surveyed in the current research, the majority (63.2 per cent) said such disclosures had had a neutral effect on therapy, 18.8 per cent said they were harmful to therapy and a similar proportion (17.9 per cent) viewed them as beneficial.

From a therapeutic perspective, the researchers pointed out that those therapists who viewed the disclosure negatively were at obvious risk of 'negative counter-transference'. This is a fancy way of saying that the disclosure could negatively affect the way the therapist relates to their client, especially if the therapist has themselves previously been a victim of violence. Psychotherapists could be trained to guard against this, but Walfish and his colleagues point out that it's not unusual for therapists to be attacked or threatened by clients and so: 'fears of potential client violence may not always represent an unresolved conflict on the part of the therapist. The psychotherapist knowing this piece of clinical information [the disclosure about past violence], and knowing that the best predictor of future behaviour is past behaviour, may be concerned that they themselves may become a victim of violence.'

Somewhat worryingly, nearly one fifth of the current sample did not feel fully informed about what to do when a client makes a disclosure about past acts of violence, and nearly two thirds felt inadequately prepared for the situation by their graduate training.

Walfish and his colleagues concluded that therapists need to be prepared to hear any material in their consulting rooms, 'regardless of how unusual or unpleasant.' They also need to be aware of their own emotional reactions to disclosures of past violence, how to maintain their own safety, as well as their legal and ethical obligations. 'Graduate training programmes, internship and postdoctoral training settings, and continuing education courses should be encouraged to explore this often difficult topic area in greater depth,' the researchers said.

Walfish, S., Barnett, J., Marlyere, K., and Zielke, R. (2010). “Doc, There's Something I Have To Tell You”: Patient Disclosure to Their Psychotherapist of Unprosecuted Murder and Other Violence. Ethics and Behavior, 20(5), 311-323. DOI: 10.1080/10508422.2010.491743 [ht: Ian Leslie]

A further note on the BPS Ethics Code: The code emphasises the importance of peer support and supervision. If you are a psychologist and unsure how to proceed following a client disclosure, you should seek guidance from your peers and supervisor, fully evaluate the situation, consider alternative courses of action and fully document the process of decision making [thanks to Dr Lisa Morrison Coulthard for this advice]

Milgram's obedience studies - not about obedience after all?

Stanley Milgram's seminal experiments in the 1960s may not have been a demonstration of obedience to authority after all, a new study claims.

Milgram appalled the world when he showed the willingness of ordinary people to administer a lethal electric shock to an innocent person, simply because an experimenter ordered them to do so. Participants believed they were punishing an unsuccessful 'learner' in a learning task; in reality, the learner was a stooge. The conventional view is that the experiment demonstrated many people's utter obedience to authority.

Attempts to explore the issue through replication have stalled in recent decades because of concerns the experiment could be distressing for participants. Jerry Burger at Santa Clara University found a partial solution to this problem in a 2009 study, after he realised that 79 per cent of Milgram's participants who went beyond the 150-volt level (at which the 'learner' was first heard to call out in distress) subsequently went on to apply the maximum, lethal 450-volt shock, almost as if the 150-volt level were a point of no return. Burger conducted a modern replication up to the 150-volt level and found that a similar proportion of people (70 per cent) were willing to go beyond this point as had been in the 1960s (82.5 per cent). Presumably, most of these participants would have gone all the way to 450 volts had the experiment not been stopped short.

Now Burger and his colleagues have studied the utterances made by the modern-day participants during the 2009 partial-replication, and afterwards during de-briefing. They found that participants who expressed a sense that they were responsible for their actions were the ones least likely to go beyond the crucial 150-volt level. Relevant to this is that Milgram's participants (and Burger's) were told, if they asked, that responsibility for any harm caused to the learner rested with the experimenter.

In contrast to the key role played by participants' sense of responsibility, utterances betraying concern about the learner's wellbeing were not associated with whether they went beyond the 150-volt level. Yes, participants who voiced more concerns required more prompts from the experimenter to continue, but ultimately they were just as likely to apply the crucial 150-volt shock.

However, it's the overall negligible effect of these experimenter prompts that's led Burger and his team to question whether Milgram's study is really about obedience at all. In their 2009 partial-replication, Burger's lab copied the prompts used in the seminal research, word-for-word. The first time a participant exhibited reluctance to continue, the experimenter said, 'Please continue'. With successive signs of resistance, the experimenter's utterances became progressively more forceful: 'The experiment requires that you continue'; 'It is absolutely essential that you continue'; and finally 'You have no other choice, you must go on.'

Burger's revelation (based on the 2009 replication) is that as the experimenter's utterances became more forceful - effectively more like a command, or an order - their effectiveness dwindled. In fact, of the participants who were told 'You have no other choice, you must go on', all chose to disobey and none reached the 150-volt level. 'The more the experimenter's statement resembled an order,' the researchers said, 'the less likely participants did what the experimenter wished.' It would be interesting to learn whether the same pattern applied during Milgram's original studies, but those results were not reported here, perhaps because the necessary data are not available.

Burger and his colleagues said their new observation has implications for how Milgram's studies are portrayed to students and the wider public. Their feeling is that Milgram's results say less about obedience and rather more about our general proclivity for acting out of character in certain circumstances. 'The point is that these uncharacteristic behaviours may not be limited to circumstances in which an authority figure gives orders,' Burger and his team said. 'Few of us will ever find ourselves in a situation like My Lai or Abu Ghraib. But each of us may well encounter settings that lead us to act in surprising and perhaps disturbing ways.'

Burger, J., Girgis, Z., and Manning, C. (2011). In Their Own Words: Explaining Obedience to Authority Through an Examination of Participants' Comments. Social Psychological and Personality Science. DOI: 10.1177/1948550610397632

More on Milgram:
Milgram's personal archive reveals how he created the 'strongest obedience situation'.
Classic 1960s obedience experiment reproduced in virtual reality.


Eye-catching studies that didn't make the final cut:

Tipping estimate/guidance on a restaurant bill increases diners' generosity.

Pain helps assuage feelings of guilt.

How workers compensate for windowless offices.

Marriages are more satisfying when wives are thinner than their husbands.

Does a higher income make you more altruistic?

Understanding the causes of women's underrepresentation in science - less to do with discrimination and more to do with the need for organisations to make it easier to balance work/life demands.

'These results suggest that antidepressant use among individuals without psychiatric diagnoses is common in the United States ...'

A call for clinical psychologists to be trained in psychopharmacology.

Why do we listen to music?

Would you swap your lottery ticket for someone else's? The price people are willing to pay to avoid regret.

Minding the gap between neuroscientific and psychoanalytic understanding of autism.

Reducing people's confidence in their memory leads them to perform more checking behaviour - study with implications for understanding OCD.

More research on healthy habit formation.

Classroom discipline across 41 countries. 'In countries that were poorer, more equal, or had more rigid gender roles, students reported higher classroom discipline'.

Progress in the development of an on-line atlas of the mouse brain.

Attempts to avoid ageism by dressing young could backfire. 'In three experiments we found that both male and female young adults negatively evaluated older adults who attempt to look younger compared to older adults who do not attempt to do so ...'

Cognitive neuroscience 2.0: building a cumulative science of human brain function.

Learning new faces - A mental ability that doesn't peak until the early thirties

Cognition researchers should beware assuming that people's mental faculties have finished maturing when they reach adulthood. So say Laura Germine and colleagues, whose new study shows that face learning ability continues to improve until people reach their early thirties.

Although vocabulary and other forms of acquired knowledge grow throughout the life course, it's generally accepted that the speed and efficiency of the cognitive faculties peaks in the early twenties before starting a steady decline. This study challenges that assumption.

A massive sample of 44,000 people, aged between ten and seventy, completed an online face learning test in which they were required to briefly study several unfamiliar faces, presented in grey scale without hair or other non-facial distinguishing features. They then had to identify those faces, shown in novel poses and varied lighting conditions, from among further unfamiliar faces.

As you might expect, performance on the task increased steadily through adolescence. But although improvement slowed once adulthood was reached, it didn't stop there. Performance in fact peaked at age 31.4, after which it declined slowly. The pattern of results meant that the average performance of 16-year-olds matched the average performance of those aged 65.

The results strongly suggest that face learning ability continues to develop into the early thirties, but an alternative explanation is that the sustained changes are more generic, to do with general memory or cognitive abilities. To rule this out, a second study tested nearly 15,000 people on a face task and also a memory task involving names. As before, face learning ability peaked in the early thirties. In contrast, name-learning performance peaked at age 23.

A final study used children's faces, in case the earlier studies' use of more mature faces had given older participants an unfair advantage. Even with children's faces, face learning peaked in the early thirties. However, this prolonged developmental trend wasn't found for inverted faces (performance with these peaked at age 23.5), thus suggesting it's specifically the ability to learn new upright faces that continues to improve into the thirties. It remains to be seen whether this improvement reflects a kind of prolonged innate maturation process or is simply a consequence of more years' practice at learning faces.

How big were the increases in face learning performance between the end of adolescence and the early thirties? They were modest (the effect size was d=.021), so more research is needed to find out what real-life implications, if any, these lingering improvements in ability might have. Another study limitation is the use of a cross-sectional sample; future research should track changes in ability in the same individuals over time. Notwithstanding these points, the researchers said 'our data illustrate that meaningful changes can and do occur during early and middle adulthood and suggest a need for integration of research in cognitive development and aging.'
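
The effect size quoted is Cohen's d, which expresses the gap between two group means in units of their pooled standard deviation. Here is a minimal sketch of the standard formula (the function name and sample numbers below are illustrative, not taken from the study):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    # Sample variances (n - 1 denominator), then the pooled SD.
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

print(cohens_d([2, 4, 6], [1, 3, 5]))  # 0.5
```

By rough convention, values of d around .2 are considered small, .5 medium and .8 large.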

Germine, L., Duchaine, B., and Nakayama, K. (2011). Where cognitive development and aging meet: Face learning ability peaks after age 30. Cognition, 118(2), 201-210. DOI: 10.1016/j.cognition.2010.11.002

A week of sin

Welcome to the menu for Sin Week on the Research Digest blog, which started on 8 February 2011. Each day for Seven days (with a break on Sunday, naturally) we posted a sinful confession by a psychologist; a new sin fit for the twenty-first century; and an evidence-based way to be good. These online festivities coincided with a feature-length article in the latest issue of The Psychologist on the psychology behind the Seven Deadly Sins.

Here's the full menu of Seven confessions:

John Sloboda - my Wrath
Alex Haslam and Steve Reicher - our Envy
Mark Griffiths - my Pride
Jon Sutton - my Sloth
Wray Herbert - my Gluttony
Cordelia Fine - my Greed
Jesse Bering - my Lust

The Seven new sins:
Truthiness
Iphonophilia
Narcissistic Myopia
Entitlement
Mobile Abuse
Excessive Debt
Insert your sin here

The Seven ways to be good:
Learn healthier habits
Have an energy drink
Use your inner voice
Practise self control
Clench your muscles
Form if-then plans
Distract yourself

Many thanks to our confessors for baring their souls and especially to Jon Sutton for helping put this special feature together.

Jesse Bering - my Lust

Jesse Bering is director of the Institute of Cognition and Culture at Queen's University Belfast
I have been happily tasked, on this Valentine’s Day, with writing about the sin of lust. But before I expurgate my lascivious soul, let us first get the concept of lust straight—and I should warn you that this will be the only thing straight about my confession.

Lust is not an easy psychological construct to operationalise. Although the word can be used in non-sexual senses, its sexual connotation is primary, as is the theological incentive for diagnosing it as a deadly sin. The Oxford English Dictionary defines the theological variation as “a sensual appetite regarded as sinful: of the flesh.” And it is in this carnal sense of the word that I have sinned mightily.

In a 2004 treatment, psychologists William Cupach and Brian Spitzberg concede that, 'one who lusts possesses the wish to experience sexual union with another.' But they also note that sexual arousal alone does not capture the entire phenomenon:
'[L]ust is usefully distinguished from sexual arousal (e.g., erect penis, swollen clitoris) and  sexual behavior (e.g., intercourse, oral copulation). The awareness of one’s own physical stimulation does not necessarily entail desire for sexual union, although arousal can be either a precursor to or a consequence of lust. One can experience lust without concomitant physical arousal. Similarly, sexual activity can occur with or without lust. The experience of lust may motivate sexual activity, or sexual activity may breed lust that leads to further sexual activity. [But] sometimes the lust object does not desire sexual activity with the lustful person and sexual union is thwarted.'
It is this “unrequited lust” that best describes the nature of my particular sin, which is in fact more an epoch of lust than a single lewd act. I may have played, in weaker dosage, a version of the other role in someone else’s narrative. But in this story, I am the lustful party, and a specific individual from my adolescent past, who shall go unnamed to protect the innocent, is the object of my lust. It is easy to describe him, because I worked deliberately to imprint his visage deep onto my mind’s eye, cognizant even then of my future self reflecting continually back on this image and therefore of the eternal need for accuracy. To this day I would recognize the shiny half-moons just above his cuticles or the crook of his knees behind his bronze legs.

I’ll sketch a portrait, but it is, of course, only anaemic compared to the vitality with which the ghost of this boy burns brightly in my head. Olive-skinned, bright-eyed, golden-haired, a Donatello David, perfect and cruel. He was little more than an acquaintance, really, but I did try, in my own cloudy, puerile ways, to imbibe his essence within the moralistic constraints I was dealt. And these were not insubstantial, since our only regular field of interaction was a suburban high school cloistered away in a rather conservative part of early-1990s Ohio. We shared a disease—both being insulin-dependent diabetics—and this discovery of our mutual endocrinal failures was one of the few times I was ever tempted to believe in God. Through the good fortune of our mutually dysfunctional pancreases, we were tied together symbolically as if by fate. More importantly, this, along with my eventually befriending him on the school tennis team, gave me an ironclad excuse should someone ever suddenly turn the conversation to my curious and frequent mentioning of him.

According to psychologist Dorothy Tennov, I was suffering from a tell-tale case of limerence—her neologism for intense emotional and sexual attraction to a desired romantic partner. Here are its key symptoms: intrusive thinking about the person; a yearning for the other person to reciprocate the feelings; the inability to have such feelings for any other person; a fear of rejection; heightened sensitivity to signs of interest on the other’s part; and the tendency to dwell on the person’s positive characteristics and avoid the negative. Tennov believed that nearly all adolescents are stricken with a hobbling bout of limerence at some point in their burgeoning sex lives. Indeed, empirical evidence demonstrates that limerence—also known as “passionate love” and “infatuation”—is strikingly common. In a 1997 study in Personal Relationships, psychologist Craig Hill and his colleagues found that such experiences are concentrated primarily between the ages of 16 and 20. Although there are no differences between the sexes in having a mutual infatuation, males are significantly more likely than females to have a meaningful unrequited lustful relationship.

When lust levels are mismatched due to differences in physical attractiveness or the even more formidable hurdle of different sexual orientations, limerence can be rather painful. Very little research, so far as I can gather, has been done on the subject of homosexual limerence, but I suspect my case is not so uncommon among gay males, who may find themselves lusting after others who are completely naïve to, distressed by, or even hostile to their advances. The need for ambiguity in gay male courtship in homophobic societies, coupled with the perceptual biases inherent to the state of limerence, wherein one is hypervigilant to even the slightest signs of potential interest in the lustful object, complicates matters profoundly. I remember quite clearly how this boy would steal glances at my lips as I spoke to him, how he asked to sit next to me in the crowded back seat of a hot car, our bare legs sticking together in perspiration, his innocent sharing of a can of soda (a can which, along with a helpful stack of peculiarly homoerotic Men’s Fitness magazines, I kept for beastly pleasures at least a month at my bedside, since his essence had been soldered onto it). All of these things were, in my distorting and wanting mind, delicate little tortures that failed to disconfirm the statistically probable null hypothesis of his heterosexuality.

Given the scalding moralistic climate of the times, my sensitivity and fears of being ostracized, my lust simmered for years, often boiling over into my dreams. I was not out of the closet yet and would not be until my early twenties. I moved away at graduation, pathetically scribbling his name in my notebook like a lovesick schoolgirl at a distant college; he stayed behind, completely unaware of the deep impact that his sheer being had rendered. I knew that my lust for him was a cosmically irrelevant craving, never-to-be relieved, so I threw myself into other things, and other people.

Many years passed until my pancreas intervened again and brought him back into focus. In my late twenties, I went into hypoglycaemic shock in a hotel room in Atlanta while attending a conference; luckily, a friend found me unconscious, called the paramedics, and thirty minutes later, a line of glucose was being transfused into my veins. I’d known this intellectually before, of course, and had already written a good deal about the illusion of an afterlife, but this intimate flirtation with my own mortality taught me that existence really was the equivalent of an on/off light switch. And what a shame, I thought, if I squandered the rest of this absurd gift worrying about making people uncomfortable. It’s time to live an honest life—you can call it a life of sin if you’d like. It doesn’t matter; you’ll perish all the same.

And so, eleven years after I’d last seen him, my great smouldering sin, my limerent lust, finally saw daylight. I’d heard through the grapevine that the object of my attraction had become a rather pale haze of his former glory, an average, married man and father living a very traditional life—but still I wrote him a letter. I purged myself of my feelings for him, sympathizing with the strangeness he must feel as the target of someone whose passions are so misplaced, explaining that this was more a letter for me than it was for him, that it was an exorcism only. I tried to articulate how, in spite of all this, it was important for him to know that I’d loved him.

I can only imagine how bizarre that letter must have seemed to him, how out-of-the-blue and, indeed, probably disturbing (particularly if it had been opened by his wife). For that I apologized, profusely. Still, I wouldn’t apologise for my feelings. They were what they were, and all facts are godly. The letter was my only way to break the spell, once and for all, by facing his rejection, which alone could free me to really love other people.

And his rejection did come, in the form of deafening silence. Whether he was flattered, flabbergasted, or disgusted I’ll never know. But his avoidance is okay. In his work on interpersonal relations, Roy Baumeister has highlighted the fact that, by contrast with a rich cultural stock of guidelines to get someone to fall in love with you, namely through persistence, there are no clear cultural scripts for how people should handle an unwanted lover’s attention. 'In a sense,' write Baumeister and his colleagues, 'both the rejector and the would-be lover end up feeling like victims—one of intrusive pursuit and unwanted attentions, the other of heartbreak and rejection.'

But no matter: since mailing that letter five years ago, there is no more ambiguity, no more haunting regrets. Shortly after placing it in the mailbox, Asmodeus, the demon of lust, kindly turned my attention to a more fitting object, one who is lying in bed next to me now and whom I am free to lust after in peace and guilt-free carnality.

Jesse also contributed to our special feature The Bloggers Behind the Blogs.

This post is part of the Research Digest's Sin Week. Each day for Seven days we'll be posting a confession, a new sin and a way to be good. The festivities coincide with the publication of a feature-length article on the psychology behind the Seven Deadly Sins in this month's Psychologist magazine.

Seven new deadly sins: 7) Insert your sin here

Our panel of psychologists suggested Truthiness, Iphonophilia, Narcissistic Myopia, Entitlement, Mobile Abuse, and Excessive Debt as new Deadly Sins relevant to the 21st century. What do you think of these and what new sins do you propose? Celebrity worship? Environmental vandalism? Xenophobia? Please use comments to have your say ...


Seven ways to be good: 7) Distract yourself

If at first you don’t succeed, cheat. In Walter Mischel’s classic studies of young children’s self-control, he found that the kids able to resist cookies and marshmallows for longer periods tended to adopt distraction strategies, such as covering their eyes or singing to themselves. Even our chimpanzee cousins are adept at this, although admittedly in their case it’s for greater gain rather than to avoid sin. In a 2007 study Michael Beran at Georgia State University showed that chimps played with toys as a way to distract themselves from a self-filling jar of sweets. The longer they waited before grabbing the jar, the more sweets they’d get. If the jar was out of reach, they didn’t play with the toys so much, which suggests they really were using the toys as a form of distraction.


Cordelia Fine - my Greed

Dr Fine is at Macquarie University
Last year I applied for a research fellowship, and while putting the application together I emailed my sister to ask if there is some standard way of describing the impact of your academic work. In less than five minutes she was on the phone to tell me my “h-index”.

Publications are the currency of researchers, and the idea behind the h-index is that it’s sensitive to both the quantity and impact of publications. Basically, the h-index is the academic equivalent of reading someone’s bank statements.
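
For readers unfamiliar with the metric: a researcher has an h-index of h when h of their papers have each been cited at least h times. A minimal sketch of that definition (the function name and citation counts below are illustrative, not from this post):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still has enough citations to support h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3]))  # 3: three papers with at least 3 citations each
```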

“And you can look up anyone’s h-index online?” I asked.

“Yep,” my sister replied. And then, even though I’d put the question casually, she added, “But don’t go there.”

My first thought on putting down the phone was, “The application can wait.” My second thought was, “Oh. I thought I’d grown out of that person.”

Greed is a rapacious desire and pursuit of wealth and power, and in secondary school it was the status of class swot that I was hungry for. There was a girl at school – let’s call her ‘Alison Stevens’. She played first oboe in the school orchestra, hit a hockey ball with unerring deadliness, but – most importantly – she was smart. The possibility that she might be smarter than me was intolerable and, without ever explicitly acknowledging it, the two of us vied relentlessly to be top of the class. When I scored 97 per cent in a biology test, it was not the sense of wonderment for photosynthesis enabled by my solid grasp of the material that brought me satisfaction. It was the 2 per cent I got on the test that put me ahead of Alison Stevens.

This, I realise now, was excellent preparation for life in academia. Because ‘Publish or Perish’, the mantra of the aspiring researcher, can give rise to a competitive mindset that is basically a grown-up version of ‘Beat Alison Stevens’. We aspire to more, better, bigger…what we do is rarely enough and the academic coffers can always be swelled more. A colleague recently described to me the tense, protracted negotiations that took place over authorship position on a paper about to be submitted to a prestigious scientific journal. Nobody, it seems, said, ‘Oh, just put my name wherever you like. The important thing is that I contributed to the acquisition of new scientific knowledge to the benefit of our community and society at large, and that’s enough for me.’ If no one has ever actually been murdered for the first author spot, I suspect it is only because that prized position could be retained posthumously.

Of course scientists simply don’t work in a climate where they can afford to be what philosopher of science Philip Kitcher calls the ‘epistemic purist’ – a virtuous scientist for whom the reward of work comes solely from the heady joy of acquiring reliable, generalisable knowledge about nature. The epistemic purist has no interest in the external rewards their work can also bring: the social recognition, the status, the ability to casually say things like, ‘Your coat? Oh, just sling it on the mass spectrometer.’

The idea of a scientific community populated by people for whom external rewards are as nothing certainly has appeal. Moreover, as psychologist Barry Schwartz has pointed out, the seeking of external rewards can sometimes undermine science’s proper goal. Schwartz notes, for example, that the internal goal of science is not served when scientists perform easily publishable but unimaginative and uninformative work in order to maximise their publication output, or keep their results secret from other researchers in order to maintain a competitive edge. At its most extreme, a focus on external rewards tempts scientists to fabricate data.

Fine's latest book is out now in paperback
Then again, just as I probably have Alison Stevens to thank for the A-level results that got me a place at Oxford University (where my intrinsic reward lever was adjusted to the ‘on’ position), so too may the pursuit of external goods bring about valuable scientific discoveries. Kitcher makes the case that ‘sullied’ scientists – those whose motives are a more recognisably human mix of internal and external – may actually make for a cognitively healthier and more productive scientific community. As he concludes, ‘starry-eyed idealism is by no means necessary to serve the community well.’

All of which leaves me unsure whether I want to give up that greed I still carry with me from my school-days, assuming I even could. Some days I think I’m too greedy. But other times (especially when preparing fellowship applications), my magpie eyes are hungry for the prize and I can’t help but wonder if I’m not greedy enough.


Seven ways to be good: 6) Form if-then plans

When your willpower has been drained by an earlier test, that’s when you’re most vulnerable to temptation. One way to protect yourself is to form so-called ‘if-then’ plans. For example, if you wanted to avoid getting angry the next time your boss is overly critical, you could form the plan ‘if my boss says my work is amateurish, I will recall the time that I won an award’ – a thought which will hopefully have a soothing effect. The effects of these ‘implementation intentions’ have been researched in depth by Peter Gollwitzer at the University of Konstanz. In one recent study he tested students’ ability to persevere with anagram tasks after they’d resisted laughing while watching comedy clips, thus leaving their willpower depleted. Those who followed the vague plan ‘I will find as many solutions as possible’ performed poorly on the anagram tasks, as expected. However, willpower depletion had no such adverse effect on students who followed the additional, more detailed plan: ‘...and if I have solved one anagram, then I will immediately start work on the next!’ [For more, visit Prof Gollwitzer's website, where you will find links to many of his articles.]


Seven new deadly sins: 6) Excessive debt

‘The financial crisis we're in originated partly because of people running up huge debts they couldn't pay,’ says Roy Baumeister of Florida State University. ‘Politicians and governments also spend beyond their means, creating debts that future generations will be stuck with. If people were mindful of avoiding the sin of excessive debt, both they and society would be better off.’


Wray Herbert - my Gluttony

Wray Herbert is director of science comms at the APS
When I first entered the rooms of Alcoholics Anonymous some years ago, one of the long-timers—a rough guy, with more than two decades of sobriety—pulled me aside after a meeting to share his personal view of alcoholism. We were in a bright, hard-floored church basement that carried the sound, and his half-whisper made the words sound a bit conspiratorial: 'I figure everyone is given a share of booze to drink in his life. You can drink it any way you want, and most people spread theirs out over a lifetime. But I drank all mine up. I was a glutton.'

Glutton. It’s not a word you hear much at all these days. In fact, when he uttered the archaic word, my mind rushed to the English literary giants of long ago—the diarist Pepys and Dr. Johnson—consuming enormous quantities of mutton and fowl, and paying with gouty, swollen toes. Those literary gluttons seem to be a thing of the past, and the word has fallen into disuse, too. You certainly don’t hear it in recovery circles, and indeed most sober alcoholics would likely reject this old-timer’s view of the disorder. You’re much more likely to hear alcoholism described as a medical disease, or a spiritual crisis.

But I like the idea of alcoholic gluttony. It rang true to me back then, and it still does. It cuts through a lot of hair-splitting debate and gets right to the heart of the matter: lack of self-control. Call it what you like, but at the end of the day there’s no getting away from the behavior—the excessiveness, the lack of restraint, the—yes—gluttony.

Yet labeling alcoholism as gluttony does not make it simple to understand. Indeed, alcoholic gluttony is maddeningly complex, and in a way this vice—this deadly sin—captures human nature in all its irrational nuance. Looking back now, I believe that my career as a science journalist has paralleled my drinking career; my unfolding relationship with alcoholic gluttony shaped the questions I asked, and how I asked them.

My scientific interest in boozing preceded my own excesses, because my father died a full-blown alcoholic. But my memories from childhood were not of a reckless man, but rather of a vibrant, engaged man—a hiker, a sailor, an educator. Then somewhere along the way things changed, for no obvious reason. There was no tragic trigger, just the usual disappointments, and he drank more and more. I recall sitting at his kitchen table late in his life, while he drank Passport Scotch disguised with OJ—his drink of choice—and thinking: He’s chosen this path freely, with full understanding of the tradeoffs. But I watched him clinically and warily, because I knew I carried some of his genes, and his transformation reflected back on me.

As I watched my father’s alcoholism progress—and then my own—I began asking other questions: Do we have a brain disease? Are there particular neurotransmitters run amok? I read widely in the literature about genetics and addiction and stress, about suspect neurotransmitters, and brain anatomy related to pleasure and risk and will, and even wrote a newsweekly cover story on the interplay of genetics and misfortune. None of this got me very far. Alcoholism appears to run in families, and many experts believe there are genes—probably a handful of them—underlying the disorder. There are candidate brain chemicals and structures. So I probably inherited a propensity of some kind. But so what? As one geneticist explained to me years ago, there is no elbow-bending gene. That is, no genetic or neuroscience findings will ever alter the fact that alcoholics—at every stage of their drinking history—are making decisions. Every time we pick up a bottle or pour a finger of whiskey, it’s a choice—it’s the option we’re freely selecting, at least for that moment.

So I moved on from what I now saw as a reductionist neuro-genetic view of alcoholism to an interest in cognitive psychology. Specifically, I wanted to know how we make decisions and judgments and choices, and why so many of our choices are not in our own best interest. Ironically, my preoccupation with irrational decision making coincided with a sharp spike in my own drinking. I was increasingly isolated in my alcoholism—skipping my favorite watering holes for a bottle at home; I drank at lunch every day, and often in the morning. The “holidays” I took from booze were more and more difficult. My drinking life wasn’t feeling like a choice—but I had no other way to explain it. I couldn’t blame it on anyone else. Even self-destructive decisions are decisions, and I began devouring the scientific literature on emotions and distorted thinking, looking for an explanation for my own poor life choices.

And that’s where I am today. My research as a science journalist led me to the study of cognitive biases—the heuristic traps that, once helpful, now lead us all too often into perilous territory. I focused on irrational thinking, and as my own head cleared in my chosen sobriety, I explored all kinds of distorted thinking—culminating in a book on the topic, called On Second Thought. On Second Thought is about the surprisingly automated lives we live—often at the price of our happiness—and it’s also a guide of sorts to more deliberate thinking. It’s not about alcoholic gluttony, but the title could well describe my own questioning of my own harmful life choices—and the change I made.

My next project—in the works—is on alcoholic gluttony. In the course of researching and writing On Second Thought—I was sober by then—I kept stumbling on psychological science that illuminates the process of recovering from alcoholism. Much of it is counterintuitive—the need for powerlessness, the dangers of self-reliance, the power of moral inventory and honesty. Many recovering alcoholics see the steps of recovery as a spiritual path, with no need for scientific explanation. I don’t argue with that, but I also think there’s a breed of sober alcoholics who are curious about the workings of the mind as it chooses—first a destructive path, then a life-changing one. They are the audience for the next book. Let’s call them recovering gluttons.


Seven new deadly sins: 5) Mobile abuse

Mobile abuse: ‘Shouting into your cell phone on the bus, or as the curtain is going up at the opera – that happened to me,’ says Helen Fisher at Rutgers University. ‘I mean where are these people coming from, where is their brain? It is extreme narcissism.’


Seven ways to be good: 5) Clench your muscles

We tend to associate acts of willpower with people clenching their jaw or fists. A study published last year showed that this muscular tension isn’t merely a side-effect of willpower, it actually helps bolster our self-control [pdf]. Across five studies, Iris Hung at the National University of Singapore and Aparna Labroo at the Booth School of Business showed that various forms of muscle flexion, from fist clenching to calf muscle tightening, helped participants to endure pain now for later benefit (e.g. take more time to read a distressing news story about a disaster in Haiti, which in turn led them to give money to a relevant charity in line with how much the story mattered to them); and to resist short-term gain (e.g. snack food) in order to fulfil a long-term gain of better health. Muscle flexing only worked when participants were already motivated. For example, if long-term health was unimportant to them, muscle flexing made no difference. So flexing appears to augment willpower rather than changing motivations and attitudes. Muscle clenching was also only effective when performed at the same time as an act of will.


Jon Sutton - my Sloth

Dr Jon Sutton is editor of The Psychologist
Straight out of my PhD, I arrived at Glasgow Caledonian University in 1998 to take up a research lectureship. I had a few publications under my belt, and in my first meeting with the Head of Department he said: ‘Well you’ve already fulfilled your quota for the next Research Assessment Exercise – you can relax.’ That was all it took: I opened the door to sloth, and it crawled in.

I loosened that belt a notch, and although I did all that was expected of me in my 18 months north of the border I couldn’t fool myself. I knew I had succumbed to sin. And I think there’s pretty high co-morbidity when it comes to sin: ‘Satan finds some mischief still for idle hands to do’, and lust, gluttony, pride, greed and envy were all shoe-horned into my one-bedroomed flat in Govanhill. I left wrath outside though, that seemed too much like hard work. As did grant proposals, original thinking and learning how to do something beyond a t-test.

During the PhD, I felt immersed in an individual pursuit with an end, a thesis, in sight. Now this was a ‘proper’ job, and perhaps the extrinsic motivators – the RAE, salary, advancement – served to undermine the intrinsic drive (an idea explored in some classic psychological studies, e.g. Deci, 1971; Lepper et al., 1973). Perhaps the drive to specialise in research and carve out a niche didn’t suit my eclectic (i.e. easily distracted) mind. Perhaps it was actually the variety of routes available to me that led to ‘choice paralysis’ and the failure to choose any of them. Whatever the reality, the narrow path to becoming ‘the bullying guy’ stretched wearily into the horizon of my mind, and my soul grew sluggish and torpid at the thought of the journey.

Operating in a comfort zone, it became too easy to crawl out of bed that little bit later, to slope off to the pub a touch early, to only go to ground to urinate and defecate once a week. (That last one is actual sloths, but you get the idea.) Sloths sometimes remain clinging to the branch after death, and I feared the same fate: in place but inactive. Medieval theologian Thomas Aquinas spoke of sloth as ‘sluggishness of the mind’, and I could feel my brain atrophy through lack of use (and sambuca).

So when I heard about the job editing The Psychologist, it appealed in part because I saw a new road, creative possibility, and no hiding place. And so it has proved: a 9-5 daily, weekly, monthly, annual grind, deadline after deadline, constant pressure to produce and develop. Heaven. Because without that, I’m on the highway to hell. At one mile an hour.


Seven ways to be good: 4) Practise self-control

Willpower is like a muscle – the more you train it, the more powerful it becomes, helping you to resist the Seven Deadly Sins. For example, in a study published last year, Mark Muraven at the University at Albany had a subset of participants spend two weeks practising acts of self-control, such as resisting eating naughty food. These participants subsequently excelled at a lab measure of self-control compared with their own baseline performance. By contrast, no such improvement was observed among control participants who merely spent the same time completing maths problems (a task which, although onerous, Muraven claims doesn’t depend on the ability to resist impulses) or writing about any incidental acts of self-control they’d achieved. This latter condition was included to ensure that it is specifically the practice of self-control that is beneficial, not merely spending time thinking about it. Also, participants in all groups were told that their activity would boost self-control, so as to rule out mere expectancy effects.


Seven new deadly sins: 4) Entitlement

‘Entitlement is the absolutist requirement that all one’s egocentric demands for “justice” not only be fully met, but also be of keen interest to the rest of the world, no matter how trivial and inconsequential the injustices, and irrespective of how great the redress of perceived inequity has been to date,’ says Bill Winogron at S4Potential. ‘It’s a close cousin to what American psychologist Albert Ellis more wittily named “Musturbation”.’


Mark Griffiths - my Pride

Prof Mark Griffiths is at Nottingham Trent University
Before writing this blog, I knew very little about ‘The Sin of Pride’. To me it was the title of an LP by The Undertones, which I bought in 1983, when I was 16 years old, from Castle Records in Loughborough. I perhaps learned a bit more about it when I watched Brad Pitt in the film ‘Se7en’ (which coincidentally just happens to be one of my all-time favourite films).

Since agreeing to write this, I did a bit of research on the subject (which admittedly means I did a quick Google search followed by a more considered, in-depth search on Google Scholar). While I’m no expert on the topic, I can at least have a decent pub conversation about it if anyone is prepared to listen. Just to show my complete ignorance, I wasn’t even aware that the sin of pride is the sin of all sins (although I could be relied upon in a pub quiz to name the seven deadly sins).

I was asked to write on this topic because I was seen as someone who is very proud of the work that I do (and for the record, I am). However, I have often realized that just because I am proud of things that I have done in my academic career it doesn’t necessarily mean others think in the same way. In fact, on some occasions I have been quite taken aback by others’ reactions to things that I have done for which I feel justifiably proud (but more of that later!).

At a very basic level, the sin of pride is rooted in a preoccupation with the self. However, in psychological terms, pride has been defined as 'a pleasant, sometimes exhilarating, emotion that results from a positive self-evaluation' and has been described as one of the three ‘self-conscious’ emotions known to have recognizable expressions (shame and embarrassment being the other two). From my reading of the psychological literature, it could perhaps be argued that pride has been regarded as having a more positive than negative quality, and is usually associated with achievement, high self-esteem and positive self-image – all of which are fundamental to my own thinking. My reading on the topic has also led to the conclusion that pride is sometimes viewed as an 'intellectual' or secondary emotion. In practical [psychological] terms, pride is either a high sense of one's personal status or ego, or the specific mostly positive emotion that is a product of praise or independent self-reflection.

One of the most useful distinctions that can be made about pride (and is rooted in my own personal experience), is what Stephen Lea and Paul Webley (who just happened to be my two PhD supervisors at the University of Exeter) distinguish as 'proper pride' and 'false pride'. They claim that:

'Proper pride is pride in genuine achievements (or genuine good qualities) that are genuinely one's own. False pride is pride in what is not an achievement, or not admirable, or does not properly belong to oneself. Proper pride is associated with the desirable property of self-esteem; false pride with vanity or conceit. Proper pride is associated with persistence, endurance and doggedness; false pride with stubbornness, obstinacy and pig-headedness.'

As I noted above, there have been times when I have been immensely proud of doing something only for friends and colleagues to be appalled. ‘Proper pride’, as Lea and Webley would argue. One notable instance was when I wrote a full-page article on ‘internet addiction’ for The Sun, published in August 1997. I originally wanted to be a journalist before I became a psychologist, and my journalist friends had always said that to get a full-page by-line in the biggest-selling newspaper in the UK was a real achievement. I was immensely proud – apart from the headline, as a sub-editor had dubbed my piece ‘The Internuts’ – and showed the article to whoever was around.

I had always passionately argued (and still do) that I want my research to be disseminated and read by as many people as possible. What could be better than getting my work published in an outlet with (at the time) 10 million readers? My elation was short-lived. One close colleague and friend was very disparaging and asked how I could stoop so low as to ‘write for the bloody Sun’. Similar comments came from other colleagues, and I have to admit that I was put off writing for the national tabloids for a number of years. (However, I am now back writing regularly for the national dailies and am strong enough to defend myself against the detractors.)

More recently, in 2007, I was invited to the House of Commons by the ex-Leader of the Conservative Party, Iain Duncan Smith, and asked to chair his Centre for Social Justice Working Party on Gambling and to write a report as part of the Conservative Party’s ‘Breakthrough Britain’ initiative. Anyone who knows me will attest that my political leanings are left of centre, and working with the Conservatives on this issue was not something I did without a lot of consideration. I came to the conclusion that gambling was indeed a political issue (rather than a party political issue), and if the Conservative Party saw it as important, I felt duty bound to help given my research experience in the area. I spent a number of months working closely with Iain Duncan Smith’s office, and when the report was published I was again very proud of my achievement.

However, as soon as the report came out I received disbelieving and/or snide emails asking how I could have 'worked with the Conservatives'. I have spent years trying to put the psychosocial impact of gambling on the political agenda. If I am offered further opportunities by those with political clout, I won’t think twice about taking them. I am still immensely proud of such actions despite what others may think.
