Child's play! The developmental roots of the misconception that psychology is easy

The widespread misconception that psychology is easy and mere common sense has its roots in the biased way that children work out whether a topic is challenging or not.

Frank Keil and colleagues asked children aged between five and thirteen, and adults, to rate the difficulty of questions from physics (e.g. How does a spinning top stay upright?), chemistry, biology, psychology (e.g. Why is it hard to understand two people talking at once?) and economics. The questions had been carefully chosen from earlier pilot work in which they'd all been rated as equally difficult by adults.

Consistent with the pilot work, the adults in the study proper rated the questions from the different disciplines as equally difficult. However, children aged seven to thirteen rated psychology as easier than the natural sciences (physics, chemistry and biology), which they rated as equally difficult to one another.

Young children can't possibly have the depth of understanding needed to know which scientific questions are more difficult. Instead they must resort to some kind of mental short-cut to reach their verdict. Keil's team think that children's feelings of control over their own psychological faculties - memories, emotions and so forth - together with the superficial familiarity of those kinds of concepts, likely lead them to believe psychological concepts are easier to understand.

A second study provided some support for this account. This time children and adults rated the difficulty of questions from within the various branches of psychology. As in the first study, the children, but not the adults, showed a bias: they rated questions related to social psychology, personality and emotion as the easiest, with questions related to cognition, perception and biological psychology rated as progressively more difficult.

So, when do these childish misconceptions leak through into adult judgments? For a third study, another batch of children and adults were again presented with the same questions from the different scientific disciplines, but this time they were asked to say whether they would be able to solve each question on their own (or require expert help) and to estimate what proportion of the adult population would know the answers.

This time the adults as well as the children tended to say they could solve more psychology questions on their own, compared with questions in the other sciences, and kids and adults estimated that more people knew the answers to the psychology questions. Remember these were psychology questions that adults had already rated as just as difficult and complex as questions in the other sciences. 'Such biases [towards seeing psychology as easy] may be observed when tasks do not so directly ask about difficulty of understanding and instead use measures such as ease of learning on one's own,' the researchers said.

Keil's team said their findings have real-life implications, for example in the court-room. 'If psychological phenomena are seen as usually quite easy to understand and largely self-evident and if such judgments are inaccurate and underestimate the need for experts,' they warned, 'cases might well be decided in ways that unfairly exclude valuable expert insights.'

In fact, the researchers pointed out that such situations have already occurred. In the US trial of former Presidential Assistant I. Lewis 'Scooter' Libby, for example, the judge disallowed the use of psychology experts on memory, on the basis that the jury could rely on their common sense understanding of memory. This is particularly ironic given that prior psychology research has shown that jurors and judges have a woefully poor understanding of how memory actually works.
_________________________________

Keil, F. C., Lockhart, K. L., & Schlegel, E. (2010). A bump on a bump? Emerging intuitions concerning the relative difficulty of the sciences. Journal of Experimental Psychology: General, 139(1), 1-15. PMID: 20121309

Related article in The Psychologist magazine: 'Isn't it all just obvious?'

Scary health messages can backfire

A short while ago there was a shocking advert on British TV that used slow motion to illustrate the bloody, crunching effects of a car crash. The driver had been drinking. Using this kind of scare tactic for anti-drink-driving campaigns and other health issues makes intuitive sense. The campaigners want to grab your attention and demonstrate the seriousness of the consequences if their message is not heeded. However, a new study makes the surprising finding that for a portion of the population, scare tactics can backfire, actually undermining a message's efficacy.

Steffen Nestler and Boris Egloff had 297 participants, 229 of them female, average age 35, read one of two versions of a fictional news report from a professional medical journal. The report referred to a study showing links between caffeine consumption and a fictional gastro-intestinal disease 'Xyelinenteritis'. One version was extra-scary, highlighting a link between Xyelinenteritis and cancer and saying that the participant's age group was particularly vulnerable. The other version was lower-key and lacked these two details. Both versions of the article concluded by recommending that readers reduce their caffeine consumption.

Before gauging the participants' reaction to the article and its advice, the researchers tested them on a measure of 'cognitive avoidance'. People who score highly on this personality dimension respond to threats with avoidance tactics such as distracting themselves, denying the threat or persuading themselves that they aren't vulnerable.

The key finding is that participants who scored high on cognitive avoidance actually rated the threat from Xyelinenteritis as less severe after reading the scary version of the report compared with the low-key version. Moreover, after reading the scary version, they were less impressed by the advice to reduce caffeine consumption and less likely to say that they planned to reduce their caffeine intake.

On the other hand, highly cognitive avoidant participants were more responsive to the low-key report than were the low cognitive avoidant participants. In other words, for people who are cognitively avoidant, scary health messages can actually backfire.
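For the statistically minded, a design like this - an experimental manipulation whose effect depends on a personality score - is usually tested as an interaction term in a regression model. Here's a minimal sketch in Python of that kind of analysis; the data file and column names are invented for illustration, and this is the generic approach, not necessarily the exact model Nestler and Egloff used.

```python
# A minimal sketch of a moderation analysis, assuming a hypothetical
# data file with one row per participant. Not the authors' own code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("threat_study.csv")  # hypothetical file
# scary: 0 = low-key report, 1 = scary report
# avoidance: cognitive avoidance score, mean-centred for interpretability
df["avoidance_c"] = df["avoidance"] - df["avoidance"].mean()

model = smf.ols("threat_rating ~ scary * avoidance_c", data=df).fit()
print(model.summary())
# A negative 'scary:avoidance_c' term would match the pattern above:
# the scarier message lowers threat ratings as avoidance increases.
```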

'Practically, our results suggest that instead of giving all individuals the same threat communications, messages should be given that are concordant with their individual characteristics,' Nestler and Egloff said. 'Thus, the present findings are in line with the growing literature on tailoring interventions to individual characteristics, and they highlight the role of individual differences when scary messages are used.'
_________________________________

Nestler, S., & Egloff, B. (2010). When scary messages backfire: Influence of dispositional cognitive avoidance on the effectiveness of threat communications. Journal of Research in Personality, 44(1), 137-141. DOI: 10.1016/j.jrp.2009.10.007

Also on the Digest:
-Morbid warnings on cigarette packs could encourage some people to smoke.
-How to promote the MMR Vaccine.
-Public health leaflets ignore findings from health psychology.

Research Digest voted best psychology blog

I'm thrilled to announce that the Research Digest blog has been voted the best psychology blog in the inaugural Research Blogging Awards. My thanks to everyone who voted for the Digest, to the awards founder Dave Munger and to the sponsors at Seed Magazine. Congratulations to the winners in the other categories, especially to Ed Yong of Not Exactly Rocket Science, who won the overall research blog of the year.

The Digest blog is mentioned in this podcast about the awards.

Large, longitudinal study finds tentative links between internet use and loneliness

Internet use is growing at a phenomenal rate and much ink has been spilled by commentators forecasting the psychological consequences of all this extra web-time. A lot of that comment is mere conjecture whilst many of the studies in the area are cross-sectional, with small samples, producing conflicting results. The latest research contribution comes from Irena Stepanikova and her colleagues and involves a massive sample, some of whom were followed over time. The results suggest that more time on the internet is associated with increased loneliness and reduced life satisfaction. However, it's a complicated picture because the researchers' different outcome measures produced mixed results.

Over thirteen thousand people answered questions about their internet use, loneliness and life satisfaction in 2004 and in 2005. They'd been chosen at random from a list of US land-line numbers. The majority of the people quizzed in 2004 were different from those quizzed in 2005, but 754 people participated in both phases, thus providing some crucial longitudinal data.

An important detail is that the researchers used two measures of internet use. The first 'time-diary' method required participants to consider six specific hours spread out over the previous day and to estimate how they'd spent their time during those hours. The other 'global recall' measure was more open-ended and required participants to consider the whole previous twenty-four hours and detail as best they could how they'd used that time.

The cross-sectional data showed that participants who reported spending more time browsing the web also tended to report being lonelier and being less satisfied with life. This association was larger for the time-diary measure. The strength of the association was modest, but to put it in perspective, it was five times greater than the (inverse) link between loneliness and amount of time spent with friends and family. Turning to web-communication, the global recall measures showed that time spent instant messaging, in chat rooms and news groups (but not email) was associated with higher loneliness scores. For the time-diary measure, it was increased email use that was linked with more loneliness.

The longitudinal data showed that as a person's web browsing increased from 2004 to 2005, their loneliness also tended to increase (based on the global recall measure only). Both measures showed that increased non-email forms of web communication, including chat rooms, also went hand in hand with increased loneliness. Finally, more web browsing over time was linked with reduced life satisfaction by the time-diary measure, whilst more non-email web communication over time was linked with reduced life satisfaction by the global recall measure.

Perhaps the most important message to come out of this research is that the results varied with the measure of internet use that was used - future researchers should take note. The other message is that more time browsing and communicating online appears to be linked with more loneliness; the two even increase together over time. However, it is important to appreciate that we don't know the direction of causation. Increased loneliness may well encourage people to spend more time online, rather than web time causing loneliness. Or some other factor could be causing both to rise in tandem. It's worth adding too that the web/loneliness link held even after controlling for time spent with friends and family. So if more web use were causing loneliness, it wasn't doing it by reducing time spent socialising face-to-face.
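As an aside for readers unsure what 'controlling for' amounts to in practice: it usually means including the other variable in the same regression model, so the web-time coefficient reflects what's left over once face-to-face time is accounted for. A minimal sketch, with a hypothetical data file and variable names:

```python
# Sketch of 'controlling for' face-to-face time, on hypothetical data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("internet_survey.csv")  # hypothetical file
model = smf.ols("loneliness ~ web_hours + family_friend_hours", data=df).fit()
print(model.params["web_hours"])
# If this coefficient remains positive and reliable, the web/loneliness
# link isn't explained by web users simply socialising less in person.
```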

'We are hopeful that our study will stimulate future research ... ,' the researchers said, 'but at this point any claims suggesting that as Internet use continues to grow in the future, more people will experience loneliness and low life-satisfaction would be premature.'
_________________________________

Stepanikova, I., Nie, N., & He, X. (2010). Time on the Internet at home, loneliness, and life satisfaction: Evidence from panel time-diary data. Computers in Human Behavior, 26(3), 329-338. DOI: 10.1016/j.chb.2009.11.002

Extras

Eye-catching studies that didn't make the final cut:

Many of us misunderstand mirrors.

How to Gain Eleven IQ Points in Ten Minutes: Thinking Aloud Improves Raven's Matrices Performance in Older Adults.

Well-Being Is Related to Having Less Small Talk and More Substantive Conversations.

Stress broadens men's sexual tastes.

Exposure to the letter 'F' impairs students' performance. The letter 'A' has the opposite effect.

Optimism boosts the immune system.

The adverse effects of using social norms in health communication.

Cameron's body language rated as more attractive and younger than Brown's.

We persist in putting men's names before women's.

Brain scan reads people's episodic memories.

Personality impressions associated with four distinct humor styles.

People act less altruistically and are more likely to cheat and steal after purchasing green products than after purchasing conventional products. [Open Access]

The multiple meanings of “neuro” in neuropsychology.

Last year, functional magnetic resonance imaging made its debut in court. Virginia Hughes asks whether the technique is ready to weigh in on the fate of murderers. [Open Access]

Moms do badly, but grandmas do worse: The nexus of sexism and ageism in children's classics.

Can't wait for the fortnightly round-up of 'Extras'? Several of these links appeared first on the editor's Twitter feed.

A social version of a basic cognitive mechanism

We're slower to direct our attention to the same location twice in succession, a well-established phenomenon that cognitive psychologists call 'inhibition of return' (IoR). It's thought the mechanism may act to make our search of the visual scene more efficient by deterring us from looking at the same spot twice. Now Paul Skarratt and his colleagues have documented a new 'social' form of inhibition of return, in which people are slower to attend to a location that social cues, such as gaze direction, suggest another person has already attended to.

Twelve participants sat at a table with an animated character projected opposite. Each participant and their animated partner had two lights and two buttons in front of them, near the middle of the table (see figure above). One light/button pair was to the left, the other pair was to the right. The basic task was to press the corresponding button as fast as possible when its light came on. Participants were slower to respond to a light when the animated partner had just responded to the adjacent light on their side of the table - this is what you might call a weak version of social inhibition of return. However, when two large vertical barriers were put up with a gap in the middle, so that the participants could only see their partner's eyes and initial reaching action, and not their actual button presses, this social IoR disappeared.
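In case it helps to see what the effect amounts to numerically, social IoR in a set-up like this boils down to a difference between mean reaction times. Here's a minimal sketch of that computation on invented trial data (the numbers are ours, not the study's):

```python
# Hypothetical trial data: reaction time in ms, and whether the target
# light was on the side the partner had just responded to.
import pandas as pd

trials = pd.DataFrame({
    "rt_ms":     [412, 398, 455, 430, 441, 405, 467, 415],
    "same_side": [False, False, True, True, True, False, True, False],
})

means = trials.groupby("same_side")["rt_ms"].mean()
ior_effect = means[True] - means[False]  # positive = slower to 'return'
print(f"social IoR effect = {ior_effect:.0f} ms")
```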

In a second experiment, the animated partner was replaced with a human. This time, the social IoR effect occurred even when the barriers were erected and only the partner's eye gaze and initial hand movement could be seen. In other words, inferences about where the partner was going to attend, based on their eyes or early hand movement, seemed to be enough to inhibit a participant's own attention to the same location. For some reason, this strong version of social IoR only occurred with a real, human partner, not the animated, computer-controlled partner of the first experiment.

The final experiment added yet another visual barrier, which left only the partner's eyes or only their early hand movement visible. This was to try to establish which cue was the more important for provoking social IoR. The answer was that both cues were equally effective.

It's only supposition at this stage, but Skarratt and his team think social IoR could be supported by the postulated mirror neuron system. Monkey research has shown, for example, that there are mirror neurons in the premotor cortex that fire both when a monkey sees someone grasp an object and when it sees only the initial part of that grasping movement.

'Although the critical mechanisms underlying social IoR remain to be discovered,' the researchers said, 'the current study indicates that it can be generated independently of direct sensory stimulation normally associated with IoR, and can occur instead on the basis of an inference of another person's behaviour.'
_________________________________

Skarratt, P., Cole, G., & Kingstone, A. (2010). Social inhibition of return. Acta Psychologica, 134(1), 48-54. DOI: 10.1016/j.actpsy.2009.12.003

Figure courtesy of Paul Skarratt.

The sight of their own blood is important to some people who self-harm

The sight of their own blood plays a key role in the comfort that some non-suicidal people find in deliberately cutting themselves. That's according to a new study by Catherine Glenn and David Klonsky that suggests it is those self-harmers who have more serious psychological problems who are more likely to say the sight of blood is important.

There are plenty of anecdotal reports hinting at the importance of the sight and taste of blood to self-harmers, as well as references in popular music. 'Yeah you bleed just to know you're alive,' sing the Goo Goo Dolls in 'Iris'. 'I think it's time to bleed, I'm gonna cut myself and watch the blood hit the ground,' sing Korn on 'Right Now'. However, this is the first systematic investigation of the topic.

Glenn and Klonsky recruited 64 self-harmers from a mass screening of 1,100 new psychology students. The students (average age 19; 82 per cent female) answered questions about their self-harming and other psychological problems, and reported specifically on the importance of the sight of blood.

Just over half the participants said that it was important to see blood when they self-harmed, with the most common explanation being that it helps relieve tension and induces calmness. Other explanations were that it 'makes me feel real' and shows that 'I did it right/deep enough'.

The participants who said blood was important didn't differ in terms of age and gender from those who said it wasn't. However, the blood-important group reported cutting themselves far more often (a median of 30 times compared with 4 times) and they were more likely to say they self-harmed as a way of regulating their own emotions. The blood-important group also reported more symptoms consistent with bulimia nervosa and borderline personality disorder.
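A quick methodological aside: counts like these are heavily skewed, which is why medians (30 vs 4) rather than means are the sensible summary, and why a rank-based test is the natural comparison. Here's a minimal sketch with invented per-participant counts; only the logic, not the data, mirrors the study:

```python
# Mann-Whitney U test on invented, skewed cutting-frequency counts.
from scipy.stats import mannwhitneyu

blood_important     = [12, 25, 30, 30, 48, 90, 150]  # hypothetical
blood_not_important = [1, 2, 4, 4, 6, 10, 15]        # hypothetical

u, p = mannwhitneyu(blood_important, blood_not_important,
                    alternative="two-sided")
print(f"U = {u}, p = {p:.4f}")
```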

'Overall, these results suggest that self-injurers who report it is important to see blood are a more clinically severe group of skin-cutters,' the researchers said. 'Therefore, a desire to see blood during non-suicidal self-injury may represent a marker for increased psychopathology.'

Glenn and Klonsky said more research was needed to find out why the sight of blood has the significance it does for some people who self-harm. However, they surmised that the sight of one's own blood could, after an initial rise in heart-rate, lead to a rebound effect characterised by reduced heart-rate and feelings of calmness.
_________________________________

Glenn, C., & Klonsky, E. (2010). The role of seeing blood in non-suicidal self-injury. Journal of Clinical Psychology. DOI: 10.1002/jclp.20661

How to give advice

Information, information, information. That's the message from one of the first studies to look at people's preferences for different forms of advice. Reeshad Dalal and Silvia Bonaccio presented hundreds of students with fictional decision-making scenarios, such as choosing which job to apply for. The students were offered various permutations of advice and asked to say how satisfied they'd be if a friend had given them that advice. The different kinds of advice were: which option to go for; which option not to go for; info on how to make the decision (e.g. use a points allocation system); information on one or more of the options; and sympathy about the difficulty of making a decision. Whilst all forms of advice were positively received, the students' consistent preference was for information about one or more of the options.

A second study spiced things up by introducing more varied decision-making scenarios: where to locate a new store; how to lay off excess staff; and how to invest some inheritance. A fresh batch of students were presented with the new scenarios and this time they were to imagine they'd solicited the advice from an expert, rather than a friend, to see if this made any difference to their responses. Information again came out as the most preferred form of advice. However, this time round, specific advice on which option to go for was also particularly well received, especially in the investment scenario.

The researchers said past research on advice giving has tended to focus purely on advice in the form of 'I recommend option X', so this study makes a novel contribution. 'Across the situational and dispositional variables we examined, decision-makers appeared to want their advisors to provide information about the alternatives,' the researchers said. Advice that says 'go for option X' can also be well-received but only in specific circumstances, such as when advice has been explicitly solicited from an expert.

When it comes to lessons for real life, Dalal and Bonaccio said more research was needed to see how their results generalise, but in the meantime they advised: 'Individuals who are advising decision-makers should at the very least be careful to provide information along with their recommendations.'
_________________________________

Dalal, R., & Bonaccio, S. (2010). What types of advice do decision-makers prefer? Organizational Behavior and Human Decision Processes. DOI: 10.1016/j.obhdp.2009.11.007

Related Digest item: 'We're more likely to listen to expensive advice'.

The Special Issue Spotter

We trawl the world's journals so you don't have to:

The Cognitive Neuroscience of Aging (Cortex).

Emotional States, Attention, and Working Memory (Cognition and Emotion). [Includes free intro]

Social Anxiety in Childhood: Bridging Developmental and Clinical Perspectives (New Directions for Child and Adolescent Development).

Emotional intelligence (Australian Journal of Psychology).

Celebrating 25 years of the journal Sexual and Relationship Therapy (Sexual and Relationship Therapy).

Disclosure of their diagnosis impairs the social functioning of people with schizophrenia

People don't need to be treated as a stereotype for harm to occur; their mere belief that they could be viewed in a stereotyped fashion is enough - a phenomenon known as 'stereotype threat'. For example, women reminded of the stereotype that men are better at maths tend to perform more poorly in a subsequent maths task, even if they are actually treated fairly. Now Julie Henry and colleagues have extended this line of research to the domain of mental health. They've found that patients with a schizophrenia diagnosis function less well socially, when they think that the person they're chatting with knows their diagnosis.

Thirty people diagnosed with schizophrenia or schizoaffective disorder spent a few minutes chatting on their own to one research assistant, and then did the same with another assistant an hour later. There were a few points of deception: first, the participants were led to believe that the assistants were participants from another study. Most importantly, before one of the conversations began, they were told that the assistant knew about their diagnosis of schizophrenia; before the other, they were told the assistant did not know. They were also told, truthfully, that neither of the people they were to chat with had a diagnosis of schizophrenia themselves.

In reality, the research assistants didn't know whether each participant had a diagnosis of schizophrenia or not. This was achieved by having them chat to the participants diagnosed with schizophrenia plus a number of control participants. Crucially, they weren't told in advance who was who.

After each conversation, the research assistants rated the social behaviour of the person they'd just chatted with. The participants in turn rated the behaviour of the assistant they'd just chatted with and they said how they felt the conversation had gone.

The key finding is that the social functioning of the participants with schizophrenia seemed to deteriorate when they thought their conversational partner knew their diagnosis (even though, in reality, the partner didn't). Specifically, when they thought their diagnosis had been disclosed, the participants were rated by the research assistants as being more impaired at initiating conversations and at switching topics appropriately, and the assistants also found these conversations less comfortable.

Henry's team can't be sure, but they think these apparent deficits emerged because the participants' concern about how they would be judged, in light of their diagnosis having been disclosed, interfered with their ability to converse in a more effective manner.

A further twist was that the participants with schizophrenia seemed unaware of these effects - they reported finding the conversations, in which they thought their diagnosis was known, just as comfortable and successful as when they thought their diagnosis had been kept hidden. This contrasts with non-clinical research on stereotype threat, in which people seem to be aware of the effects on their performance.

The results provide food for thought regarding when and how mental health diagnoses should be disclosed. The researchers said their findings suggest 'that one of the defining qualities of [schizophrenia] - social skill impairment - is not caused solely by the disorder per se, but rather, also derives from feelings of being stereotyped.'
_________________________________

Henry, J., von Hippel, C., & Shapiro, L. (2010). Stereotype threat contributes to social difficulties in people with schizophrenia. British Journal of Clinical Psychology, 49(1), 31-41. DOI: 10.1348/014466509X421963

Thirty years on - the babies judged negatively by their mothers

If a mother has a negative perception of her baby when it's just one month old, there's a strong possibility that same baby will have attachment problems as an adult, thirty or forty years later. That's the claim of a longitudinal study that recommends screening new mothers to see if they have a negative perception of their child, so that any necessary action can be taken to stop the transmission of attachment problems from mother to child.

Elsie Broussard and Jude Cassidy recruited twenty-six adults in the area of Pittsburgh, whose mothers had signed up to a longitudinal study up to forty years earlier. Back then, in the 60s and 70s, the mothers had been asked to rate their one-month-old babies on factors like crying, spitting, sleeping, feeding and predictability, and then do the same for the 'average baby'. Twelve of the babies were judged to be at risk because their mothers had rated them more negatively than an average baby. Back to the present, and the researchers interviewed the adults using the Adult Attachment Interview, which includes questions about memories of their childhood, their memories of separation and loss and whether they felt affected by their parents' behaviour. Based on these kinds of questions, the participants were classified as being securely or insecurely attached, the latter classification suggesting that they have ongoing problems forming healthy emotional attachments to other people.

The key finding is that 9 of the 12 adults who, so many years earlier, had been perceived negatively by their mothers were today classified as insecurely attached adults, compared with just 2 of the 14 adults who'd been positively perceived by their mothers. '...These findings reflect transmission from one individual's representational world to that of another,' the researchers said. In other words, the researchers believe that a mother who views her baby negatively has attachment problems, and these problems tend to be passed on to that baby, even affecting his or her attachment style thirty or forty years later.
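With a sample this small, it's fair to ask how much weight the 9-of-12 versus 2-of-14 split can bear. Fisher's exact test is the standard check for a 2x2 table like this; the counts below come straight from the article, though the test choice is ours, not necessarily the authors':

```python
# Fisher's exact test on the counts reported above (our re-analysis,
# not the authors' statistics).
from scipy.stats import fisher_exact

table = [[9, 3],    # negatively perceived: insecure, secure
         [2, 12]]   # positively perceived: insecure, secure

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.0f}, p = {p_value:.4f}")
# The association is strong, but with n = 26 the confidence interval
# around that odds ratio would be very wide.
```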

How could a negative attachment style be transmitted in this way? Apparently, earlier work in Broussard's lab showed that 'mothers with a negative perception of their infants had limited awareness of their infant's states, had difficulties recognising their infant's signals, and lacked a flexible and effective range of responses.' Moreover, the researchers surmised, babies with mothers who perceive them negatively may fail to come to see their mother as a secure base and may come to feel 'rejected and unloved, feelings that may contribute to an insecure state of mind [in adulthood] with respect to attachment.' Given their results, Broussard and Cassidy suggested more professional support be given to new mothers, especially during the critical early period between hospital discharge and the next contact with medical staff.

As with so many studies that look for effects of parenting on children, this study contains a serious confound that's barely touched upon by the researchers. The effects that Broussard and Cassidy attribute to parenting and attachment style could well be genetic. We're not surprised when the children of tall parents grow up to be tall. Perhaps we shouldn't be surprised that the children of insecurely attached parents grow up to be insecurely attached themselves.
_________________________________

Broussard, E., & Cassidy, J. (2010). Maternal perception of newborns predicts attachment organization in middle adulthood. Attachment & Human Development, 12(1), 159-172. DOI: 10.1080/14616730903282464

Reminder of disease primes the body and mind to repel other people

When it comes to avoiding infection, a growing body of evidence suggests we don't just have a physiological immune system, we also have a behavioural immune system - one that alerts us to people likely to be carrying disease, and that puts us off interacting with them. Indeed, there's research showing that people who are more fearful of disease tend to hold more xenophobic attitudes and to display greater prejudice towards people with outwardly visible disabilities. Now Chad Mortensen and his co-workers have extended this line of research by showing that a disease-themed slide show makes people feel less sociable and extravert, and primes their motor system for repelling other people.

In the first study, half of 59 participants watched a disease- and infection-themed slide show before completing a measure of their own personality. The other participants watched a slide show about architecture before doing the same. The researchers took pains to conceal the true purpose of the study: they asked participants to rate the slide shows' usefulness for another project and they had them answer irrelevant questions. The key finding was that participants who watched the disease slide show subsequently rated themselves as less extravert than did the control participants. Also, among those participants who scored highly on a measure of fear of disease, those who watched the infection slide show rated themselves afterwards as less open to experience and less agreeable. Taken together, this suggests that reminders of disease make us view ourselves as less outgoing and gregarious, especially if we're the kind of person who's already fairly neurotic about infection.

If these effects are real, you'd expect them to show up in actual behaviour. The second study tested that by having participants watch one of the slide shows before completing a computer task. The task involved faces and shapes flashing on a screen and participants responding with a button press that required either an extension or a contraction of the arm. The take-home finding here was that participants who watched the disease slide show were quicker at the button presses that required them to extend their arm - the same muscle action that would be required to push someone away. This effect was particularly strong among those participants who were more scared of infection. Again, cover stories were used to conceal the true purpose of the study.

'...It appears that humans have evolved a mechanism that responds to environmental cues of disease and modulates attitudes and behaviours in functionally appropriate ways,' the researchers said.

Looking to the future, Chad Mortensen and his colleagues added that it would be interesting to see if there could be a reverse effect in conditions in which risk of infection appeared to be absent. In this case, people normally afraid of infection might become particularly extravert and sociable.
_________________________________

Mortensen, C., Becker, D., Ackerman, J., Neuberg, S., & Kenrick, D. (2010). Infection breeds reticence: The effects of disease salience on self-perceptions of personality and behavioral avoidance tendencies. Psychological Science, 21(3), 440-447. DOI: 10.1177/0956797610361706

Related open-access article in The Psychologist magazine: 'Parasites, minds and cultures'.

Extras

Eye-catching studies that didn't make the final cut:

A Systematic Review of Six Decades of Research in Psychopharmacology. Verdict: Huge improvements over time, but could still do better.

When exposed to scary stimuli, anxious people experience a more pronounced sense of time slowing down than do calmer people.

The man who lost his ability to see colours, but didn't know it. Or, in the scientific jargon: 'Anosognosia for cerebral achromatopsia'.

The neuroscience of human intelligence differences.

Women's faces judged more attractive when making eye contact rather than looking away.

You know when the sun makes you sneeze? This study uses EEG to look at the neural basis of this phenomenon.

New Lancet review of the placebo effect.

'Touch establishes powerful physical and emotional connections between infants and their caregivers, and plays an essential role in development' Systematic review of infant massage.

Meta-analysis finds violent videogames can lead to increased aggression.

The remote rural community that thinks letting someone die is as bad as killing them

In recent years, cognitive scientist Marc Hauser has gathered evidence that suggests we're born with a moral instinct. This moral intuition has been likened to the universal grammar that Chomsky famously suggested underlies our linguistic abilities - certain principles are set in stone, whilst the precise parameters can be set by culture. Thousands of people from multiple countries and different religions and demographic backgrounds have given their verdict on fictional scenarios presented online and from this Hauser has identified some potential moral universals [try out the moral tests for yourself].

One of these near-universal principles is that most people think it is worse to deliberately cause someone harm in order to achieve a greater good than it is to cause some harm as a side-effect in pursuit of the greater good. Think of deliberately pushing a man into the path of a runaway lorry to save a crowd, as opposed to shouting at the lorry driver, such that he swerves away from the crowd but instead crashes into and kills a man on the pavement.

Another is that most people think actions that lead to harm are worse than omissions (i.e. not doing something) that lead to harm. Think of a doctor killing a patient with a lethal dose, as opposed to letting them die by not administering a life-saving drug.

Finally, most people think harm delivered via direct physical contact - for example, pushing someone to their death - is worse than harm delivered at a distance - for example, via a trap.

Most people match this pattern of responding but so far most participants have been from urban, technologically advanced cultures. Now Marc Hauser and his colleague Linda Abarbanell have translated these kinds of moral scenarios and taken them to a rural Mayan community in the highlands of Chiapas in Mexico.

The rural Mayans showed the usual bias for seeing harm caused deliberately in pursuit of a greater good as more forbidden than harm caused as a side-effect in pursuit of that same greater good. But Abarbanell and Hauser's breakthrough finding is that the rural Mayans didn't believe that harm caused by direct contact was worse than indirect harm and they didn't think active harmful acts were morally worse than harmful acts of omission.

The researchers don't think these differences emerged because of translation problems. Choosing to focus on the omission/active harm type situation, the researchers tried out several different scenarios, including one designed for use with children, and always the results were the same. The rural Mayans saw agents as more causally responsible for active harm, they just didn't see them as more morally blameworthy. Moreover, when Abarbanell and Hauser tested a more urban Mayan population, they did show the usual tendency to see harmful acts of omission as less bad, thus suggesting that this difference in moral judgment is specific to the rural community.

The rural Mayans have no specific law forbidding 'looking the other way', so why should they differ on this key moral principle? The researchers said more research is needed, but they think it probably has to do with the 'highly intertwined social relations and their associated obligations' in rural Mayan society. Future studies could look to see if the omission/active harm distinction is missing from other small-scale, close-knit societies.

'Ultimately,' Abarbanell and Hauser concluded, 'this research may suggest that some psychological distinctions are moral absolutes, true in all cultures, whereas others may be more plastic, relative to a culture's social dynamics, mating behaviour and belief systems.'
_________________________________

Abarbanell, L., & Hauser, M. (2010). Mayan morality: An exploration of permissible harms. Cognition. DOI: 10.1016/j.cognition.2009.12.007

Image credit: Wikimedia Commons.

We're slower at processing touch-related words than words related to the other senses

People are slower at responding to tactile stimuli than to input from the other senses. It's not immediately obvious why this should be. It's unlikely to be for mechanical reasons: the retina in the eye is slower at converting input into a neural signal than is the skin. Psychologists think the answer may have to do with attention. Perhaps we're not so good at keeping our attention focused on the tactile modality compared with the others. Now Louise Connell and Dermot Lynott have added to the picture by showing that the tactile disadvantage extends to the conceptual domain. That is, we seem to be slower at recognising when a word is tactile in nature than we are at recognising whether words are visual, or to do with taste, sound, or smell.

The researchers had dozens of participants look at words on a screen, presented one at a time, and press a button to say if they were related to the tactile modality (e.g. 'itchy') or not. Some words were tactile-related whilst others were fillers and related to the other senses.

The same task was then repeated but with participants judging whether the words were visual-related, auditory and so on, with each sense dealt with by a new block of trials. The key finding is that participants were much slower at this task in the tactile condition than for the other senses. This was the case even when words were presented for just 17ms, which is too fast for conscious detection but long enough for accurate responding.
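Since each participant completed blocks for every modality, the tactile disadvantage would be tested within subjects. Here's a minimal sketch of such a comparison with invented per-participant mean reaction times; only the logic mirrors the study:

```python
# Paired comparison of hypothetical mean RTs (ms) from the visual and
# tactile blocks for the same six participants. Data are invented.
from scipy.stats import ttest_rel

visual_rt  = [612, 589, 655, 630, 601, 644]
tactile_rt = [698, 671, 730, 702, 689, 741]

t, p = ttest_rel(tactile_rt, visual_rt)
print(f"t = {t:.2f}, p = {p:.4f}")  # positive t = tactile block slower
```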

To make sure the slower performance in the tactile condition wasn't to do with the response requiring a button press (which inevitably causes tactile stimulation), the researchers repeated the experiment with vocal responding via a microphone. The results were pretty much the same.

Ensuring they left no stone unturned, Connell and Lynott also conducted a final experiment to check that there isn't something about tactile words, besides their touchiness, that makes them slower to process. To do this they used words that have both visual and tactile qualities - examples include shaggy and spiky - and they mixed these in among filler words that related to the other senses. The same words were used in the tactile condition (in which participants had to say whether each word was tactile-related or not) and a visual condition. Once again, participants were significantly slower in the tactile condition.

Connell and Lynott say their findings provide further evidence for the tactile sense having a processing disadvantage relative to the other senses. They think this is because there's little evolutionary advantage to sustaining attention to the tactile modality whereas there are obvious survival advantages with the other senses, for example: '...in hunting, where efficacious looking, listening and even smelling for traces of prey could afford an advantage.' You may think of pain and damage detection as reasons for paying sustained attention to the tactile domain, but remember these are served by spinal reflexes. 'We do not wait for the burning or stinging sensation to register with the attentional system before responding,' the researchers said.
_________________________________

Connell, L., & Lynott, D. (2010). Look but don't touch: Tactile disadvantage in processing modality-specific words. Cognition, 115(1), 1-9. PMID: 19903564

Darkness encourages unethical behaviour even when it makes no difference to anonymity

Imagine a man sits alone, hunched over his desk, fingers tapping out a project progress report to his boss. Does he decide to lie? If I told you that the sun had nearly set, filling the man's room with darkness, would that make any difference to your answer? It should do. A new study suggests that darkness encourages cheating, even when it makes no difference to anonymity.

Chen-Bo Zhong and colleagues had dozens of undergrad students complete a basic maths task against a time limit. Afterwards they had to fill in an anonymous form indicating how many items out of twenty they'd answered correctly and they had to take a monetary reward from an envelope (up to twelve dollars) in line with their performance. Half the students completed the task in a dimly lit room (though still light enough to see each other) whilst the other half completed the task in a bright room.

A surreptitious coding system allowed the researchers to match up the students' self-completed scoring cards with their actual performance. You guessed it, the students in the dimly lit room tended to exaggerate their performance more than the control group in the bright room (by an average of 4.21 items vs. 0.83 items). Another way of looking at it is that 60.5 per cent of participants in the dim room exaggerated their performance compared with just 24.4 per cent of participants in the bright room.
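For anyone wanting to gauge whether a gap like 60.5 versus 24.4 per cent is statistically reliable, the usual check is a chi-squared test on the cell counts. The group sizes below are assumptions chosen so the proportions match those reported above (the paper's exact counts aren't given here), so treat the output as purely illustrative:

```python
# Chi-squared test on assumed cell counts matching the reported
# percentages (23/38 = 60.5%, 11/45 = 24.4%). Illustrative only.
from scipy.stats import chi2_contingency

#               exaggerated, honest
dim_room    = [23, 15]
bright_room = [11, 34]

chi2, p, dof, expected = chi2_contingency([dim_room, bright_room])
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```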

In the same way that young kids think they are invisible when they cover their eyes, Chen-Bo Zhong's team think the effect they observed occurs as an automatic response to the cover of darkness, even when the lack of light makes no difference to anonymity.

A second study supported this interpretation, finding that student participants wearing sun-glasses chose to share money less fairly in a computer-based economic game than did students wearing normal glasses. Again, the subjective reduction in light made no difference to actual anonymity as the game was played entirely via computer with a partner who participants thought was in another room. The students who said they felt more anonymous tended to share the least money, thus suggesting that perceived anonymity was mediating the effect of darkness on behaviour.

'Darkness appears to induce a false sense of concealment, leading people to feel that their identities are hidden,' the researchers said. The next time you're deliberating over a moral issue, you might want to think about whether you've got the lights on or not!
_________________________________

Zhong, C., Bohns, V., & Gino, F. (2010). Good lamps are the best police: Darkness increases dishonesty and self-interested behavior. Psychological Science. DOI: 10.1177/0956797609360754

The Special Issue Spotter

We trawl the world's journals so you don't have to:

Delusion and Confabulation (Cognitive Neuropsychiatry).

Formal modeling of semantic concepts (Acta Psychologica).

Silence and Memory (Memory). From the editorial: 'As memory researchers particularly, and psychological researchers more broadly, we often focus our observations on what is present—what is expressed, what is rehearsed, what is reported. This special issue of Memory focuses on silence and its implications for memory, and also for the implications of silences that extend beyond memory, to the functioning of individuals, groups, and societies.' Sounds intriguing.

Language, Communication and Schizophrenia (Journal of Neurolinguistics).

New Aspects in the Treatment of Affective Disorders (The International Journal of Neuropsychopharmacology). Open Access issue.

Hour-glass figure activates the neural reward centre of the male brain

There's little doubt that many conceptions of attractiveness are faddish - the size zero female model being an obvious example. However, other notions of beauty are more hard-wired, perhaps reflecting an evolutionary adaptation. These aspects of appearance have come to be associated with fertility, signifying 'reproductive fitness' to potential mates. Male facial symmetry is one example. Another is the hour-glass female form. Men in cultures across the world report a preference for women with a lower waist-to-hip ratio. And women with this body shape tend to be more fecund.

Now Steven Platek and Devendra Singh have provided brain imaging evidence to complete the picture. They've shown that the reward centres of men's brains fired up in response to the sight of naked women who'd chosen to have cosmetic surgery to accentuate the curviness of their figures. By contrast, changes to the women's body mass index - including increased slimness - had no such effect. Platek and Singh said the finding could explain 'some men's proclivity to develop preoccupation with stimuli depicting optimally designed women' - i.e. porn. The weaker neural response to slimness, by contrast, suggests 'BMI's role in [attractiveness] evaluations is less the product of evolved psychological mechanisms and more the part of culturally driven, or societal based norms and perceptions.'

Platek and Singh made their observations after asking men to look at photographs of women taken before and after the women had undergone surgery in pursuit of an hour-glass figure. The post-op pictures triggered more brain activity in reward-evaluation areas such as the orbitofrontal cortex. The surgery had the effect of lowering the women's waist-to-hip ratio, and there were also slight changes to their BMI scores. The former change was associated with more reward-related activity in the men's brains, whereas changes to BMI were associated only with activity adjustments in lower-level visual brain areas. Finally, increases in the attractiveness ratings given by the men to the post-op pictures were associated with activation in neural reward areas, such as the nucleus accumbens, which are also involved in drug-based reward and craving.
_________________________________

Platek, S., & Singh, D. (2010). Optimal waist-to-hip ratios in women activate neural reward centers in men. PLoS ONE, 5(2). DOI: 10.1371/journal.pone.0009042

Can therapists tell when their clients have deteriorated?

About five to ten per cent of the time, people in therapy get worse instead of better. What should psychotherapists do in such cases? Hang on a minute. There's no point answering that question unless therapists can recognise that a client has deteriorated in the first place. A new study tackles this precise issue, finding, rather alarmingly, that the vast majority of therapists appear blind to client deterioration.

Derek Hatfield and colleagues took advantage of therapy outcome data gathered at a student counselling centre where clients provided symptom feedback prior to each weekly session. Although placed on record, this outcome data wasn't fed back to the therapists in a systematic way and there was no alert in place to signal symptom deterioration (as an aside, past research shows such systems hugely improve therapy outcomes). Rather, the therapists, the majority of whom had PhDs or were in doctoral training, had to rely on their own judgment.

Hatfield's team identified 70 clients who at one particular session were in significantly worse shape compared with their state before entering therapy, prior to the very first session. The researchers then scrutinised clinical notes made by the therapists after each session to see if, at the appropriate session, they'd made any reference to their clients' worsened state. Here's the shocker: in only 15 of these 70 cases had the therapists made a clinical note after the relevant session suggesting they had noticed a deterioration.

Therapists often have massive caseloads, and in some cases the deterioration could have occurred some weeks after the opening session, so perhaps it is no wonder that most therapists struggled to notice negative change. To make things easier, Hatfield and his co-workers returned to the database and focused on just those cases where a client had shown a huge deterioration from one session to the next. Unfortunately, it's still bad news: of these 41 cases, the therapists' notes suggested they had noticed only 13.
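To put some uncertainty bounds on those detection rates (15 of 70, and 13 of 41), one can compute confidence intervals for the proportions. A minimal sketch follows; the Wilson interval is our choice of method, not the authors':

```python
# Wilson 95% confidence intervals for the detection rates above.
from statsmodels.stats.proportion import proportion_confint

for label, noticed, total in [("all deteriorations", 15, 70),
                              ("sudden deteriorations", 13, 41)]:
    lo, hi = proportion_confint(noticed, total, alpha=0.05, method="wilson")
    print(f"{label}: {noticed}/{total} = {noticed/total:.0%} "
          f"(95% CI {lo:.0%} to {hi:.0%})")
```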

The question of what psychotherapists should do when a client deteriorates is for fuller discussion another day. However, Hatfield did touch on this. On those occasions when therapists had noted a client deterioration, Hatfield's team looked to see what the noted course of action had been. The most common choices were referral for medication and continuing as usual. Hatfield then surveyed hundreds of APA-registered psychological therapists about what they would do, hypothetically speaking, if they had a client who'd deteriorated. Revealingly, among the 36 who replied, popular answers included 'discuss the deterioration with the client' and 'increase therapy sessions'. Worryingly perhaps, these suggestions were noticeably absent from the real-life case notes.

This research comes with a major caveat - dependence on therapists' clinical notes is a far from perfect indicator of whether or not they noticed client deterioration. Still, you'd expect a significant worsening, if noticed, to be noted. The researchers said: 'It is hoped that therapists will be open to the idea that additional information concerning client progress will enhance their clinical judgment, particularly concerning potential client deterioration.'
_________________________________

Hatfield, D., McCullough, L., Frantz, S., & Krieger, K. (2009). Do we know when our clients get worse? An investigation of therapists' ability to detect negative client change. Clinical Psychology & Psychotherapy. DOI: 10.1002/cpp.656