Kate Harkness and colleagues asked 43 depressed and non-depressed students to identify people's emotions from pictures that showed only the eye region of their faces. The 16 students who were classified as mildly to moderately depressed – based on their score on the Beck Depression Inventory – performed significantly better (78 per cent correct) on this emotion-recognition test relative to the 27 non-depressed controls (69 per cent correct). The depressed students didn’t take any longer over their answers and their superior performance was not due to their being more sensitive to negative emotions only. Also, there was no difference between the groups on two control tests, one of which involved detecting people’s gender just from pictures of their eyes.
The researchers replicated their finding in a second experiment that involved a larger sample of 81 students, and which controlled for the influence of anxiety using the Mood and Anxiety Symptom Questionnaire. Again, students classified as mildly to moderately depressed were better at recognising the emotions shown in pictures of people’s eyes.
Understanding other people’s feelings requires two stages, the authors said – the ability to detect emotions, followed by the ability to interpret and reason about those emotions. The researchers believe depressed people have an enhanced ability for the first stage paired with negatively biased functioning in the second stage.
“…hypersensitivity to others’ emotional states may have pathological implications simply because by being more sensitive, dysphoric and depressed individuals have more opportunities to deploy their negative biases in interpreting fleeting emotional reactions”, the researchers said.
Harkness, K.L., Sabbagh, M.A., Jacobson, J.A., Chowdrey, N.K. & Chen, T. (2005). Enhanced accuracy of mental state decoding in dysphoric college students. Cognition and Emotion, 19, 999-1025.
Each assessor was shown 20 pairs of personal statements and referees’ reports taken from a pool of 80 applicants, and had to say in each case which belonged to a subsequently happy doctor (based on data collected in 2002) and which belonged to an applicant who had become stressed and wanted to stop being a doctor.
The researchers found that 35 expert medical school selectors performed no better than chance at this task. Nineteen doctors, 22 medical students, and 20 psychology students, who also completed the task, fared no better.
However, the assessors' judgements were not random - they tended to agree with each other and their predictions also correlated with the applicants' exam results. It seems that to predict who would become a happy doctor, the assessors were mistakenly using clues to the applicants' academic prowess that were contained in the personal statements and referees' reports.
The researchers said "...although many claims are made for the utility of the personal and referees' information [contained on application forms], we could find no evidence of the long-term predictive validity for an important outcome variable - the judgment of whether or not an applicant will be a happy and satisfied doctor, or instead will be an unhappy, stressed, burned out, dissatisfied doctor who does not enjoy their job and thinks of leaving for another career".
McManus, I.C., Iqbal, S., Ferguson, E. & Leaviss, J. (2005). Unhappiness and dissatisfaction in doctors cannot be predicted by selectors from medical school application forms: A prospective, longitudinal study. BMC Medical Education, 5:38.
Link to article in The Psychologist by Chris McManus on this topic.
Healthy participants had a painful stimulus applied to the back of their hand. At the same time they learned to use mental strategies - such as concentrating on another part of their body, or viewing the pain as a neutral experience - to control levels of activity in their anterior cingulate gyrus (a brain area known to be involved in pain perception), displayed to them live using real-time functional magnetic resonance imaging. All the while they provided a continuous rating of how painful the stimulus was.
The researchers found that the pain was perceived as being significantly less intense when the participants reduced the activity in their anterior cingulate gyrus compared with when they increased activity in that region. A similar effect was observed when the experiment was repeated with patients suffering from chronic pain - they were able to reduce their pain by lowering activity in their anterior cingulate.
A number of control conditions supported the researchers' interpretation of the results. The same control over pain wasn't shown when a different set of participants were taught the same mental strategies but without the real-time brain images; nor when participants used real-time brain images to learn to control activity in a part of the brain (the posterior cingulate) not involved in pain perception; nor when participants were given false real-time feedback of activity in their anterior cingulate. "Any effects of expectation or suggestion created by the displays themselves or by the subjects' perception of their control over brain activation were identically matched in these control subjects, who nonetheless did not show an improvement in their control over pain", the researchers said.
Now that this study has demonstrated the feasibility of using real-time brain images to help people learn to control an aspect of their behaviour or mental experience, more work is planned to test the potential benefit of this intervention with other conditions. The study also raises interesting philosophical questions. Did the images of the participants' own brains allow them to master their thoughts, or did their mental strategies allow them to control their brain activity? Which was the means and which was the end of controlling their pain?
deCharms, R.C., Maeda, F., Glover, G.H., Ludlow, D., Pauly, J.M., Soneji, D., Gabrieli, J.D.E. & Mackey, S.C. (2005). Control over brain activation and pain learned by using real-time functional MRI. Proceedings of the National Academy of Sciences, USA. In Press, DOI: 10.1073/pnas.0505210102.
The pair conducted research that involved recording the skin conductance (a measure of emotional arousal) of people who were sometimes stared at, via a live video feed, by an experimenter sat in another room. In earlier work, they found that if Marilyn Schlitz, the psychic believer, did the greeting of participants and did the staring, then participants tended to show more emotional arousal when they were being stared at. However, when Richard Wiseman, the sceptic, did the greeting and staring, evidence for the 'sense of being stared at' was not found.
In the current experiment at the Institute of Noetic Sciences in America, where Schlitz is based, the two researchers broke things down still further to isolate the source of their earlier inconsistent findings. This time, Wiseman sometimes did the greeting while Schlitz did the staring, and vice versa. However, none of these manipulations made any difference - participants didn't show more emotional arousal when they were being stared at regardless of who did the greeting or staring. Schlitz's rapport with participants and expectations of success, which were also measured, also had no association with the outcome.
The researchers concluded that the latest findings have failed to explain their earlier inconsistent results, but they said "this series of experiments demonstrates that it is possible to conduct fruitful collaborative research involving both sceptics and proponents and it offers the potential of a more productive route than more traditional forms of sceptic-proponent debate".
They added "It is hoped that the studies described here will encourage researchers working in other controversial areas (e.g. the role of trance in hypnosis, false memory syndrome, unorthodox forms of psychotherapy and complementary and alternative medicine) to engage in similar joint projects and that such work will help advance our understanding of the phenomena underlying these controversies".
Schlitz, M., Wiseman, R., Watt, C. & Radin, D. (2006). Of two minds: Sceptic-proponent collaboration within parapsychology. British Journal of Psychology. In Press, DOI: 10.1348/000712605X80704.
Four hundred and seventy-six undergrads completed a 28-item questionnaire designed to measure boredom proneness, rating their agreement or not with statements like “Having to watch someone’s home movies or travel slides bores me tremendously”. From this, the 20 most and least boredom prone students were selected.
These participants then completed a measure of their ability to rapidly reallocate their attention from one instant to the next, which required them to watch a computer monitor and note the appearance of one or two letters that appeared in streams of numbers. The brain’s limited attentional resources mean that if a second letter appeared too soon (typically within 200 to 500 ms) after the first, it would be more likely to go unnoticed - this is called the attentional blink. The participants then watched a series of illusory motion displays that showed a dot appearing to move around in a circle, and they had to say in each case how long in seconds the movement had lasted.
The size of the attentional blink was no larger in the participants prone to boredom than it was in the control participants, suggesting they were just as capable of refocusing their attention from one instant to the next. However, the boredom prone individuals were significantly less accurate at judging the duration of the illusory motion, particularly tending to overestimate its length.
The researchers said “If one’s subjective experience indicates that a task (say, reading an article on boredom) has taken less time than has really passed, that individual may be more motivated to continue reading and may in turn report the experience as a pleasant one…In other words, perhaps the reason time flies when we’re having fun is because this perception allows us to maintain attention for longer periods, which in turn allows us to see things through to completion, and perhaps results in more positive affect and enjoyment of the task itself”.
Danckert, J.A. & Allman, A.A. (2005). Time flies when you’re having fun: Temporal estimation and the experience of boredom. Brain and Cognition, 59, 236-245.
Peter Kirsch and colleagues at the National Institute of Mental Health in America scanned the brains of 15 male participants while they looked at angry and scared faces, or threatening scenes, such as a gun pointed in their direction. In a control condition they looked at simple shapes. Before the brain scanning, half the participants took five sniffs of oxytocin, the other half sniffed a placebo. The experiment was double-blind, so during the procedure neither the participants nor the researchers knew who had sniffed oxytocin and who had sniffed placebo.
As expected, when the participants who sniffed placebo looked at threatening faces or scenes, activity in their amygdalae increased relative to when they looked at simple shapes. The amygdala is an almond-shaped structure located in the temporal lobe that is known to respond to danger. However, this fear-related brain activity was significantly reduced in the participants who had sniffed oxytocin, especially when they looked at the threatening faces, suggesting oxytocin may particularly dampen down social fear. Moreover, in the participants who’d inhaled placebo, functional connectivity was detected between the amygdala and brainstem regions involved in the flight or fight response, but this connectivity was significantly reduced in participants who’d inhaled oxytocin.
There was no difference in the way the placebo participants and oxytocin participants felt according to questionnaires they completed before and after the experiment. This suggests the effect of oxytocin only becomes noticeable to the user in an actual interactive situation.
Co-researcher Dr. Andreas Meyer-Lindenberg told the Digest that oxytocin was soon to be tested as a short-term aid for people with social phobia. He said the side-effects, such as a possible headache, are weak, and that although the beneficial effects would be short-lived “that might be enough if you have a known stressor like a public speech”. Oxytocin would be “sort of a ‘social Viagra’, if you will” he said. He added that there was no evidence so far that oxytocin could lead to aggression or social disinhibition.
Kirsch, P., Esslinger, C., Chen, Q., Mier, D., Lis, S., Siddhanti, S., Gruppe, H., Mattay, V.S., Gallhofer, B. & Meyer-Lindenberg, A. (2005). Oxytocin modulates neural circuitry for social cognition and fear in humans. The Journal of Neuroscience, 25, 11489-11493.
Is it worrying that young girls don’t like their bodies, and know what a diet is?
This study aimed to find out how aware young girls (5-8 years old) are of dieting and body dissatisfaction, and how peers affect this awareness. Previous research has shown that the desire to be thinner has become so common in women that it’s considered ‘a normative discontent’.
Eighty-one girls from the first three years of two private, single-sex schools were individually interviewed about their awareness of dieting and how teasing and likeability can change in relation to body shape, size and weight. For reasons of sensitivity, all questions were designed so that the participants didn’t have to go into detail or give reasons for their answers – they were just required to say yes or no.
When the girls were asked to point to a picture of their ideal figure, all, regardless of age, chose the thinner model. However, the girls in year two had the greatest body dissatisfaction when asked to answer yes or no to questions about dieting, weight and likeability. Twenty-two per cent of the girls could fully define the word ‘diet’, most of them being the older girls. Also, all the girls were aware of how teasing and likeability can be influenced by weight and body shape.
Awareness of dieting and weight seems to appear at a young age, with peer influences and opinions having a great effect on this. “This study has confirmed that a substantial number of young girls express a wish to be thinner and are well aware of dieting as a way of achieving the thin ideal”, said the researchers Hayley Dohnt and Marika Tiggemann at Flinders University in Australia.
Dohnt, H.K. & Tiggemann, M. (2005). Peer influences on body dissatisfaction and dieting awareness in young girls. British Journal of Developmental Psychology, 23, 103-116.
Nettle and Clegg asked a sample of 425 people – including 96 recruited via adverts in art and poetry publications – to complete several questionnaires. One measured schizotypy, another asked how many sexual partners they’d had, while creativity was indicated by whether each participant was uninvolved in art, had it as a hobby, or was an amateur or professional. The schizotypy questionnaire tapped four dimensions: unusual perceptual experiences and magical thinking; difficulties concentrating; violent and reckless behaviours; and ‘introvertive anhedonia’, which is an inability to enjoy oneself combined with social withdrawal.
Nettle and Clegg found that participants who had more unusual thoughts and perceptions tended to be more creative, and in turn, people who were more creative tended to report having more sexual partners. They said this showed that in some people, schizotypic traits can manifest as creativity, which in turn is associated with more sexual partners, thus propagating schizophrenia-related genes.
They also found that a tendency towards violence or recklessness was directly related to having more sexual partners. Meanwhile, introvertive anhedonia, which the authors said schizophrenia sufferers score highly on but artists and poets do not, was associated with fewer sexual partners. Indeed, Nettle and Clegg suggested that artists and poets “are differentiated from patients only by their low scores on introvertive anhedonia”.
Nettle, D. & Clegg, H. (2005). Schizotypy, creativity and mating success in humans. Proceedings of the Royal Society B. In Press, DOI: 10.1098/rspb.2005.3349.
Link to an artist's response in the Guardian
Vuorela first sat in on an internal strategic meeting held by sellers – four British men and a Finn – at a Finnish company that manufactures engines for use in power plants. She then sat in on a meeting that took place hours later between those sellers and a team of British and Irish buyers representing a British power company.
She found that joking was a sign of power, so that in the second meeting it was the chief buyer who initiated and ended most of the joking. His authority was betrayed by the fact everyone laughed at his jokes. “Based on my observations as a researcher…the quality of his quips did not deserve the level of laughter they received…the sellers seemed to be showing their respect for the head buyer in this way”, Vuorela said.
Humour was also used to express frustration. “It was a ‘safe’ way to express discontent because it permitted the speaker to express a problem while at the same time saving his face or that of the interlocutor because the joke was ‘off-the-record’ and not an official part of the negotiation”, Vuorela explained.
Experts tend to advise against using ethnic humour in business deals but Vuorela found that jokes about cultural differences were common. “Joking about your own national characteristics seems to be an acceptable way to produce ethnic humour”, she said.
Finally, there was evidence of unsuccessful humour – for example, although the sellers joked about their product in private, they did not respond to jokes about their product made by the buyers.
“Although consultative business communication guide books warn negotiators against using humour in multicultural negotiating, the data from this study indicate that disregarding humour in such business meetings would leave a negotiator on the ‘outside’ of the process”, Vuorela concluded.
Vuorela, T. (2005). Laughing matters: A case study of humour in multicultural business negotiations. Negotiation Journal, 21, 105-130.
Link to Laughlab
Daniel Cohen and colleagues at Harvard Medical School investigated these two aspects of learning by asking 50 participants to learn a sequence of key presses with their right hand. They then asked the participants to switch hands. For some of the participants the key sequence stayed the same but because they were now using a different hand, they obviously had to learn a different order of finger movements (see Figure). For the other participants, the sequence was mirror-reversed so they obviously had to learn a new sequence, but because they had switched hands, the order of the finger movements was actually the same (see Figure). This procedure allowed the researchers to disentangle the fact-based and movement-based aspects of learning a motor skill. Some of the participants completed this initial part of the experiment in the morning, others in the evening.
Twelve hours later the participants were tested again. Of those who had to learn new finger movements, only the participants originally tested in the morning showed evidence of improvement. By contrast, of those who had to learn a new key-press sequence, only the participants previously tested in the evening, who had therefore since slept, showed any sign of improvement.
The researchers said “We found that goal-based [i.e. fact-based] improvements developed exclusively overnight, whereas movement-based improvements developed exclusively over the day”.
“This deepens our understanding of [skill] consolidation by showing that off-line skill enhancement depends on multiple distinct processes that are preferentially engaged depending on when consolidation takes place”, they concluded.
Cohen, D.A., Pascual-Leone, A., Press, D.Z. & Robertson, E.M. (2005). Off-line learning of motor skill memory: A double dissociation of goal and movement. Proceedings of the National Academy of Sciences, USA. In Press, DOI: 10.1073/pnas.0506072102.
Link to related review article
Todd Shackelford and colleagues asked 461 men to complete questionnaires about their use of ‘mate retention behaviours’ (see weblink) and their use of violence against their partners. They also asked a separate sample of 560 women to rate their partner’s use of mate retention tactics and their use of violence. Finally, 107 married couples gave the same information concerning the husband’s behaviour.
Across all three samples, the researchers found certain male behaviours tended to be associated with the use of violence against women. Men who were violent toward their partners also tended to use emotional manipulation (e.g. threatening to hurt themselves if their partner left them), to monopolise their partner’s time (e.g. not letting her go out without them), and/or to punish their partner’s infidelity (e.g. by becoming angry when she flirted with anyone else).
Other mate retention behaviours showed the opposite pattern and tended to be associated with a lack of violence. These included telling their partner they love them and spending a lot of money on their partner.
The researchers said “At a practical level, results of these studies can potentially be used to inform women and men, friends and relatives, of danger signs – the specific acts and tactics of mate retention – that portend the possibility of future violence in relationships in order to prevent it before it has been enacted”. They also acknowledged that women are sometimes violent towards men.
Shackelford, T.K., Goetz, A.T., Buss, D.M., Euler, H.A. & Hoier, S. (2005). When we hurt the ones we love: Predicting violence against women from men’s mate retention. Personal Relationships, 12, 447-463.
Link to mate retention inventory
In a first study, Angela Duckworth and Martin Seligman (Positive Psychology Centre, University of Pennsylvania) recruited 140 school children (average age 13 years) at the start of the academic year. In the Autumn, the children, their parents and teachers, all completed questionnaires about the children’s self-discipline. The measures asked about the children’s ability to follow rules, to avoid acting impulsively, and to put off instant rewards for later gratification. Scores from the different measures were combined to create an overall indicator of self-discipline.
The researchers found self-discipline predicted all sorts of academic measures taken seven months later, including the children’s average grade for the academic year, their Spring exam result and their selection into High School.
A second study with 164 children (average age 13) followed a similar procedure but also involved the children taking an IQ test in the Autumn. Self-discipline again predicted later academic performance, as measured by their average grade for the year and their Spring exam result. Moreover, the researchers found that the children’s self-discipline scores accounted for twice as much of the variation in their later academic performance as their IQ did.
The researchers said “Underachievement among American youth is often blamed on inadequate teachers, boring textbooks, and large class sizes. We suggest another reason for students falling short of their intellectual potential: their failure to exercise self-discipline”.
Duckworth, A.L. & Seligman, M.E.P. (2005). Self-discipline outdoes IQ in predicting academic performance of adolescents. Psychological Science, 16, 939-944.
Now Peter Fischer and colleagues have revisited the phenomenon and come to the more heartening conclusion that people are likely to help if they perceive that someone is in serious danger.
Fischer’s team recruited 86 participants who were led to believe they were taking part in an experiment in which they had to observe the way men and women flirt with each other. The participants thought they were watching a live video feed from an adjacent room in which male and female strangers were meeting each other, but really they were watching pre-prepared video clips.
The first two clips each featured a different man and a woman meeting for the first time and passed uneventfully. However, during the third clip, which featured a third couple played by professional actors, the man grew increasingly aggressive towards the woman, until by the end of the clip he was being violent and abusive towards her.
Crucially, some participants watched a clip that featured a huge brute of a man (high danger condition), while other participants were shown a clip that featured a scrawny, skinny man (low danger condition). Also, half the participants were sat on their own, while the other half were accompanied by what they thought was another participant but was really an assistant to the researchers. When the man in the clip started getting aggressive, this other ‘participant’ just shrugged and said (s)he didn’t want to get involved.
When it was a little skinny man who started getting violent, the bystander effect seemed to occur: 50 per cent of participants who were sat alone went off to help the woman, compared with just 6 per cent when another ‘participant’ was sat with them. However, when the violent man was a large brute, the bystander effect virtually disappeared: 44 per cent went to help when they were on their own, compared with 40 per cent in the company of another ‘participant’.
Lead researcher Dr. Peter Fischer said “The good news is that when people are in real trouble, they have a good chance of receiving help, even when another bystander is present”.
Fischer, P., Greitemeyer, T., Pollozek, F. & Frey, D. (2005). Unresponsive bystander behaviour: Are bystanders more responsive in dangerous emergencies? European Journal of Social Psychology. In Press, DOI: 10.1002/ejsp.297.
Seth Pollak and colleagues compared levels of the neuropeptides oxytocin and vasopressin in 18 four-year-old orphans and 21 age-matched control children. The controls had been raised in a typical family environment in Wisconsin, whereas the orphans had been adopted by an American family after spending the first year and a half of their lives in a Russian or Romanian orphanage where they experienced little human contact.
The researchers measured the children’s hormones at baseline; when they played with their mother (tickling and patting each other according to instructions given by a computer game); and when they played with a stranger.
Compared with controls, the orphans had lower baseline levels of vasopressin, a hormone thought to be specifically involved in recognising familiar people. Another difference emerged when the children played with their mother – oxytocin levels rose in the control children but not in the orphans. Oxytocin receptors are found in the brain’s reward pathways and it’s thought the hormone plays a role in feelings of security and protection.
The researchers said this showed “a failure to receive species-typical care disrupts the normal development of the oxytocin and vasopressin systems in young children. Perturbations in this system may interfere with the calming and comforting effects that typically emerge between young children and familiar adults who provide care and attention”. They explained these observations were consistent with reports that children reared in institutionalised settings continue to demonstrate social problems even after settling into an adopted family environment. However, they also cautioned that the current results were group effects – not all the orphans showed the hormonal differences, and children with lowered hormonal levels can go on to develop normal relationships.
But couldn’t the group difference in rising oxytocin levels be explained by the fact the controls were playing with their biological mother while the orphans were playing with their adopted mother? Lead author Alison Wismer Fries told the Digest “…our findings suggest that the neglected children's new primary attachment figures were not serving to activate the [hormonal] system the same way that typically reared kids' mothers do. And this can then help us explain the increased risk for affiliative problems in the post-institutionalized sample”.
Wismer Fries, A.B., Ziegler, T.E., Kurian, J.R., Jacoris, S. & Pollak, S.D. (2005). Early experience in humans is associated with changes in neuropeptides critical for regulating social behaviour. Proceedings of the National Academy of Sciences, USA, 102, 17237-17240.
Stefan Willich and colleagues interviewed the patients about their experiences of noise at home and at work, and used a Berlin ‘noise map’ and workplace assessments to corroborate the participants’ statements.
In men, higher noise both at home and at work was associated with an increased risk of having a heart attack, whereas for women only noise at home was a factor, possibly because the female participants tended to spend more time at home. In women only, the risk of heart attack was also independently related to the annoyance caused by noise, rather than just noise levels per se. Other known risk factors, such as smoking and diabetes, were controlled for throughout these analyses.
The researchers said “Sound pressure levels and/or annoyance by noise may enhance psychological stress and anger and lead to impaired physiological factors such as increasing catecholamine levels associated with increased blood pressure and plasma lipids [known risk factors for heart attack]”.
The researchers also noted that the risk of heart attack did not rise proportionately with noise levels; rather, there seemed to be a cut-off, so that participants who experienced noise levels above 60 decibels (the level typical in a busy office) were exposed to an increased risk of heart attack relative to participants who only experienced noise below 60 decibels.
Current European Union regulations are that workplace noise should not be higher than 85 decibels. On this the researchers said “The results emphasize the need to reassess the importance, in general, and the adequate thresholds, in particular, of wearing ear protection at work places. The currently used threshold of 85 decibels may protect sufficiently from hearing damage but not from cardiovascular risk”.
Willich, S.N., Wegscheider, K., Stallmann, M. & Keil, T. (2005). Noise burden and the risk of myocardial infarction. European Heart Journal. In Press, DOI: 10.1093/eurheartj/ehi658.
Sara Lazar and colleagues at Massachusetts General Hospital scanned the brains of 20 people who meditated for an average of 40 minutes per day and 15 controls with no meditation experience. The meditating participants were practitioners of Buddhist Insight meditation, which involves concentrating on stimuli ‘in the moment’, in a non-judgmental way and without cognitive elaboration – a process known as ‘mindfulness’.
Controlling for age and education, the researchers found that specific areas of the cortex were thicker in the participants who meditated compared with controls. These areas included the right anterior insula, known to be involved in monitoring bodily functions, and parts of the prefrontal cortex involved in attention and sensory processing. The observed differences in the insula are consistent with the fact Insight meditation involves concentrating on bodily sensations, including breathing. Parts of the prefrontal cortex showed evidence of thinning in the older control participants but not in the older participants who meditated, thus suggesting meditation might offer protection from age-related neuronal loss.
Another observation was that the extent of cortical thickening correlated with meditation experience. Experienced practitioners show a noticeable difference in their respiration rate when they are meditating compared with being at rest. The researchers used the size of this difference as a measure of meditating experience, and found that in one specific region in the inferior occipital-temporal visual cortex, experience correlated with cortical thickness.
Jeremy Gray, co-author on the study, said “What is most fascinating to me is the suggestion that meditation practice can change anyone’s grey matter. The study participants were people with jobs and families. They just meditated an average of 40 minutes each day, you don’t have to be a monk”.
The researchers’ report on the work cautions that longitudinal research is needed to confirm that meditation causes the effects it has been associated with here.
Lazar, S. W., Kerr, C. E., Wasserman, R. H., Gray, J. R., Greve, D. N., Treadway, M. T., McGarvey, M., Quinn, B. T., Dusek, J. A., Benson, H., Rauch, S. L., Moore, C. I. & Fischl, B. (2005). Meditation experience is associated with increased cortical thickness. NeuroReport, 16, 1893-1897.
To find out what it means to be ‘mentally tough’ in the world of cricket, Stephen Bull, a consultant psychologist, and colleagues at the England and Wales Cricket Board interviewed 12 elite English players, several of whom had previously been ranked as a top-ten batsman or bowler in the world.
Four main themes emerged from interviews with the cricket stars – ‘environmental influences’, ‘tough character’, ‘tough attitudes’ and ‘tough thinking’. The authors said the relationship between these themes was key, with the framework best visualised as a pyramid with environmental influences at the base. These influences lead to a generally tough character, which in turn manifests as a set of tough attitudes. “Finally, and very specifically”, they explained, “…on top of these attitudes sits ‘tough thinking’, which represents the key psychological properties of a ‘mentally tough’ mind, oriented towards the competition demands of the moment”.
Key environmental factors were: parental influence, childhood background, exposure to foreign cricket and overcoming childhood setbacks. “I must have been 15 when I was going to be signed up as a leg spinner and then just lost it. That was mentally a very defining year for me. You’ve gone from being a hero in your school to being a bloke who’s lost it” one player explained.
Aspects of ‘tough character’ included being independent and having resilient confidence. ‘Tough attitudes’ included being willing to take risks, going the extra mile, believing in quality preparation and having a ‘never say die’ mindset. “You can throw whatever stones you want at me but I am not going off this course. It might take me 10 or 15 years but I will get there. I will play for England,” one player said, recalling his early career attitudes. ‘Tough thinking’ included good decision making at critical moments in a match, honest self-appraisal, and overcoming self-doubts.
The authors said that thanks to recommendations arising from the research, the role of sports psychology consultants is now developing within the ECB’s age-group squads.
Bull, S.J., Shambrook, C.J., James, W. & Brooks, J.E. (2005). Towards an understanding of mental toughness in elite English cricketers. Journal of Applied Sport Psychology, 17, 209-227.
Shamala Kumar and Carolyn Jagacinski gave 93 male and 42 female undergrads questionnaires that tested their experience of the imposter phenomenon (by probing their agreement with statements like “I can give the impression I am more competent than I really am”). Other questionnaires gauged their views on intelligence, their experience of test anxiety, and attitudes towards achievement.
Kumar and Jagacinski found the female students agreed with significantly more imposter-related statements than the male students, and that among the female students only, feelings of being an imposter tended to be associated with the belief that intelligence is a fixed attribute that cannot be developed over time.
Men who reported feelings of being an imposter approached tasks with the aim of avoiding negative comparison with their peers, agreeing with statements like “The reason I do my work is so others won't think I'm dumb”. Women who felt like imposters tended to seek favourable comparison with their peers, agreeing with statements like “I would feel successful at university if I did better than most of the other students”. Indeed, across both sexes, those students who had feelings of being an imposter tended to measure their own success against the achievements of others, rather than viewing task success as an end in itself, and so they tended to disagree with statements like “I do my work because I like to learn new things”.
Kumar, S. & Jagacinski, C.M. (2006). Imposters have goals too: The imposter phenomenon and its relationship to achievement goal theory. Personality and Individual Differences, 40, 147-157.
To test this idea further, Viren Swami and Martin Tovee asked 61 male undergraduates at a British university to rate the attractiveness of 50 differently-sized women as depicted in black and white photos. The women were either emaciated, underweight, normal, overweight or obese, according to their body mass index (weight relative to the square of height). They were dressed in identical grey leotards and their faces were obscured. The male participants were recruited as they were entering or exiting the university dining hall, and they rated how hungry or full they were on a 7-point scale.
The researchers found that the hungrier participants rated heavier women as more attractive than the full participants did. The hungrier men’s ratings were also less affected by the women’s shape, as measured by their waist-to-hip ratio.
“Temporary affective states can produce individual variation in mate preferences that mirrors patterns of cultural differences”, the researchers concluded. They also speculated about the generalisability of the findings. “If hungry men judge heavier women as more attractive than satiated men, might they also judge other heavy objects as generally more aesthetically pleasing?”, they asked.
Swami, V. & Tovee, M.J. (2005). Does hunger influence judgments of female physical attractiveness? British Journal of Psychology. In Press, DOI: 10.1348/000712605X80713.
One hundred and twenty participants were shown 15 pairs of female faces. For each pair they had to say which of the two faces they found more attractive, and on a fraction of trials they had to say why they’d made that choice, in which case the photo of the face they’d selected was slid across the table to them so they could look at it while they explained their choice. Crucially, on a minority of these trials, the researchers used sleight of hand to surreptitiously pass the participant the photo of the face they had just rejected, rather than the one they’d chosen.
Bizarrely, only about a quarter of these trick trials were noticed by participants, despite the fact the two faces in a pair often bore little resemblance to one another. Even stranger was the way the participants then went on to justify choosing the face on the card they were holding, even though it was actually the face they’d rejected. It’s not that participants weren’t paying attention to the face they’d been passed – the justifications they gave often related to features specific to this face, not the one they’d actually chosen. Independent raters who compared participants’ verbal explanations for choices they had made (non-trick trials), with their explanations for the choices they hadn’t made (trick trials), found no differences in amount of emotional engagement, degree of detail given, or confidence.
The researchers said “Participants failed to notice conspicuous mismatches between their intended choice and the outcome they were presented with, while nevertheless offering retrospectively derived reasons for why they chose the way they did. We call this effect choice blindness”.
Lead researcher Petter Johansson told The Digest that interviews with the participants afterwards confirmed the choice blindness effect was real rather than a consequence of participants being afraid to say something odd was going on at the time.
Johansson, P., Hall, L., Sikstrom, S. & Olsson, A. (2005). Failure to detect mismatches between intention and outcome in a simple decision task. Science, 310, 116-119.
The therapy, known as ‘behavioural therapy (BT) Steps’, involves clients using a manual to help them understand what triggers their compulsions, and how to gradually resist performing the rituals (e.g. hand washing) they normally carry out when exposed to those triggers. Clients use a touch-tone phone to report their progress and receive encouragement via recorded voice messages. Such systems allow people who don’t have face-to-face access to a clinician to engage in therapy. However, what’s not clear is how important it is for such clients to receive scheduled telephone calls from a clinician providing help and advice, as opposed to clients being given a free-phone number that they can choose to call whenever they need guidance.
To find out, Mark Kenwright and colleagues at the Institute of Psychiatry recruited 44 sufferers of OCD to take part in a 17-week BTSteps programme. Half the clients were given a free-phone number to contact lead researcher Mark Kenwright, a clinician, for guidance; the other half received nine scheduled calls from him over the 17-week period.
Clients who received the scheduled calls were less likely to drop out, were more likely to complete homework that involved resisting performing obsessive rituals, and showed greater improvement in their OCD symptoms. Overall, the clients given scheduled calls received an average of 76 minutes of clinician support over the phone, compared with the clients given a free-phone number who received an average of only 16 minutes clinician support (only 8 clients actually used the support number).
“To further ease OCD sufferers’ access to computer-aided self-help at home, the BTSteps system, including its user’s manual, will henceforth be made accessible on the internet under the name of OCFighter...” the researchers said.
Kenwright, M., Marks, I., Graham, C., Franses, A. & Mataix-Cols, D. (2005). Brief scheduled phone support from a clinician to enhance computer-aided self-help for obsessive-compulsive disorder: randomised controlled trial. Journal of Clinical Psychology, 61, 1499-1508.
They compared the creativity of 33 three-person groups across two tasks. After the first task, half the groups exchanged one of their team for a newcomer from another group, whereas the other half of the groups kept the same personnel throughout. The first task required the groups to think of as many ways as possible to categorise 12 vegetables into subgroups (e.g. can be eaten raw vs. cannot be eaten raw). The second task required the groups to think of as many uses as possible for a cardboard box.
There were no differences between the groups in the number of vegetable sub-categories they thought of. But those groups who swapped one of their members then went on to think of significantly more uses for a cardboard box, and significantly more different kinds of uses, than did the groups who’d kept the same personnel throughout. Analysis of the contributions made by each individual showed that a newcomer to a team increased the creativity of the two original team members.
The researchers cautioned that things are more complex in real-life scenarios. For example, recomposition of team membership could lead to interpersonal conflict and of course creativity isn’t always the aim of a group. However, they concluded that “For the most part, introducing new talent to the group can ensure that the group does not go stale”.
Choi, H-K. & Thompson, L. (2005). Old wine in a new bottle: Impact of membership change on group creativity. Organisational Behaviour and Human Decision Processes, 98, 121-132.
Robin Thompson and colleagues at the University of California asked 33 deaf signers to name 20 pictures of famous faces, and then to translate a list of fairly obscure English words, including names of cities and countries, into ASL.
In ASL, many famous names are ‘finger spelled’, with a different hand-shape spelling out each letter of the name. For the famous faces task, 21 of the participants experienced 55 instances of a ‘tip of the fingers’ state between them, in which they were sure they knew the name but couldn’t spell it out. As in spoken language, in which people can often only think of the first letter of a word, the participants here were often able to sign the first letter but no more.
Most other words and proper nouns in ASL are represented by a precise combination of hand-shape, location, orientation and movement. When it came to translating obscure English words, 13 of the participants reported 24 instances of ‘tip of the fingers’ experiences between them. This often manifested as an ability to recall one or more of the correct hand-shape, location, orientation or movement, but a failure to recall all these aspects, which is needed to communicate the word correctly. This mirrors the way speakers can often recall the first letter, the number of syllables, length, or certain sounds of a word they are struggling to think of.
With spoken language, ‘tip of the tongue’ syndrome is considered to show that word meaning and word form are represented separately in the mind. The authors said their study shows a similar situation pertains in sign language and helps dispel the common misconception that Sign is a form of complex pantomime in which intended meaning and Sign form are always related.
Thompson, R., Emmorey, K. & Gollan, T.H. (2005). “Tip of the fingers” experiences by deaf signers. Psychological Science, 16, 856-860.
In Singapore, children are separated into ability streams based on their performance in public examinations they take before starting secondary school at age 12. Liu Woon Chia and colleagues recruited 495 boys and girls who were beginning their first year at secondary school (284 were in higher-stream classes, 211 in lower stream). They asked them to complete questionnaires about their academic confidence (e.g. by rating their agreement with statements such as “I am good at most of my school subjects”) and their academic effort (e.g. “I study hard for my tests”) on four occasions over three years: at the beginning of their first term at secondary school – about two weeks after the streaming process – and then at the end of each academic year thereafter. Scores from the two test dimensions were collapsed to form an overall score of academic ‘self-concept’ or self-esteem.
Children allocated to lower ability classes started off with lower academic self-esteem than children allocated to higher ability classes, a difference that disappeared when only boys were considered. Girls appear to be more sensitive to the stigmatising effect, the researchers said. But after three years, although academic self-esteem had fallen across the sample overall (adolescence is a difficult time), it was now children in the lower ability classes who had the higher academic self-esteem.
The researchers suggested children in lower ability classes may benefit from a ‘big fish in a little pond’ effect relative to their peers in high ability classes who face more pressure and stiffer competition from their classmates. Also, in Singapore, lower-stream children are given a limited chance to jump streams, which could have a motivating effect.
Liu, W.C., Wang, C.K.J. & Parkins, E.J. (2005). A longitudinal study of students’ academic self-concept in a streamed setting: The Singapore context. British Journal of Educational Psychology. In Press, DOI: 10.1348/000709905X42239.
Kellett and Gross interviewed 54 male joyriders aged between 15 and 21 years, who were in custody in either Northern Ireland or the Midlands following convictions for car theft.
As with drug abuse, the main motivation for joyriding seemed to be mood-modification. “…you get a buzz out of drugs yeh, well it’s 10 times better than that…” said one participant.
There was evidence of ‘tolerance’ as participants described stealing ever faster cars and seeking out more chases with the police. When they couldn’t joyride, participants craved the thrill: “I don’t know, you just get like an aching in your mind…you just want to go out there and just drive about”. And at least one participant described taking more drugs when he was unable to joyride, indicative of ‘withdrawal’.
Some participants had tried to stop but couldn’t. “I don’t know what it is, I’ve tried to stop, I just can’t do it, I’ve tried and tried but I just can’t do it, I don’t know why”, one 16-year-old said. Indeed, participants continued joyriding even in the face of overwhelming negative consequences: “like I put one of my mates in a coma before”, said one.
The authors suggested borrowing rehabilitation strategies, such as harm reduction, from the field of addiction. “If joyriders are not considering changing their behaviour in the immediate future then it is sensible to consider ways in which their activities might be made safer”, they said. For example, the authors recommended providing “…meaningful education regarding the effects of taking drugs and alcohol whilst driving…the benefit of wearing seatbelts; judging distance and stopping times; the effects of adverse weather conditions; plus discussions of the times and places that increase the likelihood of encountering pedestrians”.
Kellett, S. & Gross, H. (2005). Addicted to joyriding? An exploration of young offenders’ accounts of their car crime. Psychology, Crime and Law, 12, 39-59.
First about pain experiences in experiments.
In reply to those (Karen, Freya, Richard, Dave S.) justly concerned with the use of pain in behavioural experiments, I would like to offer a few words of explanation. This explanation does not apply to many studies where excessive electric shocks are used, but does apply to a great many behavioural studies, such as this one, in which the animals are required to exhibit normal emotional states.
The expression 'tail-shock' sounds bad if one does not realize that five brief and mild pain incidents per day is the least of unpleasant experiences the rats may go through in normal life. Not only do they inflict more harm on each other in normal fighting, but the effects of bites or scratches could be much more painful, prolonged, dangerous, and even lethal. Animals trained with the use of pain, such as in this study, are spared long-lasting unpleasant experiences of hunger, thirst, low or high ambient temperature, anxiety, cutaneous itch or swellings from lice or other parasites - all of which are normal in "natural life conditions."
Both I and my co-workers regularly tried the electric shock on ourselves. It wasn’t pleasant but was certainly preferable to a rat’s bite. Our entire experiment had to be very tolerable for the rats, because they needed to learn when it was safe and when not. They wouldn’t have been able to learn to relax and feel relief if the training was more disturbing. The same intensity of shock was used on cats in previous studies and to our surprise and satisfaction, many of these cats purred and fell asleep between trials. Both our cats and rats, when handled before and after the daily session, were quiet and friendly.
If it is accepted that there is a need to study emotional states of anxiety, fear and relief, then the administration - carefully and as humanely as possible - of pain is inevitable. Pain is not an abnormal experience – some cultural attitudes notwithstanding – and a total lack of it (pain deprivation) may be deleterious to normal non-exaggerated responding to it and future coping with it.
As to the question of “scientific interest” and “benefit …. for humanity” (Karen, Louise) or replacing rats with humans (Dave Stevens – you probably think of paid volunteers, but I've even received suggestions of using inmates), consider this:
First, it is interesting per se, to find common psycho-physiologic grounds between our and other animals’ behaviour and “psyche.” Second, such commonality allows us to explore new ways (treatments, drugs) for dealing with human suffering (anxieties, depressions, phobias etc.). I do not know of any other procedure or behavioural test, or physiological index, that compares anxiety and relief, which would provide 20-fold (2000%) difference in objective measurements (it is usually measured in fractions, like 35% or so). Our rats sigh approx. 25 times/hour spontaneously, less than 10/h when anxious, but more than 180/h when relieved! So, consider, if you really have experimental animals’ wellbeing in mind and not just negative feeling about any animal experimentation, that far fewer animals will be required to test a new psychotropic drug or some other procedure, when measuring emotional states (using sighs instead of heart rate, blood pressure etc.) is so dramatically improved (as demonstrated by the current study). And third, why not humans? Indeed, why not? I am sure now that rats have pioneered this approach, studies of human sighing should be considered as one of many other possible steps. But, humans seem diverged from other mammals (rats?) in that they use sighs in many emotional contexts: We sigh with relief, but also for something, to somebody, with disappointment, in frustrations, when resenting, etc. etc. That complicates things, unfortunately. Our paradigm that elicits in the rat three emotions (anxiety, fear and relief) reliably within 15 seconds of each trial has undeniable simplicity. So please, try to accept the possibility that mild aversive experiences and clear-cut highly significant results will benefit both humanity (leading to the fast reliable testing of new drugs) and animals (because fewer will be needed to obtain results).
Over hundreds of trials, Soltysik and Jelen trained 16 rats to expect an electric shock to their tail after they heard an auditory tone, but not to expect a shock if a light came on after the tone. In this way, fear could be induced in the rats, followed by relief if they saw the light come on. All the while, the researchers monitored the rats’ breathing by recording their diaphragm muscles. Sighs are easily recognisable, the researchers explained, because they appear as a “deep ‘additional’ inhalation that starts at or around the peak of a normal respiratory cycle”.
Three hundred and eleven sighs were recorded across the course of the experiment, the vast majority of them during the ‘relief phase’ that followed a light coming on, indicating a shock would not occur. Averaged across all the rats, 7.4 times more sighs occurred during this ‘relief phase’ than during the equivalent period when the light didn’t come on – a fear phase – that occurred between the tone sounding and a shock being given.
The researchers said it’s possible “This respiratory act was recruited during evolution to signal reduced perception of danger, and/or to synchronise the emotional state of the group (collective sighs of relief?)…” They added that “the sigh could be a signal opposite to the alarm cry”.
To test this theory further they plan experiments to see if sighing is more prevalent in the company of other rats, and to test whether sighing is impaired in rats raised in social isolation.
Soltysik, S. & Jelen, P. (2005). In rats, sighs correlate with relief. Physiology & Behavior, 85, 598-602.
The therapists worked with four clients each before training, and six each after the training. Researchers listened to recordings of the therapists’ sessions and noted each time the therapists spoke and whether or not their utterances referred to the therapy or to their relationship with the clients (e.g. “I’ve noticed that you don’t look at me when we are discussing sensitive issues” or “What’s so important about whether I like you or not?”). Some therapists claim they already discuss the therapeutic process with clients, but here the researchers confirmed the therapists focused more on the therapeutic process after the training. For example, they looked at the proportion of sessions in which more than one in five therapist utterances were focused on the therapeutic process: this was 3.7 per cent of sessions before training, compared with 21 per cent afterwards.
Crucially, sessions with more therapy-focused utterances were associated with better reports of perceived progress in weekly feedback from clients, and with a tendency for clients to report more improvement in their relationships. However, more focus on the therapeutic process was not related to actual symptomatic improvement, as measured by clients’ weekly completion of the Beck Depression Inventory.
Whereas this study grouped together all therapy-focused utterances, the researchers said future work should probe deeper, to identify what kinds of therapeutic focus are beneficial.
Kanter, J.W., Schildcrout, J.S. & Kohlenberg, R.J. (2005). In vivo processes in cognitive therapy for depression: Frequency and benefits. Psychotherapy Research, 15, 366-373.
In an effort to rectify this bias, Lai’s team recruited 80 healthy adults, took saliva samples from them six times a day for two days (for measuring cortisol levels), and asked them to complete questionnaires about their optimism/pessimism and their mood over the last month and the last day.
They found that in men only, optimism was associated with lower cortisol levels after waking up, when levels of the hormone tend to peak as part of a daily cycle. The researchers said more research was needed to explain this gender difference. In men and women, they found that a generally positive mood during the last month was associated with lower cortisol levels over the whole day, even after controlling for good or bad mood on the day of testing.
The researchers said their findings “may draw increased attention to the potential impact of positive psychological dispositions or conditions on cortisol secretion and thus initiate a shift of research focus to the physiological substrates of positive states of minds…”. Future work should investigate whether the effects of positive psychological states on cortisol levels, as reported here, have actual health benefits, the researchers said.
Lai, J.C.L., Evans, P.D., Ng, S.H., Chong, A.M.L., Siu, O.T., Chan, C.L.W., Ho, S.M.Y., Ho, R.T.H., Chan, P. & Chan, C.C. (2005). Optimism, positive affectivity, and salivary cortisol. British Journal of Health Psychology, 10, 467-484.
Link to the Positive Psychology Centre
Link to special issue of The Psychologist on Positive Psychology (free access)
One suggestion is that some people would not have developed schizophrenia if they hadn’t had an earlier traumatic experience. According to this argument, psychotic experiences (for example, hearing voices; having paranoid thoughts) are not necessarily pathological; rather, they only become problematic if a person finds them distressing. Earlier trauma plays a causal role in schizophrenia, it’s argued, because it can leave people prone to finding these psychotic symptoms distressing.
To test this idea, Maarten Bak (Maastricht University) and colleagues interviewed thousands of people from the general population who had never had a psychotic experience, to find out if they had been traumatised in any way as a child. Three years later, researchers interviewed the same people again to find out whether or not they had had a psychotic experience since the first interview, and secondly, to find out if they found their psychotic experience(s) distressing or just unusual.
Among the 16 people who reported having had one or more non-distressing psychotic experiences since the first interview, just one (six per cent) had been traumatised as a child (according to their statements in the first interview). In contrast, among the 21 people who reported having had one or more distressing psychotic experiences, nine (43 per cent) had been traumatised as a child. The people who said they’d been traumatised also tended to report having less control over their psychotic experience(s).
The authors said their findings suggest “exposure to early trauma, defined here as self-reported traumatic experiences in childhood, predisposes persons to suffer from more emotional distress associated with psychotic experiences and less perceived control over these experiences, compared with those without a traumatic history”.
Bak, M., Krabbendam, L., Janssen, I., de Graaf, R., Vollebergh, W. & van Os, J. (2005). Early trauma may increase the risk for psychotic experiences by impacting on emotional response and perception of control. Acta Psychiatrica Scandinavica, 112, 360-366.
Link to related essay in the Guardian
Simone Bosbach and colleagues tested two patients, IW and GL, who, because of sensory neuropathy, have the extremely rare condition of lacking any peripheral sensation or proprioception. The patients watched a video of a man lifting different boxes. When the man was always given correct information about the weight of the boxes, the patients, like controls, were able to correctly judge whether he’d lifted a heavy or light box. However, in a second experiment, when the man was occasionally given incorrect information about the boxes’ weight, the patients, unlike controls, were unable to judge from his movements whether or not a box weighed what he had expected (except when the task was made easier using larger boxes). Patient IW also couldn’t make this judgment correctly when he viewed videos of himself performing the same lifting task.
It’s not that the patients were incapable of forming a motor representation of the box lifting movements per se, otherwise IW wouldn’t have been able to lift the boxes. Rather, the authors believe the patients’ lack of peripheral sensation and proprioception affected their ability to activate or sustain a mental simulation of the lifting movements when watching them performed. The researchers said “[The patients’] reduced ability in the present task suggests that to judge mismatches between action preparation and performance in others, one has to access subconscious sensorimotor programmes, which IW and GL may lack”.
Bosbach, S., Cole, J., Prinz, W. & Knoblich, G. (2005). Inferring another’s expectation from action: the role of peripheral sensation. Nature Neuroscience, 8, 1295-1297.
The findings come from a qualitative study by Rob Ranzijn and colleagues at the University of South Australia. They conducted group interviews with 27 participants aged between 45 and 71, all of whom were seeking work, or wished to change to more satisfactory work. Participants were invited to discuss their situation with the group.
The interviews were taped and transcribed and emerging themes were identified. Psychological themes included loss of self-worth, reduced quality of life, narrowed horizons (i.e. previous retirement plans were having to be reconsidered), inability to use talents and to contribute, effects on family relationships and concerns about the future. One participant said “Yes, it would be very frightening [not to get another job], it would be very, very frightening and I think that is something that you cannot afford at this time in our lives to be complacent about”.
Another issue that emerged was ‘skill atrophy’ – “the processes whereby continuing unemployment can lead to a progressive decay of skills and a perceived (by potential employers) decline in competencies”. This in turn led to what the researchers called the ‘peg-down phenomenon’ – “the older job-seekers’ reduced expectations of both the level of jobs attainable and the likelihood of attaining any such jobs”. In light of this, the authors recommended that “policies aiming to help mature-aged unemployed people re-enter the workforce must include focused training in the skills required in the current job market”.
The researchers concluded that this “’lost generation’ of mature-aged unemployed people needs particular help; otherwise they may live for another 30 or more years without ever again finding satisfactory employment”.
Ranzijn, R., Carson, E., Winefield, A.H. & Price, D. (2005). On the scrap-heap at 45: The human impact of mature-aged unemployment. Journal of Occupational and Organisational Psychology. In press. DOI: 10.1348/096317905X66828.
“...Nicer weather was associated with better mood, memory and a broader mindset, but only among participants who’d spent more than 30 minutes outside.”
In a second experiment conducted on various days in Spring and early Summer, 121 participants completed tests of their mood and short-term memory before and after relaxing for 30 minutes. Half the participants were asked to relax outdoors, the others indoors. Consistent with the first experiment, the researchers found that when the weather was good, participants’ mood and memory tended to have improved after they’d relaxed outdoors, but not if they’d relaxed indoors.
A final study was conducted to take into account other geographical locations and times of year. As before, more pleasant weather was found to enhance the mood of people who’d been outside for long enough, but it only had this benefit in Spring – probably because in Summer it can often get too hot for comfort, the authors said.
Keller, M.C., Fredrickson, B.L., Ybarra, O., Cote, S., Johnson, K., Mikels, J., Conway, A. & Wager, T. (2005). A warm heart and a clear head. The contingent effects of weather on mood and cognition. Psychological Science, 16, 724-731.
Plomin’s team found that two thirds of the genetic influence on maths ability also explained variation in reading and general intelligence, suggesting that most of the genes that affect maths also affect reading and general intelligence. Whereas genes tended to explain the similarity of a child’s performance across these domains, environmental factors tended to explain the differences. “One direction for future research is to identify the non-shared environmental factors that are experienced differently by twins, even identical twins, even in the same classroom and that contribute to differences in children’s relative performances in mathematics and reading”, the authors said.
Non-shared environmental factors are experiences that have uniquely affected one twin but not the other, even though they have been raised and taught together. Lead researcher Yulia Kovas told The Digest such factors could include “…pre-, peri- and post-natal influences, including childhood illnesses, differential parental influence, or differential effects of curricula on children”. Kovas added that “If these non-shared environmental factors can be identified, they could lead to more individualized curricula, although much more research is necessary to clarify whether such a move towards individualization in education is necessary or practically possible”.
Kovas, Y., Harlaar, N., Petrill, S.A. & Plomin, R. (2005). ‘Generalist genes’ and mathematics in 7-year-old twins. Intelligence, 33, 474-489.
They tested hundreds of students on various tasks, including a word game (Boggle), spotting an artist’s name embedded in his paintings (visual search), and spotting grammatical errors in a passage of text. After completing a task, participants were asked to evaluate their performance twice – first based on the solutions/mistakes they knew they had identified, and then again after they’d been told about all the solutions/mistakes they’d missed. As the researchers predicted, the participants’ self-evaluations were far more accurate after they were given the extra information about the answers they had missed.
This phenomenon will be even more relevant when it comes to the less defined problems of real life, the researchers argued. “For example, it is impossible to catalogue all the solutions to such ill-defined tasks as designing an architecturally significant building, composing a classic country and western song, or writing the poem that resurrects the sonnet…”, the researchers said, “…as a consequence people are never really in a position to know just how well they have done because it is difficult to know all the alternative solutions they could have arrived at”.
Modestly, the researchers concluded by acknowledging that their investigation into this effect might well have been conducted better: “…there is [sic] bound to be some efforts we could have made but could not identify. We concede at this point that we do not know of them”.
Caputo, D. & Dunning, D. (2005). What you don’t know: The role played by errors of omission in imperfect self-assessments. Journal of Experimental Social Psychology, 41, 488-505.
Dale Hesdorffer (Columbia University) and colleagues recruited 324 Icelandic adult and child participants (median age 34 years) who had recently suffered two or more seizures that weren’t caused by fever, head trauma or central nervous system infection, thus pointing to a diagnosis of epilepsy. Six hundred and forty-seven age-matched control participants who lacked a history of epilepsy were selected using the Icelandic population registry. All the adult participants were interviewed over the phone to determine whether they had ever suffered from major depression or had attempted suicide. For both groups, only episodes of depression or suicide attempts that occurred before the seizure patients had suffered their seizures were counted. The same information was obtained for the child participants by interviewing their parents.
Child and adult participants who had recently experienced unexplained seizures were 1.7 times more likely to have previously suffered from major depression than the control participants (i.e. 12 per cent of the seizure group vs. 7.4 per cent of the control group). And they were 5.1 times more likely to have attempted suicide in the past (i.e. 6.5 per cent of the seizure group vs. 1.4 per cent of the control group). Previous attempted suicide remained significantly more prevalent among the participants who’d recently suffered seizures, even after taking into account rates of prior depression and alcohol consumption.
“Clearly, patients presenting with a new onset unprovoked seizure should be evaluated for a history of suicide attempt and major depression”, the researchers advised.
Hesdorffer, D.C., Hauser, W.A., Olafsson, E., Ludvigsson, P. & Kjartansson, O. (2005). Depression and suicide attempt as risk factors for incident unprovoked seizures. Annals of Neurology. In press. DOI: 10.1002/ana.20685.