Five minutes with the discoverer of the Scientific Impotence Excuse, Geoffrey Munro

When attempting to change people’s behaviour – for example, encouraging them to eat more healthily or recycle more – a common tactic is to present scientific findings that justify the behaviour change. A problem with this approach, according to recent research by Geoffrey Munro at Towson University in America, is that when people are faced with scientific research that clashes with their personal view, they invoke a range of strategies to discount the findings.

Perhaps the most common of these is to challenge the methodological soundness of the research. However, with newspaper reports and other brief summaries of science findings, that’s often not possible because of a lack of detail. In this case, Munro's research suggests that people will often judge that the topic at hand is not amenable to scientific enquiry. What’s more, he’s found that, having come to this conclusion about the specific topic, sceptics will then generalise their belief about scientific impotence to other topics as well. Munro says that by embracing the general idea that some topics are beyond the reach of science, such people are able to maintain belief in their own intellectual credibility, rather than feeling that they’ve selectively dismissed unpalatable findings.

The Digest caught up with Professor Munro to ask him, first of all, whether he thinks there are any ways to combat the scientific impotence excuse or reduce the likelihood of it being deployed.
"One of the most difficult things to do is to admit that you are wrong. In cases where a person is exposed to scientific conclusions that contradict her or his existing beliefs, one option would be to accept the scientific conclusions and change one’s beliefs. It sounds simple enough, and, for many topics, it is that simple. However, some of our beliefs are much more resistant to change. These are the ones that are important to us. They may be linked to other important aspects of our identity or self-concept (e.g., “I’m an environmentalist ”) or relevant to values that are central to who we are (e.g., “I believe in the sanctity of human life”) or meaningful to the social groups to which we align ourselves (e.g., “I’m a union man like my father and grandfather before him”) or associated with deeply-held emotions (e.g., “Homosexuality disgusts me”). When scientific conclusions challenge these kinds of beliefs, it’s much harder to admit that we were wrong because it might require a rethinking of our sense of who we are, what values are important to us, who we align ourselves with, and what our gut feelings tell us. Thus, a cognitively easier solution might be to not admit our beliefs have been defeated but to question the validity of the scientific conclusions. We might question the methodological quality of the scientific evidence, the researcher’s impartiality, or even the ability of scientific methods to provide us with useful information about this topic (and other topics as well). This final resistance technique is what I called “scientific impotence”.

"So, how can strongly-held beliefs be changed? How can scientific evidence break through the defensive tenacity of these beliefs? Well, I hope the paragraph above illustrates how scientific evidence can be threatening when it challenges an important belief. It makes you feel anxious, upset, and/or embarrassed. It makes you question your own intelligence, moral standing, and group alliances. Therefore, the most effective way to break down resistance to belief-challenging scientific conclusions is to present such conclusions in non-threatening ways. For example, Cohen and his colleagues have shown that affirming a person’s values prior to presenting belief-challenging scientific conclusions breaks down the usual resistance. In other words, the science is not so threatening when you’ve had a chance to bolster your value system. Relatedly, framing scientific conclusions in a way that is consistent with the values of the audience is more effective than challenging those values. Research from my own laboratory shows that reducing the negative emotional reactions people feel in response to belief-challenging scientific evidence can make people more accepting of the evidence. We achieved this by giving participants another source (something other than the scientific conclusions they read) to which they could attribute their negative emotional reactions. While this might be difficult to implement outside of the laboratory, we believe that other factors can affect the degree to which negative emotional reactions occur. For example, a source who speaks with humility is less upsetting than a sarcastic and arrogant pundit. Similarly, the use of discovery-type scientific words and phrases (e.g., “we learned that…” or “the studies revealed that…”) might be less emotionally provocative than debate-type scientific words and phrases (e.g., “we argue that…” or “we disagree with so-and-so and contend that…”). In fact, anything that draws the ingroup-outgroup line in the sand is likely to lead to defensive resistance if it appears that the science or its source is the outgroup. So, avoiding culture war symbols is crucial. Finally, as a college professor, I believe that frequent exposure to critical thinking skills, practice with critical thinking situations, and quality feedback about critical thinking helps people understand how their own biases can affect their analysis of information, resulting in open-minded thinkers who are skeptical yet not defensive."
Next, the Digest asked Prof Munro whether he thinks psychology findings are particularly prone to provoking scientific discounting cognitions, and, if so, whether we as a discipline should make an extra effort to combat this.
"Yes, I believe psychological research (and probably social science research in general) is prone to provoke scientific discounting. The term “soft science” illustrates how social sciences are perceived differently than the “hard sciences”. There are a number of reasons why this might be true. First, much psychological research is conducted without the use of technologically-sophisticated laboratories containing the fancy equipment that comes to many people’s minds when the word science is used. In other words, psychological research doesn’t always resemble the science prototype. Supporting this position, psychological research that is conducted in high-tech labs (e.g., neuroscience imaging studies) is, in my opinion, perceived with less skepticism by the general public. Second, psychological research often investigates topics about which people already have subjective opinions or, at least, can easily call to mind experiences from their own lives that serve as a comparison to the research conclusions. In other words, people often believe that they already have knowledge and expertise about human thought and behavior. When their opinions run counter to psychological research conclusions, then scientific discounting is likely. For example, there is a common belief that cathartic behaviors (e.g., punching a punching bag) can reduce the frustrations that sometimes lead to aggression. Psychological research, however, has contradicted the catharsis hypothesis, yet the belief remains entrenched, possibly because it has such a strong intuitive appeal. In contrast, people will quickly reveal their lack of expertise on topics in physics or chemistry and have a harder time calling to mind examples from their own lives. Third, there is likely some belief that people’s thoughts and behaviors are less predictable, more mysterious, and affected by more variables than are inanimate objects like chemical molecules, planets in motion, or even the functioning of some parts of the human body (e.g., the kidneys). Furthermore, psychological conclusions are based on probability (e.g., the presence of a particular variable makes a behavior more likely to happen), and probability introduces the kind of ambiguity that makes the conclusions easy to discount. Fourth, some psychological research is perceived to be derived from and possibly biased by a sociopolitical ideology. That is, there is the belief that some psychologists conduct their research with the goal of providing support for some political viewpoint. This is somewhat less common among the “hard sciences” although the controversy over climate change and the researchers who investigate it suggest that if the topic is one that elicits the ingroup-outgroup nature of the cultural divide, then the “hard sciences” are also not immune to the problem of scientific discounting.

"I think that the discipline of psychology has already made vast improvements in managing its public impression and is probably held in higher esteem than it was 50 or even 20 years ago. However, continued vigilance is essential against those (both within and outside of the discipline) who contribute to the perception of psychology as something less than a science. The field of psychology has much to offer – it can generate important knowledge that can inform public policy and improve people’s health and happiness, but it cannot do so if its scientific conclusions fall on deaf ears."
_________________________________

Munro, G. (2010). The Scientific Impotence Excuse: Discounting Belief-Threatening Scientific Abstracts. Journal of Applied Social Psychology, 40(3), 579-600. DOI: 10.1111/j.1559-1816.2010.00588.x