The Gendered Brain and the Seven Deadly Sins of Psychological Science

Blog | March 11th, 2019

A new book has recently hit the market that, from what I have read thus far, is mandatory reading for most scientists in the field. The book is Gina Rippon's The Gendered Brain: The New Neuroscience That Shatters the Myth of the Female Brain, and its key premise is that men's and women's brains are simply not that different.

While the key issue discussed is certainly important, I believe the book highlights far larger issues for the discipline. These issues undermine the scientific pursuit, and as such, I label them the seven deadly sins of modern science.

1. Sound bites and popular press

Early in the book, a study is described that purported to establish differences between the brains of men and women. Sidestepping the details of the study, what is interesting is how its findings were picked up by the media and, without what appears to be much resistance from the authors, gained national prominence. And this is a problem.

Scientists, like many other professionals, are susceptible to the lure of fame. Even though we may know that our effect sizes are small, that our samples may be unrepresentative, and that we never prove anything but simply find supporting evidence, we want our research to make an impact. Making an impact with your peers is often difficult, as they are aware of the problems with any piece of research. It is much easier to win over an unsuspecting public by allowing the media to distort studies and work simply off sound bites.

The popular press wants research to be easily digestible, and to do this, anything that adds complication to the narrative is simply left out. Sound bites and popular press are not, however, science.

2. Politically correct science

While I could argue the merits of anyone choosing to look at gender differences in cognitive functioning via the size of the brain, what I will always defend is people’s right to conduct the research. We don’t destroy bad science and bad ideas by not letting the ideas surface. Rather, by letting the research out we can then critique the science through better science.

The best medicine for bad ideas is better ideas.

We are currently experiencing a revolution in the social sciences, with a backlash against postmodern thinking and against the repression of ideas that some people find challenging. This battle for the freedom of science cannot be won soon enough.

We live in a far more tolerant and accepting world than that of our forebears. However, the desire for a more equal and freer world for all is not won through pseudo-science. A classic example of this is the implicit bias test, which is supposedly used to assess someone's underlying bias towards certain categories such as racial groups and genders. The issue I have is not so much with the test as with its defenders who, while noting that the psychometric properties of the test are sub-par and its logic questionable, may wish to sidestep these issues for the sake of the cause (cf. https://journals.sagepub.com/doi/full/10.1177/0963721418797309).

The logic of such researchers is tantamount to saying, “Yes, I know that it lacks the rigour we normally expect, but I don’t care, as all I care for is such-and-such ideology, and science can take a back seat.” Well, science can’t take a back seat, and the rules of the game are that scientific claims and political ideas are different. If people want political ideas to be scientific, then those ideas will need to be debated in terms of scientific rigour.

For those with an interest in the topic of implicit bias training, this makes for interesting reading: https://www.equalityhumanrights.com/sites/default/files/research-report-113-unconcious-bais-training-an-assessment-of-the-evidence-for-effectiveness-pdf.pdf

3. Alternative causes and the failure to discuss theory and mechanisms

One of the key claims made in the book is that if we live in a gendered world, we will get gendered brains. The rationale for this claim is the plasticity of the brain, which is constantly adapting to its surroundings. In her review of the field, Rippon argues that this fact is simply not acknowledged to the extent that it should be in many of the studies reviewed.

What Rippon highlights is the failure to look at alternative causes that are far more parsimonious. The failure to build robust models is endemic in the social sciences, and in short, we need better exemplars of theory building.

Just to show it is not all doom and gloom, I provide recent evidence of good theory building (cf. https://onlinelibrary.wiley.com/doi/full/10.1002/per.2115) by way of example.

4. The ineptitude of the psychological associations

As a psychologist, I’m governed by psychological societies. I remain a member of these societies because I believe in my profession and because I believe the societies have the potential to work for the greater good of the field. However, more and more I see these organisations fail: afraid of being unpopular, they either endorse unscientific practices or fail to speak out when they should.

A recent article in The Psychologist captures the failure of psychological societies to show backbone. Among other areas of discussion, the author covers problems with the stances taken on everything from video gaming through to the use of CBT. A further recent example of societies failing to get involved in a timely fashion is the potentially questionable research practices of Hans Eysenck, one of the UK’s most famed psychologists (https://journals.sagepub.com/doi/full/10.1177/1359105318820931; https://journals.sagepub.com/doi/full/10.1177/1359105318822045). The BPS has been notably absent in commenting on findings that are, in science-speak, off the charts.

5. The use of marginally significant results

In the field of psychology, it is hard to get statistically and practically significant results. People vary enormously, and our effect sizes are generally much smaller than those found in the physical sciences. Nevertheless, many researchers tend to circumvent the standards even further by discussing what are termed ‘marginally significant’ results (results that aren’t quite significant, but that are talked about anyway as if they were). While the trend does not seem to be worsening in the field (https://journals.sagepub.com/doi/full/10.1177/0956797619830326), this is most certainly something to look out for when listening to questionable findings.
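To make the ‘marginal significance’ stretch concrete, here is a minimal sketch. The z-statistics and the 0.10 “stretch” threshold are my own illustrative assumptions, not values from any study discussed here; the point is simply how the same p-value reads under a strict 0.05 rule versus a looser labelling.

```python
from statistics import NormalDist

def two_sided_p_from_z(z: float) -> float:
    """Two-sided p-value for a z-statistic (normal approximation)."""
    return 2 * (1 - NormalDist().cdf(abs(z)))

def label(p: float, alpha: float = 0.05, stretch: float = 0.10) -> str:
    """Label a p-value: a strict reader stops at alpha; a 'marginal'
    reader quietly extends the boundary out to the stretch threshold."""
    if p < alpha:
        return "significant"
    if p < stretch:
        return "marginally significant"  # the questionable stretch
    return "not significant"

# Three hypothetical test statistics.
for z in (2.2, 1.8, 1.0):
    p = two_sided_p_from_z(z)
    print(f"z = {z:.1f}, p = {p:.3f}: {label(p)}")
```

Note that nothing magical happens at p = 0.05; the problem is that ‘marginally significant’ lets authors report a failed test as if it had passed.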

6. The problem of exaggeration

Many of the issues discussed thus far can be described as problems of exaggeration. Our discipline has long suffered from exaggeration, in part because we want our findings to be more generalisable than they often are. The world of human beings is complex: https://thepsychologist.bps.org.uk/volume-31/october-2018/does-psychology-face-exaggeration-crisis

7. A failure to look at behaviour

Psychology moves in waves. After the psychoanalytic movement, with such prominent figures as Sigmund Freud, Carl Jung and Karen Horney, we had the counter-movement of behaviourism. For behaviourists like Skinner and Watson, it was not what was in your head that mattered but how you acted. While the behaviourist movement naturally gave rise to other schools of thought, such as the humanist and cognitive movements, the problem is that somewhere along the way behaviour got discarded rather than integrated into the way we do psychological science. There is a range of problems with relying on self-report questionnaires, and these problems arise in much psychological research (by way of one example, in smartphone research: https://psyarxiv.com/6fjr7/).

Now, these problems are not confined to self-report tools but extend to the likes of neurological science: https://psycnet.apa.org/record/2018-09962-001. To overcome them, we need to make sure that we don’t rest with data from a single source but rather corroborate our findings through triangulation, ensuring that the claims made are supported by behavioural outcomes.
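By way of illustration, a triangulation check of this kind can be as simple as correlating a self-report measure with a logged behavioural one. The sketch below uses entirely made-up numbers (hypothetical self-reported versus device-logged smartphone hours) purely to show the idea:

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical data: self-reported daily smartphone use (hours)
# vs. hours logged by the device for the same six participants.
self_report = [2.0, 5.5, 3.0, 6.0, 1.5, 4.0]
logged      = [1.2, 3.8, 3.5, 4.1, 2.0, 2.9]

r = pearson_r(self_report, logged)
print(f"self-report vs. logged behaviour: r = {r:.2f}")
```

If self-report and the behavioural record diverge badly, it is the claim built on the self-report alone that should give way.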

The seven deadly sins are far from a comprehensive list. There are many other issues I could have included, such as the pressure to publish or perish (which favours quantity over quality), research that is funded to achieve a certain outcome, and the creation of terms with poor construct validity. However, these seven sins are certainly near the top of my list, and all need to be addressed if the discipline is to continue to flourish.

References

Baumert, A., Schmitt, M., Perugini, M., Johnson, W., Blum, G., Borkenau, P., Costantini, G., Denissen, J. J. A., Fleeson, W., Grafton, B., … Read, S. J., Roberts, B., Robinson, M. D., Wood, D., & Wrzus, C. (2017). Integrating personality structure, personality process, and personality development. European Journal of Personality, 31, 503-528.

Atewologun, D., Cornish, T., & Tresh, F. (2018, March). Unconscious bias training: An assessment of the evidence for effectiveness. Retrieved from https://www.equalityhumanrights.com/sites/default/files/research-report-113-unconcious-bais-training-an-assessment-of-the-evidence-for-effectiveness-pdf.pdf

Eliot, L. (2019). Neurosexism: the myth that men and women have different brains. Nature, 566(7745), 453.

Ellis, D. A., Davidson, B. I., Shaw, H., & Geyer, K. (2018, November 21). Do smartphone usage scales predict behaviour? https://doi.org/10.31234/osf.io/6fjr7

Ferguson, C. (2019, March). Embrace the unknown. The Psychologist, 32. Retrieved from https://thepsychologist.bps.org.uk/volume-32/march-2019/embrace-unknown

Haier, R. J., Jung, R. E., Yeo, R. A., Head, K., & Alkire, M. T. (2005). The neuroanatomy of general intelligence: sex matters. NeuroImage, 25(1), 320-327.

Hughes, B. (2018). Does psychology face an exaggeration crisis? The Psychologist, 31. Retrieved from https://thepsychologist.bps.org.uk/volume-31/october-2018/does-psychology-face-exaggeration-crisis

Jost, J. T. (2018). The IAT is dead, long live the IAT: Context-sensitive measures of implicit attitudes are indispensable to social and political psychology. Current Directions in Psychological Science. https://doi.org/10.1177/0963721418797309

Marks, D. F. (2019). The Hans Eysenck affair: Time to correct the scientific record. Journal of Health Psychology. https://doi.org/10.1177/1359105318820931

Nowack, K., & Radecki, D. (2018). Introduction to the special issue: Neuro-myth conceptions in consulting psychology—between a rock and a hard place. Consulting Psychology Journal: Practice and Research, 70(1), 1-10. http://dx.doi.org/10.1037/cpb0000108

Olsson-Collentine, A., van Assen, M. A. L. M., & Hartgerink, C. H. J. (2019). The prevalence of marginally significant results in psychology over time. Psychological Science. https://doi.org/10.1177/0956797619830326

Pelosi, A. J. (2019). Personality and fatal diseases: Revisiting a scientific scandal. Journal of Health Psychology. https://doi.org/10.1177/1359105318822045

About the Author:

Paul Englert
Dr. Paul Englert is a co-founder of OPRA and Managing Director of OPRA in Asia Pacific. Since 1997, Paul’s professional career has had a single focus: improving the efficiency and effectiveness of organisations through the appropriate application of Industrial/Organisational (I/O) Psychology.

2 Comments

  1. Matt O'Sullivan, March 15, 2019 at 5:24 pm

    Hi Paul
    Hope you are well. Thanks for keeping us honest and reminding us to check the science of what gets reported. I found the links useful/educational as well.

    • Paul Englert, March 17, 2019 at 12:42 pm

      Hi Matt

      Nice to hear from you. I’m doing well. I hope life on that side of the world is treating you well.

      Glad that you found the blog useful. Look out for more in the coming months.

      Paul
