The UK Science Media Centre (SMC) offered journalists (and any citizen scientists too busy or too lacking in confidence to make their own critical appraisals) a briefing on cognitive behavior therapy (CBT).
I offer a quick critical appraisal of the “three top experts” advertised as giving an honest appraisal.
I conclude that what would be offered is highly biased. Any organization making claims that such an evaluation is trustworthy is itself untrustworthy. Please read on and decide if you agree with me.
Meta-lesson: When you encounter a source that screams to you that it is trustworthy, maybe you should be particularly skeptical and simply move on to alternative sources.
September 2, 2016
CBT – does it really work?
Journalists came along to the Science Media Centre to discuss issues such as:
- What exactly is CBT and how does it work? Is it just talking?
- Where do we have robust evidence of CBT working really well?
- What about issues like placebos and double-blind RCTs – does it matter if we don’t have the same level of rigour as for drugs?
- Are there concerns about CBT being overused or misused? Can it cause harm? How do we ensure practitioners have sufficient training in its use?
- How cost-effective is CBT? Are we using it enough?
- Can we see a future where psychological therapies replace pharmacological ones for many mental illnesses?
The SMC invited in three top experts to give an honest appraisal of the evidence, to discuss how CBT fits into the wider picture and to explain the pros and cons of this therapy.
Prof. Rona Moss-Morris, Professor of Psychology as Applied to Medicine, King’s College London’s Institute of Psychiatry, Psychology & Neuroscience
Prof. Michael Sharpe, Professor of Psychological Medicine Research, University of Oxford
Prof. Dame Til Wykes, Professor of Clinical Psychology and Rehabilitation, King’s College London’s Institute of Psychiatry, Psychology & Neuroscience
Prof. Til Wykes
I have learned to be skeptical of UK expertise offered with Dame or Sir in the title. So, having been declared a vexatious American, I’ll simply call her Professor Wykes. But given she is deemed a Dame, I will do her the honor of discussing her first.
Professor Wykes has written an Evidence-based Mental Health commentary that repeatedly screams “Take me seriously.”
Wykes T. Cognitive-behaviour therapy and schizophrenia. Evidence Based Mental Health. 2014 Aug 1;17(3):67-8.
She attests to the quality of the evidence that she considered:
So is CBTp effective?
All the meta-analyses described here were carried out rigorously. They allow us to draw a picture of a therapy that developed from its roots in medication-resistant positive symptoms to the wealth of new targets and then to the more radical—the prevention of psychosis. Only one recent meta-analysis does not show beneficial effects after accounting for methodology. This odd one out is odd for very good reasons and provides us with a backdrop to understand the treatments required in the future.
For instance in a paradigm changing study for people with continuing symptoms but who refuse medication, Morrison et al19 demonstrated successful treatment by CBTp. This is the first time such an RCT has been carried out and the authors are to be commended on their rigour and particularly on reporting adverse events (2 in the CBTp group vs 6 in the control group). The study challenges the belief that CBTp is only an adjunct to medication treatment in more chronic populations—although we cannot yet conclude that CBTp should be recommended until we have more and larger trials.
I have written extensively about the Morrison et al study in blog posts [1, 2] and in a letter to The Lancet in which I complained about its quality and the study having been registered after it was already well in progress.
The Morrison et al. study:
- Lacked an adequate comparison control group.
- Claimed to be evaluating CBT for psychotic patients refusing medication, but in the course of the psychological treatment, most patients received some antipsychotic medication.
- Had more authors than patients retained at follow-up.
- Relied on voodoo statistics that ignored the large attrition from an already small sample in arriving at a positive conclusion.
I could go on but I think you get the picture of what I think of this study.
Keith Laws also delivered devastating blog posts about the study (such as 1 and 2), with excellent accompanying music selections. With Peter McKenna, Professor Laws also published a letter in The Lancet. Professor Laws and his colleagues also authored the meta-analysis that Wykes dismissed as “the odd one out” that “is odd.”
Despite Wykes’s evaluation, Professor Laws’ meta-analysis is much more consistent with what has been offered by the Cochrane Collaboration. He and his colleagues delivered a devastating reply to Wykes that leaves the Dame looking rather silly and ill-informed:
Laws KR, McKenna PJ, Jauhar S. The odd one out. Evidence Based Mental Health.
Professor Wykes discusses the effectiveness of CBT, contrasting the positive findings of her and co-workers’ meta-analysis in 2007 (Wykes et al, 2007) with our recent finding, in a meta-analysis using similar inclusion criteria, of small effects on positive symptoms that became nonsignificant in blind studies (Jauhar et al, 2014). As well as these two ‘comprehensive’ meta-analyses, she considers three recent selective meta-analyses by: Burns et al (2014), van der Gaag et al (2014) and Turner et al (2014). Wykes makes the point that our meta-analysis differs in its conclusions from these others. We would like to clarify why this may be the case, specifically with regard to inclusion/exclusion criteria and the reporting of effect sizes.
It is not immediately apparent why Wykes rejects the obvious explanation for why our meta-analysis had weaker findings than her similarly comprehensive meta-analysis – more recent studies over the intervening years have largely been negative. This interpretation is supported by the fact that only 1 of the 11 studies included in our meta-analysis since Wykes et al 2007, has documented a significant impact of CBT on positive symptoms. Moreover, in the one significant study (van der Gaag et al, 2011) the authors noted that blinding was compromised. It seems inherently unlikely that CBT would somehow work better in treatment-resistant than treatment-responsive patients. To maintain that CBT works against delusions and hallucinations requires rejection of the finding from a comprehensive meta-analysis of little effect against positive symptoms in favour of those of a much smaller one, which employed rather convoluted inclusion and exclusion criteria.
I have offered to debate the authors of this book, but they have consistently refused. In their absence, I have presented detailed critiques in blog posts [1, 2, 3] and what I think is at least an entertaining slideshow from an invited presentation at the Royal Edinburgh.
Prof. Michael Sharpe
Professor Sharpe has been getting a lot of media attention as one of the authors of The Lancet PACE trial of cognitive behavior therapy and graded exercise for chronic fatigue syndrome. He and his co-authors claimed that it would be damaging to their reputations to release the data in response to a Freedom of Information Act request. I did not make my request for the data under the FOIA. Instead, I took advantage of the authors having published another study in PLOS One, where acceptance was contingent on the data being available. The authors still have not complied, but they unkindly labeled me as vexatious for having made the request.
While praising the strength of their results and winning praise from the likes of Professor Sir Simon Wessely, these authors and their universities spent over £250,000 resisting the release of their data. Just days before the data were released as ordered, Professor Sharpe publicly conceded that rates of improvement were substantially lower than reported in The Lancet. The discrepancy was due to the outcomes having been switched in The Lancet.
Among the many other credibility problems faced by Professor Sharpe and his co-authors: the prestigious US Agency for Healthcare Research and Quality has declared the case definition that Sharpe et al used in the PACE trial deficient.
Using the Oxford case definition results in a high risk of including patients who may have an alternate fatiguing illness or whose illness resolves spontaneously with time. In light of this, we recommended in our report that future intervention studies use a single agreed upon case definition, other than the Oxford (Sharpe, 1991) case definition.
Prof. Rona Moss-Morris
In blogging about the PACE trial, I have said numerous things about Prof. Rona Moss-Morris, not all of them flattering, but never as unflattering as what I have said about Professors Wykes and Sharpe. Here I will only say that Professor Moss-Morris was brought in by the Science Media Centre to offer an expert reaction to the follow-up report from the PACE trial.
I provided a heavily accessed blog post on the follow-up study, an invited critical letter with Keith Laws to The Lancet Psychiatry, and a slideshow for a hastily prepared pub talk in Edinburgh that has now received over 14,000 views. I obviously don’t like the follow-up study, and you can look at any of these materials to see my reasons.
You can go to the Science Media Centre to see Professor Moss-Morris’ expert opinion. She was apparently embarrassed enough at being brought in that she declared a conflict of interest, something generally missing from this kind of publicity campaign:
Prof. Rona Moss-Morris: “Two authors of this study, Trudie Chalder and Kimberley Goldsmith, are colleagues of mine at King’s College London. I work with Trudie on other CFS work and with Kimberley on different work. I published a small study on GET in 2005. I am a National Advisor for NHS England for improving access to psychological therapies for long-term conditions and medically unexplained symptoms. Peter White (another author of the present study) is Chair of trial steering committee for an HTA NIHR-funded RCT I am working on with people with irritable bowel syndrome.”
So, here you have me offering an evaluation of a media event that I did not actually attend. How vexatious of me, but at least I offer ample documentation of my complaints and concerns. Take issue with me in the comments here if you like, or you can simply swallow what the Science Media Centre feeds you.
Institutions and websites like the SMC that brand themselves as trusted sources should offer independent critical evaluations of the claims investigators make about their favorite interventions. Otherwise, they are merely serving as public relations machines.