A meta-analysis of interventions training non-mental health professionals to deal with mental health problems

When garbage in isn’t necessarily garbage out.

I often see statements that meta-analyses can be no better than the literature on which they draw. The point is often underscored with something like “garbage in, garbage out” (GIGO).

I certainly agree that we should not conduct synthetic meta-analyses aimed at isolating a single effect size when the available literature consists of small, similarly flawed studies. Yet that is done entirely too often.

Many Cochrane Collaboration reviews depend on only a small handful of randomized controlled trials. The reviews typically acknowledge the high risk of bias, but policymakers, clinicians, and advocates for the treatments under review seize on the encouraging effect sizes, ignoring the limited quantity and quality of the evidence.

In part that’s a failure of Cochrane, but in part it also reflects how hungry consumers are for confident reviews of the literature, even when such reassurance is just not possible.

[Image: John Tukey]

I think a meta-analysis of a literature characterized by mediocre studies can be valuable if it doesn’t attempt to provide an overall effect size, but only identifies the ways in which the current literature is deficient and how it should be corrected. These are analytic or diagnostic meta-analyses, with the diagnostic assessment being applied to the adequacy of the existing literature and how it can be improved.

That’s why I think a recent review of anti-stigma and other mental health training programs for non-mental health professionals is so valuable.

Booth A, Scantlebury A, Hughes-Morley A, Mitchell N, Wright K, Scott W, McDaid C. Mental health training programmes for non-mental health trained professionals coming into contact with people with mental ill health: a systematic review of effectiveness. BMC Psychiatry. 2017 May 25;17(1):196.

The review is explicit about the low quality of the literature, pointing out that most studies don’t even evaluate the key question of whether the people with mental health problems who come into contact with these professionals benefit from what are often expensive intervention programs.

The review also points out that many studies use idiosyncratic and inadequately validated outcome measures to assess whether interventions work. That is inexcusable, because the existing literature, along with validated measures of effects on the target population, is readily available to anyone designing such studies.

At best, these intervention programs have only short-term benefits for the attitudes of the professionals receiving them, with little assessment of long-term benefits or of impact on the target population. This is hardly a recommendation for large-scale programs without better evidence that they work.

Agencies funding expensive intervention programs often require evaluation components. Too bad that those conducting such programs don’t fulfill their responsibility to provide an adequate demonstration that what they are being paid to provide actually works.

We should be quite skeptical of the claims that are made for anti-stigma and other educational programs targeting non-mental health professionals. The burden of proof is on those who market such programs, and the conflict of interest in making extravagant claims should be recognized.

We should get real about the unrealistic assumptions behind such programs. Namely, the programs are predicated on the assumption that we can select a few professionals, expose them to brief interventions with unvalidated content, and expect them to react expertly when suddenly thrown into situations involving persons with mental health problems. The intervention programs are typically too weak and unfocused. They don’t prepare professionals very well to respond effectively to the unexpected, but fortunately infrequent, encounters in which how they perform is so critically important.

I was once asked to apply for an NIMH grant that would prepare primary care physicians to respond more effectively to older patients who were suicidal but not expressing their intent directly. I declined to submit an application after I calculated that a physician would encounter such a situation only about once every 18 months. It would take a huge randomized trial to demonstrate any effect. But NIMH nonetheless funded a trial doomed to be uninformative before it even enrolled its first physician.
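To see why, here is a rough back-of-envelope sketch with entirely hypothetical numbers of my own choosing: one qualifying encounter per physician every 18 months, a two-year trial, training that raises the rate of an “appropriate response” from 30% to 45% of encounters, and the standard two-proportion sample-size formula with no adjustment for clustering of encounters within physicians.

```python
import math

# Back-of-envelope sketch (hypothetical numbers throughout) of why a rare
# encounter rate forces an enormous trial.

# Assumed encounter rate: one qualifying patient per physician every 18 months,
# i.e. ~1.33 encounters per physician over a 2-year trial.
encounters_per_physician = 24 / 18

# Hypothetical effect: training raises the "appropriate response" rate
# from 30% to 45% of encounters.
p1, p2 = 0.30, 0.45
p_bar = (p1 + p2) / 2

# Standard two-proportion sample-size formula,
# alpha = 0.05 (two-sided, z = 1.96), power = 0.80 (z = 0.84).
z_alpha, z_beta = 1.96, 0.84
n_encounters_per_arm = (
    (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
     + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    / (p1 - p2) ** 2
)

# Convert required encounters into physicians, ignoring clustering of
# encounters within physicians, which would only push the number higher.
physicians_per_arm = n_encounters_per_arm / encounters_per_physician
print(f"~{math.ceil(n_encounters_per_arm)} encounters per arm "
      f"=> ~{math.ceil(physicians_per_arm)} physicians per arm (before clustering)")
```

Even with these generous assumptions, the calculation calls for well over a hundred physicians per arm; more realistic effect sizes and a design effect for clustering would push the number far higher.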

What a systematic search yielded and what could be concluded

From 8578 search results, 19 studies met the inclusion criteria: one systematic review, 12 RCTs, three prospective non-RCTs, and three non-comparative studies.

The training interventions identified included broad mental health awareness training and packages addressing a variety of specific mental health issues or conditions. Trainees included police officers, teachers and other public sector workers.

Some short term positive changes in behaviour were identified for trainees, but for the people the trainees came into contact with there was little or no evidence of benefit.

Conclusions

A variety of training programmes exist for non-mental health professionals who come into contact with people who have mental health issues. There may be some short term change in behaviour for the trainees, but longer term follow up is needed. Research evaluating training for UK police officers is needed in which a number of methodological issues need to be addressed.

The studies included in the systematic review were all conducted in the USA. Eight of the 19 primary studies took place in the USA, three in Sweden, three in England, two in Australia, and one each in Canada, Scotland and Northern Ireland. Participants included teachers, public health professionals, university resident advisors, community practitioners, public sector staff, and case workers. Law enforcement participants were trainee, probationary, university campus, and front-line police officers.

The review noted that there is no excuse for the poor assessment of outcomes in these programs:

A recent systematic review of the measurement properties of tools measuring mental health knowledge recommends using tools with an evidence base which reach the threshold for positive ratings according to the COSMIN checklist [42].