University and clinical trial investigators must release data to a citizen-scientist patient, according to a landmark decision in the UK. But the decision could still be overturned if the university and investigators appeal. The scientific community needs the decision to be upheld, and I’ll argue that any appeal would be unwise. The reasons given for withholding the data in the first place were archaic. Overturning the decision would set a bad precedent and would remove another tooth from already almost toothless requirements for data sharing.
We didn’t need Francis Collins, Director of the National Institutes of Health, to tell us what we already knew: the scientific and biomedical literature is untrustworthy.
And there is the new report from the UK Academy of Medical Sciences, Reproducibility and reliability of biomedical research: improving research practice.
There has been a growing unease about the reproducibility of much biomedical research, with failures to replicate findings noted in high-profile scientific journals, as well as in the general and scientific media. Lack of reproducibility hinders scientific progress and translation, and threatens the reputation of biomedical science.
Among the report’s recommendations:
- Journals mandating that the data underlying findings are made available in a timely manner. This is already required by certain publishers such as the Public Library of Science (PLOS) and it was agreed by many participants that it should become more common practice.
- Funders requiring that data be released in a timely fashion. Many funding agencies require that data generated with their funding be made available to the scientific community in a timely and responsible manner.
A consensus has been reached: the crisis in the trustworthiness of science can be overcome only if scientific data are routinely available for reanalysis. Independent replication of socially significant findings is often unfeasible, and unnecessary if original data are fully available for inspection.
Numerous governmental funding agencies and regulatory bodies are endorsing routine data sharing.
The UK Medical Research Council (MRC) 2011 policy on data sharing and preservation has endorsed principles laid out by the Research Councils UK, including:
Publicly funded research data are a public good, produced in the public interest, which should be made openly available with as few restrictions as possible in a timely and responsible manner.
To enable research data to be discoverable and effectively re-used by others, sufficient metadata should be recorded and made openly available to enable other researchers to understand the research and re-use potential of the data. Published results should always include information on how to access the supporting data.
The Wellcome Trust Policy On Data Management and Sharing opens with
The Wellcome Trust is committed to ensuring that the outputs of the research it funds, including research data, are managed and used in ways that maximise public benefit. Making research data widely available to the research community in a timely and responsible manner ensures that these data can be verified, built upon and used to advance knowledge and its application to generate improvements in health.
The Cochrane Collaboration has weighed in that there should be ready access to all clinical trial data:
Summary results for all protocol-specified outcomes, with analyses based on all participants, to become publicly available free of charge and in easily accessible electronic formats within 12 months after completion of planned collection of trial data;
Raw, anonymised, individual participant data to be made available free of charge; with appropriate safeguards to ensure ethical and scientific integrity and standards, and to protect participant privacy (for example through a central repository, and accompanied by suitably detailed explanation).
Many similar statements can be found on the web. I’m unaware of credible counterarguments gaining wide acceptance.
Yet, endorsements of routine sharing of data are only a promissory reform and depend on enforcement that has been spotty, at best. Those of us who request data from previously published clinical trials quickly realize that requirements for sharing data have no teeth. In light of that, scientists need to watch closely whether a landmark decision concerning sharing of data from a publicly funded trial is appealed and overturned.
The Decision requiring release of the PACE data
The UK’s Information Commissioner’s Office (ICO) ordered Queen Mary University of London (QMUL) on October 27, 2015 to release anonymized data from the PACE chronic fatigue syndrome trial to an unnamed complainant. QMUL has 28 days to appeal.
Even if scientists don’t know enough to care about Chronic Fatigue Syndrome/Myalgic Encephalomyelitis, they should be concerned about the reasons that were given in a previous refusal to release the data.
I took a critical look at the long-term follow-up results for the PACE trial in a previous Mind the Brain blog post and found fatal flaws in the authors’ self-congratulatory interpretation of results. Despite the authors’ claims to the contrary and their extraordinary efforts to encourage patients to report the intervention was helpful, there were simply no differences between groups at follow-up.
Background on the request for release of PACE data
- A complainant requested release of specific PACE data from QMUL under the Freedom of Information Act.
- QMUL refused the request.
- The complainant requested an internal review but QMUL maintained its decision to withhold the data.
- The complainant contacted the ICO with concerns about how the request had been handled.
- On October 27, 2015, the ICO sided with the complainant and ordered the release of the data.
A report outlines Queen Mary’s arguments for refusing to release the data and the Commissioner’s justification for siding with the patient requesting the data be released.
Reasons the request for release of the data was initially refused
The QMUL PACE investigators claimed
- They were entitled to withhold data prior to publication of planned papers.
- They were entitled to an exemption from sharing because the data contained sensitive medical information from which it was possible to identify the trial participants.
- Release of the data might harm their ability to recruit patients for research studies in the future.
The QMUL PACE researchers specifically raised concerns about a “motivated intruder” being able to facilitate re-identification of participants, arguing that:
“The PACE trial has been subject to extreme scrutiny and opponents have been against it for several years. There has been a concerted effort by a vocal minority whose views as to the causes and treatment of CFS/ME do not comport with the PACE trial and who, it is QMUL’s belief, are trying to discredit the trial. Indeed, as noted by the editor of the Lancet, after the 2011 paper’s publication, the nature of this comprised not a ‘scientific debate’ but an “orchestrated response trying to undermine the credibility of the study from patient groups [and]… also the credibility of the investigators and that’s what I think is one of the other alarming aspects of this. This isn’t a purely scientific debate; this is going to the heart of the integrity of the scientists who conducted this study.”
Bizarre. This is obviously a talented, masked, motivated intruder. Do they have evidence that Magneto is at it again? He mostly works with the good guys now, as seen in the help he gave Neurocritic and me.
Let’s think about this novel argument. I checked with University of Pennsylvania bioethicist Jon Merz, an expert who has worked internationally to train researchers and establish committees for the protection of human subjects. His opinion was clear:
The litany of excuses – not reasons – offered by the researchers and Queen Mary University is a bald attempt to avoid transparency and accountability, hiding behind legal walls instead of meeting their critics on a level playing field. They should be willing to provide the data for independent analyses in pursuit of the truth. They of course could do this willingly, in a way that would let them contractually ensure that data would be protected and that no attempts to identify individual subjects would be made (and it is completely unclear why anyone would care to undertake such an effort), or they can lose this case and essentially lose any hope for controlling distribution.
The ‘orchestrated response trying to undermine the credibility of the study’ claimed by QMUL and the PACE investigators, as well as the issue raised about the “integrity of the scientists who conducted the study,” sounds all too familiar. It’s the kind of defense heard from scientists under the scrutiny of the likes of the Open Science Collaborations, as in psychology and cancer. Reactionaries resisting post-publication peer review say we must be worried about harassment from
“replication police” “shameless little bullies,” “self-righteous, self-appointed sheriffs” engaged in a process “clearly not designed to find truth,” “second stringers” who were incapable of making novel contributions of their own to the literature, and—most succinctly—“assholes.”
Far-fetched? Compare this to a quote drawn from an April 18, 2011 interview on the Australian Broadcasting Corporation’s Radio National with Lancet Editor Richard Horton and PACE investigator Michael Sharpe, in which Horton condemned:
A fairly small, but highly organised, very vocal and very damaging group of individuals who have…hijacked this agenda and distorted the debate…
Like it or not, all scientific findings should be scrutinized, and all data relevant to the claims that are made should be available for reanalysis. Investigators simply need to live with the possibility that their claims will be proven wrong or exaggerated. This is all the more true for claims that have substantial impact on public policy, clinical services, and ultimately patient welfare.
[It is fascinating to note that Richard Horton spoke at the meeting that produced the UK Academy of Medical Sciences report to which I provided a link above. Horton covered the meeting in a Lancet editorial in which he amplified its sentiment: “The apparent endemicity of bad research behaviour is alarming. In their quest for telling a compelling story, scientists too often sculpt data to fit their preferred theory of the world.” His editorial echoed a number of the meeting report’s recommendations, but curiously omitted any mention of data sharing.]
Fortunately, the ICO has rejected the arguments of QMUL and the PACE investigators. The Commissioner found that QMUL and the PACE investigators incorrectly interpreted regulations in their withholding of the data and should provide the complainant with the data or risk being viewed as in contempt of court.
In his decision, the Commissioner found that QMUL failed to provide any plausible mechanism through which patients could be identified, even in the case of a “motivated intruder.” He was also not convinced that there is sufficient evidence to determine that releasing the data would prompt a significant number of the trial’s 640 participants to withdraw, nor that it would deter significant numbers of people from volunteering for future research.
Requirements for data sharing in the United States have no teeth, and the situation would be worsened by reversal of the ICO decision
Like the UK, the United States supposedly has requirements for sharing of data from publicly funded trials. But good luck in getting support from regulatory agencies associated with funding sources for obtaining data. Here’s my recent story, still unfolding – or maybe, sadly, over, at least for now.
For a long time I’ve fought my own battles against researchers making unwarranted claims that psychotherapy extends the lives of cancer patients. Research simply does not support the claim. The belief that psychological factors have such influence on the course and outcome of cancer sets up cancer patients to be blamed, and to blame themselves, when they don’t overcome their disease by some sort of mind control. Our systematic review concluded:
“No randomized trial designed with survival as a primary endpoint and in which psychotherapy was not confounded with medical care has yielded a positive effect.”
Investigators who conducted some of the most ambitious, well-designed trials testing the efficacy of psychological interventions on cancer but obtained null results echoed our assessment. Their commentaries were entitled “Letting Go of Hope” and “Time to Move on.”
I provided an extensive review of the literature concerning whether psychotherapy and support groups increased survival time in an earlier blog post. Hasn’t the issue of mind-over-cancer been laid to rest? I was recently contacted by a science journalist interested in writing an article about this controversy. After a long discussion, he concluded that the issue was settled — no effect had been found — and he could not succeed in pitching his idea for an article to a quality magazine.
But, as detailed here, one investigator has persisted in claims that a combination of relaxation exercises, stress reduction, and nutritional counseling increases survival time. My colleagues and I gave this 2008 study a careful look. We ran chi-square analyses of basic data presented in the paper’s tables, but none of our analyses of the effect of group assignment on mortality or disease recurrence was significant. The investigators’ claim of an effect depended on dubious multivariate analyses with covariates that could not be independently evaluated without a look at the data.
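For readers who want to see what such a check involves, here is a minimal sketch in Python. The counts are hypothetical, chosen only to illustrate the procedure of testing group assignment against outcome in a 2×2 table; they are not the trial’s published figures:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = trial arms, columns = outcome counts.
# These counts are illustrative only, not the actual published figures.
table = [
    [18, 109],  # intervention arm: deaths/recurrences, event-free
    [22, 105],  # control arm: deaths/recurrences, event-free
]

# Bivariate test of whether outcome depends on group assignment
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```

A bivariate test like this uses only the counts reported in a paper’s tables, which is exactly why it can be run without access to the raw data, unlike the multivariate analyses at issue.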
The investigator group initially attempted to block publication of our letter to the editor, citing a policy of the journal Cancer that critical letters could not be published unless the investigators agreed to respond, and they were refusing to respond. We appealed, and the journal changed its policy and allowed us additional length for our letter.
We then requested from the investigator’s University Research Integrity Officer the specific data needed to replicate the multivariate analyses in which the investigators claimed an effect on survival. The request was denied:
The data, if disclosed, would reveal pending research ideas and techniques. Consequently, the release of such information would put those using such data for research purposes in a substantial competitive disadvantage as competitors and researchers would have access to the unpublished intellectual property of the University and its faculty and students.
Recall that we were requesting in 2014 specific data needed to evaluate analyses published in 2008.
I checked with statistician Andrew Gelman whether my objections to the multivariate analyses were well-founded and he agreed they were.
Since then, another eminent statistician, Helena Kraemer, has published an incisive critique of relying, in a randomized controlled trial, on multivariate analyses when the simple bivariate analyses do not support the efficacy of interventions. She labeled adjustments with covariates a “source of false-positive findings.”
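Kraemer’s point can be illustrated with a small simulation (my own sketch, not an analysis from her paper or from either trial; all numbers below are hypothetical). Treatment is randomized and truly has no effect on the outcome, yet if an analyst adjusts for each of several candidate covariates in turn and reports the most favorable result, the false-positive rate climbs well above the nominal 5%:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def best_p_after_covariate_shopping(n=100, k=10):
    """Simulate one 'trial' in which treatment truly has no effect,
    then return the smallest treatment p-value obtainable by
    adjusting for each candidate covariate in turn."""
    t = rng.integers(0, 2, n).astype(float)          # randomized treatment arm
    covs = rng.normal(size=(n, k))                   # baseline covariates
    y = covs @ np.full(k, 0.3) + rng.normal(size=n)  # outcome: no treatment effect
    p_values = []
    for j in range(k):
        # OLS of outcome on intercept, treatment, and covariate j
        X = np.column_stack([np.ones(n), t, covs[:, j]])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        df = n - X.shape[1]
        se = np.sqrt(resid @ resid / df * np.linalg.inv(X.T @ X)[1, 1])
        t_stat = beta[1] / se                        # t-statistic for treatment
        p_values.append(2 * stats.t.sf(abs(t_stat), df))
    return min(p_values)

sims = 1000
false_positive_rate = np.mean(
    [best_p_after_covariate_shopping() < 0.05 for _ in range(sims)]
)
print(f"false-positive rate with covariate shopping: {false_positive_rate:.3f}")
```

Each individual adjusted test is valid on its own; the inflation comes entirely from choosing the covariate adjustment after seeing the results, which is why such analyses cannot be evaluated without access to the data.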
We appealed to the US Health and Human Services Office of Research Integrity (ORI) but they indicated no ability to enforce data sharing.
Meanwhile, the principal investigator who claimed an effect on survival accompanied National Cancer Institute program officers to conferences in Europe and the United States, where she promoted her intervention as effective. I complained to Robert Croyle, Director of the NCI Division of Cancer Control and Population Sciences, who has twice been one of the program officers co-presenting with her. Ironically, in his capacity as director he is supposedly facilitating data sharing for the division. Professionals were being misled to believe that this intervention would extend the lives of cancer patients, and the claim seemingly had the endorsement of the NCI.
I told Robert Croyle that if only the data for the specific analyses were released, it could be demonstrated that the claims were false. Croyle did not disagree, but indicated that there was no way to compel release of the data.
The National Cancer Institute recently offered to pay the conference fees to the International Psycho-Oncology Congress in Washington DC of any professionals willing to sign up for free training in this intervention.
I don’t think I could get any qualified professional, including Croyle, to debate me publicly as to whether psychotherapy increases the survival of cancer patients. Yet the promotion of the idea persists because it is consistent with the power of mind over body and disease, an attractive talking point.
I have not given up in my efforts to get the data to demonstrate that this trial did not show that psychotherapy extends the survival of cancer patients, but I am blocked by the unwillingness of authorities to enforce data sharing rules that they espouse.
There are obvious parallels between the politics behind persistence of the claim in the US for psychotherapy increasing survival time for cancer patients and those in the UK about cognitive behavior therapy being sufficient treatment for schizophrenia in the absence of medication or producing recovery from the debilitating medical condition, Chronic Fatigue Syndrome/Myalgic Encephalomyelitis. There are also parallels to investigators making controversial claims based on multivariate analyses, but not allowing access to data to independently evaluate the analyses. In both cases, patient well-being suffers.
If the ICO decision requiring release of data for the PACE trial is upheld in the UK, it will pressure the US NIH to stop hypocritically endorsing data sharing while rewarding investigators whose credibility depends on not sharing their data.
As seen in a PLOS One study, unwillingness to share data in response to formal requests is
associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance.
Why the PACE investigators should not appeal
In the past, PACE investigators have been quite dismissive of criticism, appearing to have assumed that being afflicted with Chronic Fatigue Syndrome/Myalgic Encephalomyelitis precludes a critic’s being taken seriously, even when the criticism is otherwise valid. However, with publication of the long-term follow-up data in Lancet Psychiatry, they are now contending with accomplished academics whose criticisms cannot be so easily brushed aside. Yes, the credibility of the investigators’ interpretations of their data is being challenged. And even if they do not believe they need to be responsive to patients, they need to be responsive to colleagues. Releasing the data is the only acceptable response, and not doing so risks damage to their reputations.
QMUL, Professors White and Sharpe, let the People’s data go.