When the topic of open science comes up, most of us are likely to think of journals that make their content freely accessible on-line. This is, however, just the tail-end of an extensive, multi-faceted process, capturing results in their completed form. The concept of open science has much more to say about that process, and where it can go wrong.
More specifically, open science considers the reliability and reproducibility of the resulting information. These two principles are usually hailed as hallmarks of the scientific method, and likely to be taken for granted by readers of scholarly publications. Unfortunately, much of this material is far less reliable and reproducible than most of us would assume. In fact, the quality of science can vary dramatically in the day-to-day world of research, compromised by limited budgets, eagerness to publish, institutional pressures, and habitual corner-cutting.
“It’s not a few bad actors — it’s the entire system of research,” said Robert Thibault, a post-doctoral researcher at Stanford University, whose work assesses the calibre of scientific research. Speaking at a webinar in January organized by the Canadian Science Policy Centre, he outlined the goals of the recently established Canadian Reproducibility Network (CaRN).
“One is to improve the outputs of research, to make research publications and data-sets more trustworthy, reliable, and reproducible,” he said. “The second aspect is to try to make the research process healthier and more vigorous, so that people can do the type of research they want to do.”
Kelly Cobey, a scientist with the University of Ottawa Heart Institute, who serves on CaRN’s steering committee with Thibault, argued that science poorly serves the interest of the society that supports it.
“Research waste is the norm,” she said. “The vast majority of research is conducted in a way that doesn’t set it up to be reproducible, to be transparent, and to be rigorous. Everything from the questions we ask, to the way we design our studies, to the way we manage our studies, and the way we report our studies.”
Cobey cited statistics from a Canadian study of 6,720 research trials conducted between 2009 and 2019. The findings indicated that only 59 percent of those trials were publicly registered, and only 39 percent subsequently reported their results to the registry. For findings from Canadian sites, some 48 percent were subsequently published in an academic journal.
“Less than half of the trials that are registered go on to be published,” Cobey observed. “That’s highly problematic to me, as a member of the public.”
It gets worse, she added. Among those studies conducted exclusively in Canada, which accounted for about half of the total, only three percent were registered, reported their results to the registry, and published their results in a journal. In contrast, among the studies conducted with international partners, that proportion rose to 41 percent.
“Things like the development of systematic reviews are pretty negligible in our system right now,” argued Cobey. “You can get funding to investigate a question without having synthesized the evidence — to me, that’s a little bizarre. We don’t want our tax dollars going to invisible research that never gets published, or gets published selectively and supports irreproducibility.”
Fishy findings
Such complaints were recently voiced by a group of 16 professors and research scientists working on different aspects of fisheries, epidemiology, and aquaculture. They wrote an open letter to Fisheries Minister Joyce Murray, pointing out the technical flaws in a Science Advisory Report from the Department of Fisheries and Oceans’ Canadian Science Advisory Secretariat, which concluded that parasitic sea lice infestations in wild salmon could not be blamed on the sea lice found in nearby salmon farming operations.
The letter details seven distinct problems they found in the way the report was assembled, including improper statistical analyses, a lack of reference to the current state of scientific knowledge on this topic, and the inability of outside parties to assess the conclusion because the underlying data were not published.
“The Science Response Report in no way overturns the accumulated scientific evidence that salmon farms are one of the primary drivers of sea louse infestations on nearby wild juvenile salmon,” the letter concluded, arguing it “fails to meet widely accepted scientific standards on numerous fronts, and therefore falls well short of the quality of science advice that you need to make informed decisions on the future of salmon aquaculture in Canada.”
This criticism blamed a lack of reproducibility on a lack of data, which consequently undermines the contribution science can make to public policy. In some cases, researchers deliberately withhold their original data, especially if they are competing with others to be the first to announce a major discovery. More typically, though, such evidence is omitted simply because the researchers lack the resources, or the interest, to make this data publicly accessible.
“We have some support staff within an institution, like librarians, but what we often don’t have is research data management support on hand — statisticians, people who can quality-check our code, quality-check our data,” said Thibault. “This puts a lot of responsibility on busy researchers, especially at the end of their work, when it becomes much more difficult to implement open science principles. We need funding and effort that comes at the beginning, and makes researchers’ lives easier.”
Such help is sometimes offered, according to Margaret Blakeney, a senior policy analyst with the federal government’s Secretariat on Responsible Conduct of Research. This body, which was created by the three main granting agencies — CIHR, NSERC, and SSHRC — helps institutions and individuals aspire to better research practices, and handles breaches that contravene these agencies’ guidelines.
“Many institutions, in response to problems that have arisen, make it a habit to sit down with research teams, soon after they receive their funding, to talk to them, not just about data management issues, but also a variety of issues related to the responsible management of their research project,” she said. “I really like that idea of building time and attention at that step.”
Nevertheless, Cobey insisted, such interactions are the exception rather than the norm. Institutions express their support for international standards set by agreements such as the Declaration on Research Assessment (DORA), with little or no consideration of how they will live up to those standards. She pointed out that the clinical trials study, which revealed Canada’s poor record of research accountability, was conducted by a third-year medical student with no prior experience in this field, who received a $5,000 grant to carry out the investigation as a summer project. This happenstance approach reflects the prevailing view on the reliability and reproducibility of research.
“We need to choose strategic priorities, but it feels like right now, we sign on to DORA, but we don’t have an implementation plan,” she said. “We’re likely to fall short of our desirable goals if we don’t have the monitoring, funding, and supports in place.”
Nor has there been any forum for even broaching these topics, she concluded. “There is not currently a lot of discussion, or opportunity for that. This is the first such panel in Canada I’ve ever been engaged in.”