research transparency

Scientists Just Published Ambitious New Guidelines for Conducting Better Research

Photo: Steve Drake/Corbis

The Michael LaCour scandal, which broke in May and culminated in the retraction of what appeared to be a groundbreaking study in Science, thrust the conversation about scientific integrity into the mainstream spotlight. But among researchers themselves, that conversation had been ongoing for a while. Even as the scandal was unfolding, the Center for Open Science’s (COS) Transparency and Openness Promotion Committee, which consisted of researchers, journal editors, and others with a frontline view of the scientific publishing process (mostly from the social sciences), was working on a plan to make the scientific process a bit more transparent — and, hopefully, trustworthy.

Today the group released its guidelines via an article in (perhaps fittingly) Science, lead-authored by Brian Nosek, a University of Virginia psychologist and research-transparency advocate, and they’re ambitious. In short, the plan sets up eight standards that science journals can choose to adopt, each with three levels of stringency. All are geared toward making research more transparent and more accurate, and toward shifting the norms of how findings are published so that sketchy results are less likely to gain prominence.

Take the “data transparency” standard, for example, which deals with whether the authors of a given study choose to make their raw data available, making it easier for other researchers to, among other things, check that the authors’ findings stand up to scrutiny and truly fit the data they’ve collected. At level one, articles in a journal have to “state whether data are available and, if so, where to access them” — meaning that if a researcher wants to publish a finding without making their data available, they’ll at least have to explicitly state that they won’t be sharing their data. At the much more stringent level three, “Data must be posted to a trusted repository, and reported analyses will be reproduced independently before publication.” In other words, the article can’t even be published until a second group of researchers re-runs the numbers and says yes, they check out.
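For readers who find the tiered structure easier to picture as a lookup table, here is a minimal, purely illustrative Python sketch of how a journal’s choice on the data-transparency standard might be encoded. The wording for levels one and three is quoted or paraphrased from the article; the wording for level two is an assumption, and the names (DATA_TRANSPARENCY_LEVELS, describe_policy) are invented for the example, not anything published by COS.

```python
# Illustrative sketch only: a toy encoding of the "data transparency" standard's
# stringency levels as described in the article. Level 2's wording is assumed.
DATA_TRANSPARENCY_LEVELS = {
    1: "Article states whether data are available and, if so, where to access them",
    2: "Data must be posted to a trusted repository (assumed wording)",
    3: "Data must be posted to a trusted repository, and reported analyses "
       "will be reproduced independently before publication",
}

def describe_policy(standard: str, level: int) -> str:
    """Return a human-readable summary of a journal's adoption choice."""
    if level not in DATA_TRANSPARENCY_LEVELS:
        raise ValueError("This sketch models levels 1 through 3 only")
    return f"{standard} (level {level}): {DATA_TRANSPARENCY_LEVELS[level]}"

if __name__ == "__main__":
    # A journal adopting the strictest tier of the data-transparency standard
    print(describe_policy("Data transparency", 3))
```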

“There are some other groups that have tried to develop guidelines for research for varieties of disciplines, but it’s one of the first that’s been this broad,” said Sara Bowman, project manager at COS. At the moment, 111 journals and 33 organizations, including big-name ones like Science, Psychological Science, and the Alfred P. Sloan Foundation, have signed on to the guidelines, meaning they’ll spend the next year determining which of the standards they’ll adopt, and at what levels of stringency.

Their decisions will go a long way toward determining which norms and practices emerge as a result of this effort — if a given guideline ends up being widely ignored or adopted only at the least stringent level, it could slow progress toward transparency on that particular front. That said, not all journals have the same resources, and in many cases there will be legitimate reasons not to adopt these guidelines at their most stringent levels. “I also think there are probably some differences in disciplines,” said Bowman. Journals that deal less with original data, for example, will naturally pay less attention to guidelines pertaining to data transparency.

The direct goal of these guidelines, Bowman explained, is to nudge researchers toward better, more transparent practices — and therefore better research — not to detect fraud. But one goal contributes to the other. “It’s a transparency initiative,” she said. “It’s not about rooting out fraud, but one of the consequences of greater transparency is that these things might come to light.”
