A group of psychologists is launching a project this week that they hope will make studies in their field radically more transparent and prompt other
fields to open up as well. With a pledge of $5.25 million from private supporters, they have set up an outfit called the Center for Open Science. It is collaborating with an established journal, Perspectives on Psychological Science, to solicit submissions from authors who are willing to work completely in the open and have their studies
replicated. Authors will be asked first to publish an experimental design and then, after a public vetting, to collect data. Findings will appear in a separate
publication. Authors will get credit for all steps in this process: experimental designs, peer review, delivering results, and replicating them.
In an ideal world, all research would be as transparent as this, argues Brian Nosek, a psychologist at the University of Virginia in Charlottesville. The
new venture is the brainchild of Nosek and his graduate student Jeffrey Spies. They say it grew naturally out of a quest by Nosek and others to test how much of psychological science is reproducible.
But the center's plans go far beyond psychology and replicability. For example, it will promote a publishing model that involves peer review at the
earliest stage of research, in which critiques of experimental designs are transparent rather than anonymous. Perspectives is adding a special
section to test the model.
Insider spoke with Nosek about the Center for Open Science and its goals. The conversation has been edited for brevity.
Q: What about the worry that all scientists have of getting scooped?
There are two answers to this. The first is that registering a study actually prevents one from getting scooped, because precedence is established by the date
the project is made public. The same worry was voiced about manuscript sharing in physics when arXiv.org was emerging,
and it faded when people realized that precedence is established as soon as research is put out into the open, not by when it is ultimately published.
For the worrywart, the second answer is that it is possible to register the study privately. Registration and making a study public are distinct behaviors. So the
researcher can register the study privately, with only minimal details publicly viewable to note that a registration has occurred. This preserves privacy
until the researcher is confident that they cannot be scooped.
Q: How much money do you need to really make this work?
So far we have $5.25 million committed with a grant from the Laura and John Arnold Foundation. Three additional funders have approached us in the last six
months, and all asked us to submit a proposal. We believe these proposals have a high likelihood of success. (I have never experienced anything like this!)
For the current projects supported by the center, we aim for a budget of $15 million for the first 5 years. We have other projects in conception phases
that would require additional resources, and are speaking with funders about these possibilities already.
Q: What are your top three goals?
One: Open the entire research workflow. For example, in addition to the Open Science Framework, we plan to
facilitate open access and put downward pressure on publishing fees by creating open-source software for journal administration.
Two: Diversify means of reputation building. Currently, the primary means of reputation building is authorship on journal articles. This creates a very
specific expectation of what makes a good scientist. Opening the workflow creates opportunities to get similar credit for other activities. For example,
any component of the research process can be a citable node in the framework. Data is citable, methods are citable, analytic tools and software are
citable. And, once the commenting engine is added, it will enable open and continuous review. Instead of a private peer-review system that offers no reward
for reviewers, an open system provides a means of crediting those who contribute to science by critiquing it.
Three: Coordinate journals, societies, and universities to improve scientific practices and create more sensible and sustainable funding models for
scientific communication. For example, we are creating models and infrastructure to support registered replication studies, like the one cited in the Association for Psychological Science's press release. This is a test of the model that we will make widely available.