UK enquiry into peer review Part II: scientific analysis vs impact
Posted Jun 02 2011 9:27am
Following on from my post yesterday about the ongoing UK House of Commons Select Committee on Science and Technology peer review enquiry, the oral session that Faculty of 1000 (F1000) took part in included a long and interesting discussion about splitting the review process into two constituent parts, as the Public Library of Science (PLoS) has done. The first part focuses purely on the analysis of the science pre-publication; the second, a post-publication process, focuses on the impact of the work.
When it comes to impact assessment, that is of course where F1000 comes in. The Committee Chair, MP Andrew Miller, joked that F1000 was the systematic approach where PLoS One was the X-Factor approach! I think we agreed that all the metrics have their problems if used alone, and so the challenge is to create as many metrics as possible and then bring them together in a sensible way to get a better understanding of the true impact of a piece of work. In fact, at the Beyond Impact workshop run by Cameron Neylon last month, a group of us worked to create a mock-up of a ‘live CV’ (Total Impact) which assembles all of a researcher’s outputs (not just research articles), associates each output with as many metrics as possible, and can be updated live. This dashboard approach can be expanded to cover the output of whole research groups, institutions or recipients of grants from a particular funder.
The discussions on the scientific-analysis part of peer review led to a debate about reproducibility, and we reminded the panel that, except in extremely rare cases (certainly in biology and medicine), reproducibility cannot normally be checked within the timeframe of the standard peer review process. And as Malcolm Read (Executive Secretary, JISC) said with reference to papers published in climate science and geology: ‘you can’t repeat an earthquake’. You can say ‘this seems ok’ but not ‘it is ok’.
Of course, better data sharing would allow for easier checks on reproducibility and thereby probably reduce fraudulent behaviour. In fact, as an increasing number of people are realising, data sharing has a great many benefits, including the re-use of data by others (alone or in combination with other datasets), which should significantly advance science and hence benefit the public purse through the funders. Dr Andrew Sugden (Deputy Editor & International Managing Editor, Science magazine) voiced concerns about being able to share data privately during the peer review process, although Dryad has now launched a version of their data-sharing platform that enables this.
In fact, we suggested making data deposition mandatory. True, standards are not yet available for all data types, but a group called BioSharing is already compiling a list of existing data standards and working with publishers and others to identify areas that still require them, so the lack of universal standards shouldn’t stop us sharing. Other issues, such as the file sizes of some datasets, may also make this more difficult, but with cloud computing and ongoing computational advances, this shouldn’t prevent us from sharing going forward either.
Mandates are certainly not the only answer, as they can be hard to enforce (the NIH and UK funders are finding it difficult to enforce their OA mandates), so the single thing that will probably have the greatest impact is making sure there are enough incentives for researchers to deposit their data. And that is where our upcoming initiative, F1000 Research, will come in, publishing peer-reviewed data publications that carry their own citations …but that is for another blog post!
Returning to the enquiry itself, I think we all left still wondering what its purpose is and what the Committee hopes to achieve. This was made even more evident when, at the end of the latest session, following five long rounds of oral evidence, MP Graham Stringer suggested that perhaps they should instead have been looking at the commercial pressures on both journal editors and researchers.
[Please note that the quotes taken from the latest evidence transcripts are from uncorrected documents, and the final form of their publication has not yet been approved by the Committee]