The end of this week has seen two very insightful postings on science and governance by Stephen Curry and Neuroskeptic.
Stephen Curry’s post “Who governs science” has what is for me a pithy summary of how science works: “No-one is in charge … That structure, or rather, lack of structure has not been arrived at by design but reflects the organic emergence of the scientific enterprise over the past several hundred years… It poses challenges for good governance but is at the same time a source of great strength.”
This is akin, in some ways, to a world-wide anarcho-syndicalist commune.
How can such “disorganisation” be productive? I would argue that it is precisely this lack of structure that is essential for creativity to flourish, because creativity in any field of culture can only occur through critical thinking. Question everything, but always question yourself first, then others. Again. And again. And again. Ad infinitum. One cannot do this within formal structures with strong lines of control. The evidence is particularly strong: human creativity, and with it science, flourishes in times of “freedom” and disappears in times of repression.
With careers and status to be gained, there is a clear temptation for some to cheat in such a system. The question posed by Stephen in his post, and by Neuroskeptic in a post entitled “Who should catch fraud”, is how the system might be policed.
Stephen argues that one way to enforce “standards”, and so prevent cheating, is Open Access. I would concur. Open Access allied to Open Data allows the entire community to peruse research findings and to raise questions. Such peer review by the entire community is, as Stephen argues, at the heart of the “disorganisation” of science, and so of creativity and critical thinking.
Neuroskeptic’s post “Who should catch fraud” looks at the process whereby fraud is unmasked, and then at the lines of responsibility running to journals, institutions and grant awarders. It ends with a statement which is, unfortunately, true: “While all of these organizations have policies for investigating and punishing fraud when it comes to light, they rarely (if ever) actually catch it, leaving this hazardous and stressful job to individuals.”
I would add that these policies are rarely enforced. That is, journals, institutions and grant awarders prefer to look intently the other way, unless the reaction from the community and the interest from the wider public are so strong that they really do have to turn their heads.
The comments thread on this post at Retraction Watch highlights two instances of institutions apparently failing to apply even the simplest principles of due diligence in hiring and promotion.
So there is a gap between Stephen’s call for open access and action: how do we translate community discussion into action, while allowing for the (very) substantial differences of opinion that must go hand in hand with the necessary “disorganisation” of science?
In a nutshell, how can the police possibly police the police? The problem is compounded by the complete absence of formal structure and organisation across science. Neuroskeptic hits the nail on the head here, pointing out that fraud is detected and pursued “1) By readers of published papers who notice oddities in the data, or 2) by internal whistleblowers, almost always junior lab members”.
The weaknesses of this system are obvious: there is a positive feedback loop that strengthens the hand of the fraudster. The richer and more powerful you are, the greater your immunity to pursuit, allowing the accumulation of still greater riches and power.
The problem is that individuals take on huge risks when they blow the whistle. I believe that a solution is under development and, in its early stages, seems to be delivering. Not perfectly, but sufficiently to make people stand up and think. The tools we have for communicating are changing rapidly, and the commenting site PubPeer is having an effect, slowly but surely. Why slowly? Because only a fraction of the community uses PubPeer. Usage is growing (stats on views and number of comments would be most interesting). As Paul Brookes points out in his article in PeerJ (here), when potential misconduct is made public, there is a far greater chance of action being taken. Couple widespread use of PubPeer, as a centralised filter and outlet for our reading and reviewing of papers and grants, with open access and open data, and we avoid vigilantism and witch hunts, and ensure that science remains “disorganised” and, most crucially, self-righting (posts on the latter here).