University of Cambridge - open research /taxonomy/subjects/open-research en Cambridge University signs San Francisco Declaration on Research Assessment /research/news/cambridge-university-signs-san-francisco-declaration-on-research-assessment <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/news/analysis-20302651920.jpg?itok=vl76ADtT" alt="Microscope" title="Microscope, Credit: kkolosov" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="https://sfdora.org/">DORA’s recommendations</a> call for institutions not to use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles when assessing researchers’ contributions in hiring, promotion, or funding decisions. It encourages universities, researchers and others to assess research on its own merits rather than on the basis of the journal in which the research was published, and highlights the need to capitalise on the opportunities provided by online publication.</p>&#13; &#13; <p>Professor Chris Abell, Pro-Vice-Chancellor for Research at Cambridge, said: “The University of Cambridge is committed to producing excellent research. By signing up to DORA, we want to demonstrate to our researchers that we value the quality and content of their research regardless of how and where it is published.”</p>&#13; &#13; <p>Professor Steve Russell, from the University’s Department of Genetics, will chair the DORA Working Group, which will oversee the implementation of the DORA recommendations.</p>&#13; &#13; <p>“This is an important step for the University, particularly for early career researchers, where all too often career progression is based on judgments using flawed metrics,” says Professor Russell.
“By signing DORA, the University is making a very positive step towards developing a culture where research excellence is assessed by the quality of the work and not by the title of the journal where it is published.”</p>&#13; &#13; <p><img alt="" src="/sites/www.cam.ac.uk/files/inner-images/dorabadge1.jpg" style="width: 400px; height: 173px; float: left;" />DORA calls on institutions to be explicit about the criteria used to reach hiring, tenure, and promotion decisions, clearly highlighting, especially for early-stage researchers, that the content of a paper is much more important than publication metrics or the identity of the journal in which it was published.</p>&#13; &#13; <p>In addition to research publications, DORA recommends considering the value and impact of all research outputs (including datasets and software) and a broad range of impact measures, including qualitative indicators of research impact, such as influence on policy and practice, for the purposes of research assessment.</p>&#13; &#13; <p>The University’s HR Division will begin implementing a number of changes to ensure the agreement’s recommendations are reflected across its recruitment, reward and promotions schemes.</p>&#13; &#13; <p>Brigitte Shull, Director of Scholarly Communications Research &amp; Development at Cambridge University Press, added: “The principles of DORA align with our open research strategy and ongoing activities around improved metrics and recognizing author contributions. By signing up to DORA, we want to help improve the way the quality of research is assessed and expand the range of tools to better account for a variety of research outputs.”</p>&#13; &#13; <p>In February, Cambridge became one of the first UK universities to publish <a href="https://osc.cam.ac.uk/open-research-position-statement">a position statement on Open Research</a>.
Its statement set out the key principles for the conduct and support of Open Research at the University, which aims to increase inclusivity and collaboration, unlock access to knowledge and improve the transparency and reproducibility of research.</p>&#13; &#13; <p>The recommendation to sign DORA was made at the University by the Open Research Working Group, chaired by Professor Richard Penty, and at the Press by the Open Research Steering Committee, chaired by Brigitte Shull.</p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>The University of Cambridge and Cambridge University Press announced on 8 July 2019 that they have signed up to the San Francisco Declaration on Research Assessment (DORA), a set of recommendations agreed in 2012 that seek to ensure that the quality and impact of research outputs are 'measured accurately and evaluated wisely'.</p>&#13; </p></div></div></div><div class="field field-name-field-content-quote field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even">By signing up to DORA, we want to demonstrate to our researchers that we value the quality and content of their research regardless of how and where it is published</div></div></div><div class="field field-name-field-content-quote-name field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Chris Abell</div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://pixabay.com/photos/analysis-biochemistry-biologist-2030265/" target="_blank">kkolosov</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Microscope</div></div></div><div class="field field-name-field-cc-attribute-text 
field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>. Images, including our videos, are Copyright © University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – as here, on our <a href="/">main website</a> under its <a href="/about-this-site/terms-and-conditions">Terms and conditions</a>, and on a <a href="/about-this-site/connect-with-us">range of channels including social media</a> that permit your use and sharing of our content under their respective Terms.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/public-domain">Public Domain</a></div></div></div> Mon, 08 Jul 2019 08:09:15 +0000 cjb250 206362 at 6,000 and counting: Cambridge Vice-Chancellor joins Stephen Hawking in making his PhD ‘Open Access’ /stories/6000th-thesis <div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Cambridge also becomes first UK university to publish position statement on Open Research.</p> </p></div></div></div> Wed, 27 Feb 2019 17:00:09 +0000 sjr81 203572 at Opinion: The science ‘reproducibility crisis’ – and what can be done 
about it /research/discussion/opinion-the-science-reproducibility-crisis-and-what-can-be-done-about-it <div class="field field-name-field-news-image field-type-image field-label-hidden"><div class="field-items"><div class="field-item even"><img class="cam-scale-with-grid" src="/sites/default/files/styles/content-580x288/public/news/research/discussion/56134101292091c5602do.jpg?itok=A6BEJC8V" alt="Study of Human Immune Response to HIV" title="Study of Human Immune Response to HIV, Credit: NIAID" /></div></div></div><div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p>A survey by Nature revealed that <a href="https://www.nature.com/articles/533452a">52% of researchers</a> believed there was a “significant reproducibility crisis” and 38% said there was a “slight crisis”.</p>&#13; &#13; <p>We asked three experts how they think the situation could be improved.</p>&#13; &#13; <h2>Open Research is the answer</h2>&#13; &#13; <p><em>Danny Kingsley, head of the Office of Scholarly Communication, University of Cambridge</em></p>&#13; &#13; <p>The solution to the scientific reproducibility crisis is to move towards <a href="https://osc.cam.ac.uk/open-research">Open Research</a> – the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process. We need to reward the publication of research outputs along the entire process, rather than just each journal article as it is published.</p>&#13; &#13; <p>As well as other research outputs – such as data sets – we should reward research productivity itself, as well as the thought process and planning behind the study. This is why <a href="http://neurochambers.blogspot.co.uk/2013/04/scientific-publishing-as-it-was-meant_10.html">Registered Reports</a> was launched in 2013, where researchers register the proposal and how the research will be conducted, before any experimental work commences. 
It allows editorial decisions to be based on the rigour of the experimental design, and increases the likelihood that the findings could be replicated.</p>&#13; &#13; <p>In the UK there is now a requirement from most <a href="https://www.data.cam.ac.uk/funders">funders</a> that the data underpinning a research publication is made available. However, although there are moves towards open research, many argue against the sharing of data among the research community.</p>&#13; &#13; <figure class="align-center "><img alt="" src="https://cdn.theconversation.com/files/160520/width754/image-20170313-9613-2cfmqw.jpg" /><figcaption><em><span class="caption">Questionable findings are often hidden.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/product-researching-marketing-team-work-loft-425326300?src=WZtYxmdFeSANhTM2RN1K6w-2-98">Shutterstock</a></span></em></figcaption></figure><p>Researchers often write multiple papers from a single data set, and many fear that if this data is released with the first publication then the researcher will be “scooped” by another research group, who will publish findings from similar data sets before the original authors get the chance to publish follow-up articles – to gain maximum credit for the work. If the publication of data itself could be recorded as a “research output”, then being scooped would no longer be such an issue, as credit will already have been given.</p>&#13; &#13; <p><a href="https://journals.plos.org:443/plosone/article?id=10.1371/journal.pone.0026828">One benefit of sharing data</a> could be an improvement in its quality – as previous research has shown. 
And there have been small steps towards this goal, such as a <a href="https://force11.org/info/joint-declaration-of-data-citation-principles-final/">standard method of citing data</a>.</p>&#13; &#13; <p>We also need to publish “null” results – those that do not support the hypothesis – to prevent other researchers wasting time repeating work. There are a few publication outlets for this, and a <a href="https://techcrunch.com/2017/02/28/researchgate-raises-52-6m-for-its-social-research-network-for-scientists/">recent press release from ResearchGate</a> indicated that it supports the sharing of failed experiments through its “project” offering. It lets users upload and track experiments as they are happening – meaning no one knows how they will turn out.</p>&#13; &#13; <h2>Psychology is leading the way out of crisis</h2>&#13; &#13; <p><em>Jim Grange, senior lecturer in psychology, Keele University</em></p>&#13; &#13; <p>To me, it is clear that there is a reproducibility crisis in psychological science, and across all sciences. Murmurings of low reproducibility began in 2011 – the “<a href="https://ejwagenmakers.com/2012/Wagenmakers2012Horrors.pdf">year of horrors</a>” for psychology – with a high-profile fraud case. But since then, <a href="https://osf.io/vmrgu/">the Open Science Collaboration</a> has published the findings of a large-scale effort to closely replicate 100 studies in psychology. Only 36% of them could be replicated.</p>&#13; &#13; <p>The <a href="https://arxiv.org/abs/1205.4251">incentive structures</a> in universities and the attitude that you “publish or perish” mean that researchers prioritise “getting it published” over “getting it right”. It also means that some, implicitly or explicitly, use questionable research practices to achieve publication. These may include failing to report parts of data sets or trying different analytical approaches to make the data fit what you want to say. 
It could also mean presenting exploratory research as though it was originally confirmatory (designed to test a specific hypothesis).</p>&#13; &#13; <p>However, many psychology journals now recommend or require the preregistration of studies, which <a href="https://www.apa.org/science/about/psa/2015/08/pre-registration">allows researchers to detail their predictions</a>, experimental protocols, and planned analytical strategy before data collection. This provides confidence to readers that no questionable research practices have occurred.</p>&#13; &#13; <figure class="align-center "><img alt="" src="https://cdn.theconversation.com/files/160499/width754/image-20170313-19247-57184o.jpg" /><figcaption><em><span class="caption">Erasing data: a questionable research practice.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/erasing-data-correction-fluid-427863787?src=1XEqIKb5SpySP5ZUCpLmZg-1-69">Shutterstock</a></span></em></figcaption></figure><p><a href="https://www.elsevier.com/connect">Registered Reports</a> has taken this further. But of course, once results are produced, isolated findings don’t mean much until they have been replicated.</p>&#13; &#13; <p>I make efforts to replicate results before trying to publish. You’d be forgiven for thinking that replication attempts are common in science, but this is simply not the case. Journals seek novel theories and findings, and view replications as treading over old ground, which offers little incentive for career-minded academics to conduct replications.</p>&#13; &#13; <p>This has also led to the introduction of <a href="https://www.psychologicalscience.org/publications/replication">Registered Replication Reports</a> in <a href="https://journals.sagepub.com/home/pps">Perspectives on Psychological Science</a>. This is where teams of researchers each follow identical procedures independently and aim to replicate important findings from the literature. 
A single paper then collates and analyses them to establish the size and reproducibility of the original study.</p>&#13; &#13; <p>Although psychology is leading the way for improvements with these pioneering initiatives, it is certainly not out of the woods. But it has started to move beyond a crisis and make impressive strides – more disciplines need to follow suit.</p>&#13; &#13; <h2>This is a publication bias crisis</h2>&#13; &#13; <p><em>Ottoline Leyser, director of the Sainsbury Laboratory, University of Cambridge</em></p>&#13; &#13; <p>Reproducibility is a fundamental building block of science. If two people do the same experiment, they should get the same result. But there are many good reasons why two “identical” experiments might not give the same result, such as unknown differences that have not been considered – and some <a href="http://www.plantcell.org/content/2/4/279.abstract">exciting discoveries have been made this way</a>.</p>&#13; &#13; <p>So if a lack of reproducibility is itself not necessarily a problem, why is everybody talking about a crisis? In some cases poor practice and corner-cutting have contributed to a lack of reproducibility, and there have been some <a href="https://www.science.org/news/2012/11/final-report-stapel-affair-points-bigger-problems-social-psychology">high-profile cases of out-and-out fraud</a>. It’s a major concern, but what is causing it?</p>&#13; &#13; <p>In 2014 I chaired a project on the research culture in Britain for the <a href="https://www.nuffieldbioethics.org/publication/the-culture-of-scientific-research-the-findings-of-a-series-of-engagement-activities-exploring-the-culture-of-scientific-research-in-the-uk/">Nuffield Council on Bioethics</a>, which was motivated by <a href="https://theconversation.com/the-dark-side-of-research-when-chasing-prestige-becomes-the-prize-35001">concerns about research integrity</a> including over-claiming, rushing prematurely to publication and incorrect use of statistics. 
The main conclusions were that poor practice is incentivised by hyper-competition with overly narrow rules for winning.</p>&#13; &#13; <p>There is an excessive focus on the publication of groundbreaking results in prestigious journals. But science cannot only be groundbreaking: there is a lot of important digging to do after new discoveries, yet there is not enough credit in the system for this work, and it may remain unpublished because researchers prioritise their time on the eye-catching papers, hurriedly put together.</p>&#13; &#13; <p>The reproducibility crisis is actually a publication bias crisis, which is driven by the reward structures in the research system. Various approaches have been suggested to address these problems, such as pre-registration of experiments. However, the research landscape is highly diverse and this type of solution is only sensible for some research types. The most widely relevant solution is to change the reward structures. In the UK there is a major opportunity to do this by reforming the <a href="https://theconversation.com/qanda-what-is-the-ref-and-how-is-the-quality-of-university-research-measured-35529">Research Excellence Framework</a> (REF). Through the REF, public money is allocated to universities based on the “quality” of the four best research outputs, usually papers, produced by each of their principal investigators over approximately six years, and it disproportionately rewards groundbreaking research.</p>&#13; &#13; <p>We need reward for a portfolio of research outputs, including not only the headline-grabbing results, but also confirmatory work and community data sharing, which are the hallmarks of a truly high-quality research endeavour. 
This would go a long way to shifting the current destructive culture.</p>&#13; &#13; <p><em><span><a href="https://theconversation.com/profiles/ottoline-leyser-147196">Ottoline Leyser</a>, Director of the Sainsbury Laboratory, <a href="https://theconversation.com/institutions/university-of-cambridge-1283">University of Cambridge</a>; <a href="https://theconversation.com/profiles/danny-kingsley-3258">Danny Kingsley</a>, Head of the Office of Scholarly Communication, <a href="https://theconversation.com/institutions/university-of-cambridge-1283">University of Cambridge</a>, and <a href="https://theconversation.com/profiles/jim-grange-344560">Jim Grange</a>, Senior Lecturer in Psychology, <a href="https://theconversation.com/institutions/keele-university-1012">Keele University</a></span></em></p>&#13; &#13; <p><em>This article was originally published on <a href="https://theconversation.com/">The Conversation</a>. Read the <a href="https://theconversation.com/the-science-reproducibility-crisis-and-what-can-be-done-about-it-74198">original article</a>.</em></p>&#13; </div></div></div><div class="field field-name-field-content-summary field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><p>Reproducibility is the idea that an experiment can be repeated by another scientist and they will get the same result. It is important for showing that the claims of an experiment are true and for making them useful in further research. However, science appears to have an issue with reproducibility. 
</p>&#13; </p></div></div></div><div class="field field-name-field-image-credit field-type-link-field field-label-hidden"><div class="field-items"><div class="field-item even"><a href="https://www.flickr.com/photos/niaid/5613410129/" target="_blank">NIAID</a></div></div></div><div class="field field-name-field-image-desctiprion field-type-text field-label-hidden"><div class="field-items"><div class="field-item even">Study of Human Immune Response to HIV</div></div></div><div class="field field-name-field-cc-attribute-text field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><p><a href="http://creativecommons.org/licenses/by/4.0/" rel="license"><img alt="Creative Commons License" src="https://i.creativecommons.org/l/by/4.0/88x31.png" style="border-width:0" /></a><br />&#13; The text in this work is licensed under a <a href="http://creativecommons.org/licenses/by/4.0/" rel="license">Creative Commons Attribution 4.0 International License</a>. For image use please see separate credits above.</p>&#13; </div></div></div><div class="field field-name-field-show-cc-text field-type-list-boolean field-label-hidden"><div class="field-items"><div class="field-item even">Yes</div></div></div><div class="field field-name-field-license-type field-type-taxonomy-term-reference field-label-above"><div class="field-label">Licence type:&nbsp;</div><div class="field-items"><div class="field-item even"><a href="/taxonomy/imagecredit/attribution">Attribution</a></div></div></div> Mon, 20 Mar 2017 09:57:15 +0000 cjb250 186372 at