The San Francisco Declaration on Research Assessment

Biology Open ◽  
2013 ◽  
Vol 2 (6) ◽  
pp. 533-534 ◽  
Author(s):  
J. W. Raff


2018 ◽  
Vol 69 (4) ◽  
pp. 183-189 ◽  
Author(s):  
Terje Tüür-Fröhlich

Abstract: A growing number of scholarly societies, journals, institutions and researchers are protesting against and fighting the "almighty" Journal Impact Factor. The best-known initiative of protest and recommendations is DORA, The San Francisco Declaration on Research Assessment. The criticism targets the flawed, distorted and opaque nature of quantitative evaluation procedures and their negative effects on academic staff, especially on early-career researchers and their scholarly development, and in particular the subtle discrimination against the humanities and social sciences. We should not remain uncritically trapped in the metrics paradigm and cheer the flood of new indicators coming out of scientometrics. The slogan "Putting Science into the Assessment of Research" must not be understood in a narrowly scientistic sense. Social phenomena cannot be studied with the methods of the natural sciences alone. Critique and transformation of the social activities that call themselves "evaluation" require perspectives from the social sciences and the philosophy of science. Evaluation is not a value-neutral enterprise; it is closely tied to power, domination and the distribution of resources.


2016 ◽  
Vol 1 ◽  
Author(s):  
J. Roberto F. Arruda ◽  
Robin Champieux ◽  
Colleen Cook ◽  
Mary Ellen K. Davis ◽  
Richard Gedye ◽  
...  

A small, self-selected discussion group was convened to consider issues surrounding impact factors at the first meeting of the Open Scholarship Initiative in Fairfax, Virginia, USA, in April 2016. It focused on the uses and misuses of the Journal Impact Factor (JIF), with a particular emphasis on research assessment. The group's report notes that the widespread use, or perceived use, of the JIF in research assessment processes lends the metric a degree of influence that is not justified on the basis of its validity for those purposes, and that this retards moves to open scholarship in a number of ways. The report concludes that indicators, including those based on citation counts, can be combined with peer review to inform research assessment, but that the JIF is not one of those indicators. It also concludes that there is already sufficient information about the shortcomings of the JIF, and that actions should instead be pursued to build broad momentum away from its use in research assessment. These actions include practical support for the San Francisco Declaration on Research Assessment (DORA) by research funders, higher education institutions, national academies, publishers and learned societies. They also include the creation of an international "metrics lab" to explore the potential of new indicators, and the wide sharing of information on this topic among stakeholders. Finally, the report acknowledges that the JIF may continue to be used as one indicator of the quality of journals, and makes recommendations for how this use should be improved.

OSI2016 Workshop Question: Impact Factors
Tracking the metrics of a more open publishing world will be key to selling "open" and encouraging broader adoption of open solutions. Will more openness mean lower impact, though (for whatever reason: less visibility, less readability, less press, etc.)? Why or why not? Perhaps more fundamentally, how useful are impact factors anyway? What are they really tracking, and what do they mean? What are the pros and cons of our current reliance on these measures? Would faculty be satisfied with an alternative system as long as it is recognized as reflecting meaningfully on the quality of their scholarship? What might such an alternative system look like?


2017 ◽  
Vol 28 (22) ◽  
pp. 2941-2944 ◽  
Author(s):  
Sandra L. Schmid

The San Francisco Declaration on Research Assessment (DORA) was penned 5 years ago to articulate best practices for how we communicate and judge our scientific contributions. In particular, it adamantly declared that the Journal Impact Factor (JIF) should never be used as a surrogate measure of the quality of individual research contributions, or for hiring, promotion, or funding decisions. Since then, a heightened awareness of the damaging practice of using JIFs as a proxy for the quality of individual papers, and as a measure of an individual's or institution's accomplishments, has led to changes in policy and to the design and application of best practices that more accurately assess the quality and impact of our research. Herein I summarize the considerable progress made and the remaining challenges that must be met to ensure a fair and meritocratic approach to research assessment and the advancement of research.


Development ◽  
2013 ◽  
Vol 140 (13) ◽  
pp. 2643-2644 ◽  
Author(s):  
Olivier Pourquié

Author(s):  
David Moher ◽  
Lex Bouter ◽  
Sabine Kleinert ◽  
Paul Glasziou ◽  
Mai Har Sham ◽  
...  

The primary goal of research is to advance knowledge. For that knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous and transparent at all stages of design, execution and reporting. Initiatives such as the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto have led the way in bringing much-needed global attention to the importance of taking a considered, transparent and broad approach to assessing research quality. Since publication in 2012, the DORA principles have been signed by over 1,500 organizations and nearly 15,000 individuals. Despite this significant progress, assessment of researchers still rarely includes considerations related to trustworthiness, rigor and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity, with a specific focus on the need to drive research improvement by ensuring that researchers are explicitly recognized and rewarded (i.e., their careers are advanced) for behavior that leads to trustworthy research. The HKPs have been developed with the idea that their implementation could assist in how researchers are assessed for career advancement, with a view to strengthening research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle we provide a rationale for its inclusion and examples where these principles are already being adopted.


2020 ◽  
Author(s):  
Graham Smith ◽  
Andrew Hufton

<p>Researchers are increasingly expected by funders and journals to make their data available for reuse as a condition of publication. At Springer Nature, we feel that publishers must support researchers in meeting these additional requirements, and must recognise the distinct opportunities data holds as a research output. Here, we outline some of the varied ways that Springer Nature supports research data sharing and report on key outcomes.</p><p>Our staff and journals are closely involved with community-led efforts, like the Enabling FAIR Data initiative and the COPDESS 2014 Statement of Commitment<sup>1-4</sup>. The Enabling FAIR Data initiative, which was endorsed in January 2019 by <em>Nature</em> and <em>Scientific Data</em>, and by <em>Nature Geoscience</em> in January 2020, establishes a clear expectation that Earth and environmental sciences data should be deposited in FAIR<sup>5</sup> Data-aligned community repositories, when available (and in general-purpose repositories otherwise). In support of this endorsement, <em>Nature</em> and <em>Nature Geoscience</em> require authors to share and deposit their Earth and environmental science data, and <em>Scientific Data</em> has committed to progressively updating its list of recommended data repositories to help authors comply with this mandate.</p><p>In addition, we offer a range of research data services, with various levels of support available to researchers in terms of data curation, expert guidance on repositories, and linking research data and publications.</p><p>We appreciate that researchers face potentially challenging requirements in terms of the 'what', 'where' and 'how' of sharing research data. These can be particularly difficult for researchers to negotiate given the huge diversity of policies across different journals. We have therefore developed a series of standardised data policies, which have now been adopted by more than 1,600 Springer Nature journals. 
</p><p>We believe that these initiatives make important strides in addressing the current replication crisis and the economic<sup>6</sup> and societal consequences of data unavailability. They also offer an opportunity to drive change in how academic credit is measured, through the recognition of a wider range of research outputs than articles and their citations alone. As a signatory of the San Francisco Declaration on Research Assessment<sup>7</sup>, Nature Research is committed to improving the methods of evaluating scholarly research. Research data in this context offers new mechanisms to measure the impact of all research outputs. To this end, Springer Nature supports the publication of peer-reviewed data papers through journals like <em>Scientific Data</em>. Analyses of citation patterns demonstrate that data papers can be well cited, and offer a viable way for researchers to receive credit for data sharing through traditional citation metrics. Springer Nature is also working hard to improve support for direct data citation. In 2018, a data citation roadmap developed by the Publishers Early Adopters Expert Group was published in <em>Scientific Data</em><sup>8</sup>, outlining practical steps for publishers to implement data citation, with associated benefits in transparency and credit for researchers. 
Using examples from this roadmap, its implementation and supporting services, we outline how a FAIR-led data approach from publishers can help researchers in the Earth and environmental sciences to capitalise on new expectations around data sharing.</p><p>__</p><ol><li>https://doi.org/10.1038/d41586-019-00075-3</li> <li>https://doi.org/10.1038/s41561-019-0506-4</li> <li>https://copdess.org/enabling-fair-data-project/commitment-statement-in-the-earth-space-and-environmental-sciences/</li> <li>https://copdess.org/statement-of-commitment/</li> <li>https://www.force11.org/group/fairgroup/fairprinciples</li> <li>https://op.europa.eu/en/publication-detail/-/publication/d375368c-1a0a-11e9-8d04-01aa75ed71a1</li> <li>https://sfdora.org/read/</li> <li>https://doi.org/10.1038/sdata.2018.259</li> </ol>

