Expert Failure: Re-evaluating Research Assessment

EDITORIAL © 2013 Eisen et al. Funding organisations, scientists, and the general public need robust and reliable ways to evaluate the output of scientific research. In this issue of PLOS Biology, Adam Eyre-Walker and Nina Stoletzki analyse the subjective assessment and citations of more than 6,000 published papers [1]. They show that expert assessors are biased by the impact factor (IF) of the journal in which the paper has been published and cannot consistently and independently judge the “merit” of a paper or predict its future impact, as measured by citations. They also show that citations themselves are not a reliable way to assess merit, as they are inherently highly stochastic. In a final twist, the authors argue that the IF is probably the least-bad metric amongst the small set that they analyse, concluding that it is the best surrogate of the merit of individual papers currently available.

Bibliographic Details
Main Authors: Eisen, J.A., MacCallum, C.J., Neylon, Cameron
Format: Journal Article
Language: English
Published: Public Library of Science, 2013
DOI: 10.1371/journal.pbio.1001677
Subjects: Science & Technology; Life Sciences & Biomedicine; Biochemistry & Molecular Biology; Biology; Life Sciences & Biomedicine - Other Topics
Online Access: http://hdl.handle.net/20.500.11937/81467
License: http://creativecommons.org/licenses/by/4.0/