Validating research performance metrics against peer rankings

Citations of this article: 81
Mendeley readers: 157

Abstract

A rich and diverse set of potential bibliometric and scientometric predictors of research performance quality and importance is emerging today - from the classic metrics (publication counts, journal impact factors and individual article/author citation counts) to promising new online metrics such as download counts, hub/authority scores and growth/decay chronometrics. In and of themselves, however, metrics are circular: they need to be jointly tested and validated against what it is that they purport to measure and predict, with each metric weighted according to its contribution to their joint predictive power. The natural criterion against which to validate metrics is expert evaluation by peers; a unique opportunity to do this is offered by the 2008 UK Research Assessment Exercise, in which a full spectrum of metrics can be jointly tested, field by field, against peer rankings. © Inter-Research 2008.
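
The validation approach described in the abstract (jointly testing a battery of metrics against peer rankings, with each metric weighted by its contribution to their joint predictive power) amounts, in its simplest form, to a multiple regression of peer rankings on the candidate metrics, field by field. The sketch below illustrates that idea on synthetic data only; the metric names, weights, and dataset are illustrative assumptions, not RAE data and not the author's actual analysis.

```python
# Minimal sketch: regress a field's peer rankings on a battery of candidate
# metrics and read the fitted (standardized) coefficients as each metric's
# weight in the joint prediction. All data and metric names are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical field: 40 departments, 4 candidate metrics per department.
metrics = ["publications", "citations", "downloads", "hub_authority"]
X = rng.normal(size=(40, len(metrics)))

# Synthetic peer ranking, assumed to be driven mainly by citations and
# downloads plus noise, so the recovered weights should reflect that.
true_w = np.array([0.1, 0.6, 0.3, 0.0])
peer_rank = X @ true_w + rng.normal(scale=0.2, size=40)

# Standardize so the fitted coefficients are comparable across metrics.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
yz = (peer_rank - peer_rank.mean()) / peer_rank.std()

# Ordinary least squares fit of peer ranking on all metrics jointly.
design = np.column_stack([np.ones(len(yz)), Xz])
coef, *_ = np.linalg.lstsq(design, yz, rcond=None)
weights = dict(zip(metrics, coef[1:]))

# R^2: how much of the peer ranking the metric battery explains in this field.
pred = design @ coef
r2 = 1 - np.sum((yz - pred) ** 2) / np.sum((yz - yz.mean()) ** 2)

print("standardized weights:", {k: round(float(v), 2) for k, v in weights.items()})
print("joint R^2:", round(float(r2), 2))
```

In the actual exercise one would substitute the RAE peer rankings and the real metric values for each field, and possibly a more robust estimator than ordinary least squares; the point of the sketch is only the weighting-by-joint-predictive-power logic.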

Citation (APA)

Harnad, S. (2008). Validating research performance metrics against peer rankings. Ethics in Science and Environmental Politics, 8(1), 103–107. https://doi.org/10.3354/esep00088

Readers over time

[Chart: Mendeley readers per year, 2009–2025 (scale 0–24); not reproduced here]

Readers' Seniority

PhD / Post grad / Masters / Doc: 46 (42%)
Researcher: 29 (26%)
Professor / Associate Prof.: 26 (24%)
Lecturer / Post doc: 9 (8%)

Readers' Discipline

Social Sciences: 32 (33%)
Agricultural and Biological Sciences: 31 (32%)
Computer Science: 26 (27%)
Environmental Science: 8 (8%)

Article Metrics

Mentions
Blog Mentions: 1
References: 1

Social Media
Shares, Likes & Comments: 10
