Do bibliometrics and altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime, altmetrics, and citation data

11/20/2017
by Lutz Bornmann, et al.

Purpose: In this study, we address the unclear meaning of altmetrics by studying their relationship with scientific quality (measured by peer assessments). Design: In the first step, we analyze the underlying dimensions of measurement for traditional metrics (citation counts) and altmetrics, using principal component analysis (PCA). In the second step, we test the relationship between these dimensions and the quality of papers (as measured by the post-publication peer-review system of F1000Prime assessments), using regression analysis. Results: The results of the PCA show that altmetrics measure different things: Mendeley counts are related to citation counts, whereas tweets form a separate dimension. The Altmetric Attention Score does not contribute substantially to the relevant principal components. The results of the regression analysis indicate that citation-based metrics and readership counts are significantly more strongly related to quality than tweets and the Altmetric Attention Score. On the one hand, this result calls into question the use of Twitter counts and the Altmetric Attention Score for research evaluation; on the other hand, it indicates the potential usefulness of Mendeley reader counts. Originality: Only a few studies have previously investigated the relationship between altmetrics and assessments by peers. The relationship is important to study: if these data are to be used in research evaluation, altmetrics should be related to quality.
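The following is a minimal sketch of the two-step design described in the abstract, not the authors' code: it runs a PCA over standardized metric columns and then regresses a peer-assessment score on the resulting components. The column names, the synthetic data, and the use of plain linear regression (in place of whatever regression model the study actually employed) are all illustrative assumptions.

```python
# Hedged sketch of the two-step analysis (PCA, then regression).
# Column names and data are synthetic; real data would hold per-paper
# citation counts, Mendeley readers, tweets, the Altmetric Attention
# Score, and F1000Prime peer assessments.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500  # number of synthetic papers

df = pd.DataFrame({
    "citations": rng.poisson(20, n),
    "mendeley_readers": rng.poisson(30, n),
    "tweets": rng.poisson(5, n),
    "altmetric_score": rng.poisson(8, n),
    "f1000_score": rng.integers(1, 4, n),  # peer assessment (1-3)
})

# Step 1: PCA on standardized metrics to find underlying dimensions.
metrics = ["citations", "mendeley_readers", "tweets", "altmetric_score"]
X = StandardScaler().fit_transform(df[metrics])
pca = PCA(n_components=2)
components = pca.fit_transform(X)
print("Loadings:\n", pd.DataFrame(pca.components_, columns=metrics))

# Step 2: regress the peer-assessment score on the principal components
# (a linear model here as a simplification; F1000 scores are ordinal).
reg = LinearRegression().fit(components, df["f1000_score"])
print("Coefficients per component:", reg.coef_)
```

On real data, the loadings would show which metrics cluster on the same dimension (e.g. citations with Mendeley readers, tweets apart), and the regression coefficients would show which dimensions are associated with the peer-assessed quality score.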
