Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts

06/18/2020
by Lutz Bornmann, et al.

This study focuses on a recently introduced type of indicator measuring disruptiveness in science. Disruptive research diverges from current lines of research by opening up new lines. In the current study, we included the initially proposed indicator of this new type, DI1 (Wu, Wang, & Evans, 2019), and several variants: DI5, DI1n, DI5n, and DEP. Since indicators should measure what they propose to measure, we investigated the convergent validity of the indicators. We used a list of milestone papers selected and published by the editors of Physical Review Letters and investigated whether this expert-based list is related to the values of the several disruption indicator variants and, if so, which variants show the highest correlation with the expert judgements. We used bivariate statistics, multiple regression models, and (coarsened) exact matching (CEM) to investigate the convergent validity of the indicators. The results show that the indicators correlate differently with the milestone paper assignments by the editors. It is not the initially proposed disruption index (DI1) that performed best, but the variant DI5, which was introduced by Bornmann, Devarakonda, Tekles, and Chacko (2019). In the CEM analysis of this study, the DEP variant, introduced by Bu, Waltman, and Huang (2019), also showed favorable results.
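For readers unfamiliar with this indicator family, the following Python sketch illustrates how DI1 and its threshold variant DI5 could be computed from a citation mapping. It assumes the widely used definition DI = (n_i - n_j) / (n_i + n_j + n_k) from Wu, Wang, and Evans (2019), with the citation threshold for n_j raised from 1 to 5 for DI5 as proposed by Bornmann et al. (2019); the function name, data structure, and threshold handling are illustrative assumptions, not the authors' implementation.

```python
from typing import Mapping, Set


def disruption_index(
    focal: str,
    references: Set[str],
    citations: Mapping[str, Set[str]],  # paper id -> set of paper ids it cites
    threshold: int = 1,                 # 1 -> DI1, 5 -> DI5 (assumed variant definition)
) -> float:
    """Compute a disruption index for a focal paper.

    n_i: papers that cite the focal paper but fewer than `threshold` of its references
    n_j: papers that cite the focal paper and at least `threshold` of its references
    n_k: papers that cite at least one of the focal paper's references but not the focal paper
    """
    n_i = n_j = n_k = 0
    for paper, cited in citations.items():
        if paper == focal:
            continue
        cites_focal = focal in cited
        shared_refs = len(cited & references)
        if cites_focal and shared_refs >= threshold:
            n_j += 1
        elif cites_focal:
            n_i += 1
        elif shared_refs >= 1:
            n_k += 1
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0


# Example: a focal paper with two references, cited by three later papers.
refs = {"R1", "R2"}
cites = {
    "A": {"FP"},              # cites only the focal paper -> n_i
    "B": {"FP", "R1", "R2"},  # cites focal paper and its references -> n_j
    "C": {"R1"},              # cites a reference but not the focal paper -> n_k
}
print(disruption_index("FP", refs, cites, threshold=1))  # DI1
print(disruption_index("FP", refs, cites, threshold=5))  # DI5 (sketch)
```

A positive value indicates that citing papers tend to ignore the focal paper's references (disruptive), a negative value that they tend to cite them as well (consolidating).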
