Sparse Probability of Agreement

08/12/2022
by Jeppe Nørregaard, et al.

Measuring inter-annotator agreement is important for annotation tasks, but many metrics require a fully annotated dataset (or subset) in which every annotator annotates every sample. We define Sparse Probability of Agreement (SPA), which estimates the probability of agreement when not all annotator-item pairs are available. We show that, under some assumptions, SPA is an unbiased estimator, and we provide several weighting schemes for handling samples with different numbers of annotations, evaluated over a range of datasets.
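To illustrate the general idea, here is a minimal sketch of a probability-of-agreement-style estimate computed from sparse annotations, where each item may be labelled by a different subset of annotators. The function name, data layout, and the two weighting options (uniform per item, or proportional to the number of annotator pairs) are illustrative assumptions and not necessarily the exact estimator or weighting schemes defined in the paper.

```python
# Sketch: a probability-of-agreement estimate over sparsely annotated items.
# Assumptions: annotations are given as {item_id: {annotator: label}}, and
# per-item agreement is the fraction of agreeing annotator pairs.
from itertools import combinations

def sparse_probability_of_agreement(annotations, weighting="pairs"):
    """Estimate agreement when not all annotator-item pairs are available."""
    item_agreements, item_weights = [], []
    for labels in annotations.values():
        values = list(labels.values())
        if len(values) < 2:
            continue  # items with a single annotation carry no pairwise signal
        pairs = list(combinations(values, 2))
        agreement = sum(a == b for a, b in pairs) / len(pairs)
        # Hypothetical weighting schemes: weight by number of pairs, or uniformly.
        weight = len(pairs) if weighting == "pairs" else 1.0
        item_agreements.append(agreement)
        item_weights.append(weight)
    total_weight = sum(item_weights)
    return sum(a * w for a, w in zip(item_agreements, item_weights)) / total_weight

# Example: three items, each annotated by a different subset of annotators.
data = {
    "item1": {"ann_a": "pos", "ann_b": "pos", "ann_c": "neg"},
    "item2": {"ann_a": "neg", "ann_c": "neg"},
    "item3": {"ann_b": "pos", "ann_c": "pos", "ann_d": "pos"},
}
print(sparse_probability_of_agreement(data, weighting="pairs"))  # ~0.714
```

The choice of weighting matters because items with many annotations contribute many more agreeing or disagreeing pairs than items with only two; the paper evaluates multiple such schemes across datasets.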
