raceBERT – A Transformer-based Model for Predicting Race and Ethnicity from Names

12/07/2021
by Prasanna Parasurama

This paper presents raceBERT – a transformer-based model for predicting race and ethnicity from character sequences in names, and an accompanying Python package. Trained on a U.S. Florida voter registration dataset, the model predicts the likelihood of a name belonging to each of the 5 U.S. census race categories (White, Black, Hispanic, Asian & Pacific Islander, American Indian & Alaskan Native). I build on Sood and Laohaprapanon (2018) by replacing their LSTM model with transformer-based models (a pre-trained BERT model and a RoBERTa model trained from scratch) and compare the results. To the best of my knowledge, raceBERT achieves state-of-the-art results in race prediction using names, with an average f1-score of 0.86 – a 4.1% improvement over the previous state-of-the-art, and improvements of 15-17% for non-white names.
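For readers who want to try the model, the accompanying Python package can be used roughly as follows. This is a minimal sketch: the RaceBERT class name, the predict_race method, and the label/score output format reflect my reading of the racebert package's public documentation and should be verified against the current docs.

    # Minimal usage sketch for the racebert package (pip install racebert).
    # RaceBERT and predict_race are assumed from the package docs; verify
    # against the current documentation before relying on them.
    from racebert import RaceBERT

    model = RaceBERT()

    # Score a full name against the five U.S. census race categories;
    # the result is a list of {"label": ..., "score": ...} dicts in the
    # Hugging Face text-classification pipeline format.
    result = model.predict_race("Barack Obama")
    print(result)

The package also exposes an ethnicity predictor in the same style; consult the raceBERT documentation for the exact label set and method names.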
