Symbolic regression outperforms other models for small data sets

03/28/2021
by Casper Wilstrup, et al.

Machine learning is often applied to obtain predictions and new understanding of complex phenomena and relationships, but the availability of sufficient data for model training is a widespread problem. Traditional machine learning techniques such as random forests and gradient boosting tend to overfit when working with data sets of a few hundred samples. This study demonstrates that for small training sets of 250 observations, symbolic regression is a superior alternative to these machine learning models, providing better accuracy while preserving the interpretability of linear models and decision trees. In 132 out of 240 cases, the symbolic regression model performs better than any of the other models on the out-of-sample data. The second-best algorithm is a random forest, which performs best in 37 of the 240 cases. When the comparison is restricted to interpretable models, symbolic regression performs best in 184 out of 240 cases.
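As a rough illustration of the kind of head-to-head comparison the study describes, the sketch below fits a symbolic regression model alongside a random forest and a gradient boosting model on a small training set of 250 observations and scores each on held-out data. It uses gplearn's SymbolicRegressor and a synthetic scikit-learn task as stand-ins; the abstract does not name the symbolic regression engine or the data sets actually used in the study.

```python
# Minimal sketch of a small-data model comparison, assuming gplearn's
# SymbolicRegressor as the symbolic regression implementation and a
# synthetic task in place of the study's data sets.
from gplearn.genetic import SymbolicRegressor
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Small training set of 250 observations, as in the study.
X, y = make_friedman1(n_samples=500, noise=0.5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=250, random_state=0
)

models = {
    "symbolic regression": SymbolicRegressor(
        population_size=1000, generations=20, random_state=0
    ),
    "random forest": RandomForestRegressor(random_state=0),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
}

# Fit each model on the small training set and score on held-out data.
for name, model in models.items():
    model.fit(X_train, y_train)
    r2 = r2_score(y_test, model.predict(X_test))
    print(f"{name}: out-of-sample R^2 = {r2:.3f}")
```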

