Online Forgetting Process for Linear Regression Models
Motivated by the EU's "Right To Be Forgotten" regulation, we initiate a study of statistical data deletion problems in which users' data are accessible only for a limited period of time. We formulate this setting as an online supervised learning task with a constant memory limit. We propose a deletion-aware algorithm, FIFD-OLS, for the low-dimensional case, and observe a catastrophic rank-swinging phenomenon caused by the data deletion operation, which leads to statistical inefficiency. As a remedy, we propose the FIFD-Adaptive Ridge algorithm with a novel online regularization scheme that effectively offsets the uncertainty introduced by deletion. In theory, we provide cumulative regret upper bounds for both online forgetting algorithms. In experiments, FIFD-Adaptive Ridge outperforms ridge regression with a fixed regularization level, and we hope this work sheds some light on more complex statistical models.
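The first-in-first-out deletion (FIFD) setting described above can be illustrated with a minimal sliding-window ridge regression sketch. This is not the paper's algorithm: the window size `window` and the adaptive rule for `lam` below are illustrative assumptions, shown only to convey how a constant memory limit, FIFO deletion, and regularization interact.

```python
import numpy as np

class SlidingWindowRidge:
    """Ridge regression over a FIFO window of retained samples.

    Illustrative sketch of the FIFD setting: older samples are deleted
    once the constant memory limit `window` is reached. The adaptive
    regularization rule in `fit` is a placeholder, not the paper's scheme.
    """

    def __init__(self, dim, window):
        self.dim = dim
        self.window = window  # constant memory limit
        self.X = []           # retained feature rows
        self.y = []           # retained responses

    def update(self, x, y):
        self.X.append(x)
        self.y.append(y)
        if len(self.X) > self.window:
            # FIFO deletion: the oldest sample is forgotten.
            self.X.pop(0)
            self.y.pop(0)

    def fit(self, lam=None):
        X = np.asarray(self.X)
        y = np.asarray(self.y)
        if lam is None:
            # Placeholder adaptive rule: regularize more strongly when the
            # retained design matrix is close to rank-deficient, which is
            # when deletion-induced rank swings hurt plain OLS.
            s = np.linalg.svd(X, compute_uv=False)
            lam = max(1e-6, 1.0 - float(s[-1]))
        A = X.T @ X + lam * np.eye(self.dim)
        return np.linalg.solve(A, X.T @ y)
```

For example, streaming noiseless samples from a fixed linear model through `update` and then calling `fit` recovers the true coefficients from only the most recent `window` samples.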