Adaptive nonparametric estimation in the functional linear model with functional output
In this paper, we consider a functional linear regression model in which both the covariate and the response are functional random variables. We address the problem of optimal nonparametric estimation of the conditional expectation operator in this model. A collection of projection estimators over finite-dimensional subspaces is first introduced. We provide a non-asymptotic bias-variance decomposition of the mean squared prediction error in the case where these subspaces are generated by the (empirical) functional PCA basis. The automatic bias-variance trade-off is achieved by a model selection device that chooses the best projection dimensions: the penalized contrast estimator satisfies an oracle-type inequality and is thus optimal from an adaptive point of view. These upper bounds allow us to derive convergence rates over ellipsoidal smoothness spaces. The rates are shown to be optimal in the minimax sense: they match a lower bound on the minimax risk, which is also proved. Finally, we conduct a numerical study on simulated data and on two real data sets.
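The following is a minimal, purely illustrative sketch of the kind of procedure described above: a projection estimator of the conditional expectation operator built on the empirical PCA basis of the covariate curves, with the dimension chosen by a penalized empirical risk criterion. All specifics (the simulated integral operator, the penalty shape `kappa * m / n`, the tuning constant `kappa`) are assumptions for illustration, not the paper's calibrated choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50                     # n curves observed on p grid points
t = np.linspace(0, 1, p)

# Simulated functional covariates X_i and responses Y_i = S(X_i) + noise,
# where S is a hypothetical integral operator (illustrative only).
X = np.cumsum(rng.normal(size=(n, p)), axis=1) / np.sqrt(p)   # Brownian-like paths
kernel = np.exp(-np.abs(t[:, None] - t[None, :]))             # assumed kernel of S
Y = X @ kernel.T / p + 0.1 * rng.normal(size=(n, p))

# Empirical (functional) PCA basis of the covariates, on the discretization grid.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(Xc.T @ Xc / n)
basis = eigvecs[:, np.argsort(eigvals)[::-1]]   # columns = empirical eigenfunctions

def projection_estimator(m):
    """Least-squares estimate of the operator restricted to the first m PCA directions."""
    scores = Xc @ basis[:, :m]                               # n x m score matrix
    coef, *_ = np.linalg.lstsq(scores, Y - Y.mean(axis=0), rcond=None)
    return lambda Xnew: (Xnew - X.mean(axis=0)) @ basis[:, :m] @ coef + Y.mean(axis=0)

def empirical_risk(m):
    """In-sample mean squared prediction error of the m-dimensional estimator."""
    resid = Y - projection_estimator(m)(X)
    return float(np.mean(resid ** 2))

# Penalized contrast over the candidate dimensions: empirical risk + penalty.
kappa = 1.0                                     # tuning constant, fixed here by hand
dims = range(1, 21)
crit = [empirical_risk(m) + kappa * m / n for m in dims]
m_hat = list(dims)[int(np.argmin(crit))]
print("selected dimension:", m_hat)
```

Because the projection spaces are nested, the empirical risk is non-increasing in `m`; the penalty term is what forces the selected dimension to stay moderate and mimic the bias-variance oracle.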