Multivariate nonparametric regression by least squares Jacobi polynomials approximations

02/02/2022
by Asma BenSaber, et al.

In this work, we study a random orthogonal projection based least squares estimator for the stable solution of a multivariate nonparametric regression (MNPR) problem. More precisely, given an integer d ≥ 1 corresponding to the dimension of the MNPR problem, a positive integer N ≥ 1 and a real parameter α ≥ -1/2, we show that every regression function in a fairly large class of d-variate functions is well and stably approximated by its random projection over the orthonormal set of tensor-product d-variate Jacobi polynomials with parameters (α, α). The associated univariate Jacobi polynomials have degree at most N, and their tensor products are orthonormal over [0,1]^d with respect to the associated multivariate Jacobi weights. In particular, if we consider n random sampling points 𝐗_i following the d-variate Beta distribution with parameters (α+1, α+1), we give a relation among n, N and α that ensures that the resulting (N+1)^d × (N+1)^d random projection matrix is well conditioned. Moreover, we provide bounds for the integrated squared error as well as the L^2-risk error of this estimator. Precise estimates of these errors are given in the case where the regression function belongs to an isotropic Sobolev space H^s([0,1]^d) with s > d/2. Also, to handle the general and practical case of an unknown distribution of the 𝐗_i, we use Shepard's scattered interpolation scheme to generate fairly precise approximations of the observed data at n i.i.d. sampling points 𝐗_i following a d-variate Beta distribution. Finally, we illustrate the performance of our proposed multivariate nonparametric estimator through numerical simulations on synthetic as well as real data.
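To make the construction concrete, below is a minimal sketch of the projection estimator the abstract describes; it is not the authors' code. It assumes d = 2, degree N = 5, Jacobi parameter α = 0, a hypothetical regression function cos(2πx₁)·x₂ with additive noise, and uses scipy.special.eval_jacobi for the univariate polynomials. The basis is normalized so that the tensor products are orthonormal in expectation under the Beta(α+1, α+1) sampling density, which is what makes the empirical Gram matrix close to the identity when n is large relative to (N+1)^d; the Shepard interpolation preprocessing step is omitted.

```python
# Minimal sketch (under stated assumptions) of a least squares estimator on a
# tensor-product Jacobi polynomial basis, with Beta-distributed sampling points.
import numpy as np
from itertools import product
from scipy.special import eval_jacobi, gammaln, betaln

def phi(k, alpha, x):
    """Degree-k Jacobi polynomial P_k^{(alpha,alpha)} mapped to [0,1], scaled so
    that E[phi_j(X) phi_k(X)] = delta_{jk} when X ~ Beta(alpha+1, alpha+1)."""
    # log of the standard squared L^2 norm of P_k^{(alpha,alpha)} on [-1,1]
    log_h = ((2*alpha + 1)*np.log(2.0) - np.log(2*k + 2*alpha + 1)
             + 2*gammaln(k + alpha + 1) - gammaln(k + 2*alpha + 1)
             - gammaln(k + 1))
    # rescale for the change of variable t = 2x - 1 and the Beta density
    log_norm2 = (log_h - np.log(2.0) - alpha*np.log(4.0)
                 - betaln(alpha + 1, alpha + 1))
    return eval_jacobi(k, alpha, alpha, 2.0*x - 1.0) * np.exp(-0.5*log_norm2)

def design_matrix(X, N, alpha):
    """Columns = tensor-product basis functions over all (N+1)^d multi-indices."""
    n, d = X.shape
    uni = [np.stack([phi(k, alpha, X[:, j]) for k in range(N + 1)], axis=1)
           for j in range(d)]
    cols = [np.prod([uni[j][:, kj] for j, kj in enumerate(idx)], axis=0)
            for idx in product(range(N + 1), repeat=d)]
    return np.stack(cols, axis=1)

rng = np.random.default_rng(0)
d, N, alpha, n = 2, 5, 0.0, 4000
# i.i.d. sampling points with independent Beta(alpha+1, alpha+1) coordinates
X = rng.beta(alpha + 1, alpha + 1, size=(n, d))
# hypothetical smooth regression function, observed with noise
y = np.cos(2*np.pi*X[:, 0]) * X[:, 1] + 0.1*rng.standard_normal(n)

A = design_matrix(X, N, alpha)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # projection coefficients
print("cond of the empirical Gram matrix:", np.linalg.cond(A.T @ A / n))
```

In this sketch the printed condition number illustrates the paper's well-conditioning claim: when n is large enough relative to (N+1)^d, the empirical Gram matrix A^T A / n concentrates around the identity, so the least squares problem is stable.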
