ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units using Chebyshev Approximations

11/07/2019
by   Shanshan Tang, et al.

In a recent paper [B. Li, S. Tang and H. Yu, arXiv:1903.05858, to appear in Commun. Comput. Phys., 2019], we showed that deep neural networks with rectified power units (RePU) can approximate sufficiently smooth functions better than those with rectified linear units, by stably converting polynomial approximations given in power series into deep neural networks with optimal complexity and no approximation error. In practice, however, power series are not easy to compute. In this paper, we propose a new and more stable way to construct deep RePU neural networks using Chebyshev polynomial approximations. Exploiting the hierarchical structure of Chebyshev polynomial approximation in the frequency domain, we build efficient and stable deep neural network constructions. In theory, ChebNets and deep RePU networks based on power series share the same upper error bounds for general function approximation; numerically, however, ChebNets are much more stable. The constructed ChebNets can be further trained, yielding much better results than training deep RePU networks constructed from power series.
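The two ingredients of the abstract can be illustrated in a few lines: a RePU activation max(0, x)^s, and a Chebyshev approximation of a smooth target function whose error decays rapidly with the degree. The sketch below uses NumPy's `numpy.polynomial.chebyshev` module and is an assumption-laden illustration, not the authors' ChebNet construction; the target function `np.cos` and degree 10 are chosen arbitrarily for demonstration.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def repu(x, s=2):
    """Rectified power unit: max(0, x)**s (s=1 recovers ReLU)."""
    return np.maximum(0.0, x) ** s

# A RePU network can represent x**2 exactly with two units,
# since x**2 = max(0, x)**2 + max(0, -x)**2 -- this exactness
# is what "no approximation error" refers to for polynomials.
x = np.linspace(-1.0, 1.0, 201)
square_via_repu = repu(x, 2) + repu(-x, 2)

# Chebyshev approximation of a smooth function on [-1, 1]:
# interpolate at Chebyshev nodes and evaluate the error.
f = np.cos
deg = 10
nodes = np.cos(np.pi * (np.arange(deg + 1) + 0.5) / (deg + 1))
coeffs = C.chebfit(nodes, f(nodes), deg)
max_err = np.max(np.abs(C.chebval(x, coeffs) - f(x)))
```

For a smooth target, the Chebyshev coefficients decay rapidly, so `max_err` is already near machine precision at modest degree; this well-conditioned basis is the motivation for preferring Chebyshev expansions over raw power series when building the networks.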
