Adaptive Extreme Learning Machine for Recurrent Beta-basis Function Neural Network Training
The Beta Basis Function Neural Network (BBFNN) is a special kind of kernel basis neural network. It is a feedforward network characterized by the use of the Beta function as the hidden activation function. Beta is a flexible transfer function that can represent richer shapes than the commonly used activation functions. As with every network, the architecture design and the learning method are two main challenges faced by the BBFNN. In this paper, a new architecture and a new training algorithm are proposed for the BBFNN. An Extreme Learning Machine (ELM) is used as the training approach for the BBFNN with the aim of speeding up the training process. ELM reduces computing time and complexity compared with previously used BBFNN learning algorithms such as backpropagation, OLS, etc. For the architectural design, a recurrent structure is added to the standard BBFNN architecture in order to make it better suited to complex, nonlinear, and time-varying problems. Throughout this paper, the proposed recurrent ELM-trained BBFNN is tested on a number of tasks related to time series prediction, classification, and regression. Experimental results show noticeable gains of the proposed network over common feedforward and recurrent networks trained by ELM with the hyperbolic tangent activation function. These gains are in terms of accuracy and robustness against data corruptions such as noisy signals.
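The sketch below illustrates the two ingredients named in the abstract: an ELM training step (random hidden weights, closed-form least-squares output weights) combined with a Beta-shaped activation. It is a minimal illustration, not the paper's method: the Beta parameterization shown (a bump on an interval (x0, x1) with shape exponents p and q), the function names `beta_activation`, `elm_train`, and `elm_predict`, and all parameter values are assumptions for demonstration, and the recurrent feedback path of the proposed architecture is not reproduced.

```python
# Minimal ELM sketch with a Beta-shaped activation (illustrative assumptions,
# not the paper's exact formulation; the recurrent structure is omitted).
import numpy as np

def beta_activation(z, x0=-1.0, x1=1.0, p=2.0, q=2.0):
    """Beta-shaped transfer function: bell-like on (x0, x1), zero outside."""
    xc = (p * x0 + q * x1) / (p + q)            # centre of the Beta bump
    inside = (z > x0) & (z < x1)
    out = np.zeros_like(z)
    zi = z[inside]
    out[inside] = ((zi - x0) / (xc - x0)) ** p * ((x1 - zi) / (x1 - xc)) ** q
    return out

def elm_train(X, T, n_hidden=40, rng=None):
    """ELM: random hidden-layer weights, output weights by least squares."""
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                 # random biases
    H = beta_activation(X @ W + b)                            # hidden-layer outputs
    beta_out = np.linalg.pinv(H) @ T                          # Moore-Penrose solution
    return W, b, beta_out

def elm_predict(X, W, b, beta_out):
    return beta_activation(X @ W + b) @ beta_out

# Usage: fit a noisy sine curve on inputs scaled to [-1, 1]
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
T = np.sin(np.pi * X) + 0.05 * np.random.default_rng(0).normal(size=X.shape)
W, b, beta_out = elm_train(X, T, n_hidden=40, rng=0)
print("train MSE:", np.mean((elm_predict(X, W, b, beta_out) - T) ** 2))
```

Because the hidden weights are drawn once at random and only the output weights are solved for, training reduces to a single pseudoinverse, which is the source of the speed-up the abstract attributes to ELM over iterative schemes such as backpropagation.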