ICompELM - Independent Component Analysis Based Extreme Learning Machine

Single Layer Feed-forward Neural Networks (SLFNs) have many applications in various fields of statistical modelling, especially time-series forecasting. However, training such networks via the widely used gradient-based backpropagation algorithm has major disadvantages, such as convergence to local minima, dependence on the learning rate, and long training times. These concerns were addressed by Huang et al. (2006) <doi:10.1016/j.neucom.2005.12.126>, who introduced the Extreme Learning Machine (ELM), a very fast learning algorithm for SLFNs that randomly chooses the weights connecting the input and hidden nodes and analytically determines the output weights. The ELM shows good generalization performance, but it is still subject to a high degree of randomness.

To mitigate this issue, this package uses a dimensionality reduction technique given in Hyvarinen (1999) <doi:10.1109/72.761722>, namely Independent Component Analysis (ICA), to determine the input-hidden connections and thus remove randomness from the algorithm. This leads to a robust, fast and stable ELM model. Using functions within this package, the proposed model can also be compared with an existing alternative based on the Principal Component Analysis (PCA) algorithm given by Pearson (1901) <doi:10.1080/14786440109462720>, i.e., the PCA-based ELM model of Castano et al. (2013) <doi:10.1007/s11063-012-9253-x>, which greatly inspired the implemented ICA-based algorithm.
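The sketch below illustrates the general idea in plain R: ICA (via the fastICA package) supplies the input-to-hidden weights deterministically, and the output weights are then obtained analytically with the Moore-Penrose pseudoinverse (MASS::ginv), as in a standard ELM. The function names ica_elm_fit and ica_elm_forecast are illustrative only and are not the ICompELM package's API.

```r
# Minimal sketch of an ICA-based ELM for one-step-ahead time-series
# forecasting. All names here are illustrative, not ICompELM functions.
library(fastICA)
library(MASS)

sigmoid <- function(x) 1 / (1 + exp(-x))

# Build a lagged design matrix from a univariate series:
# column 1 of embed() is y[t], the rest are the lagged inputs.
embed_series <- function(y, lags) {
  emb <- embed(y, lags + 1)
  list(X = emb[, -1, drop = FALSE], y = emb[, 1])
}

ica_elm_fit <- function(y, lags = 4, n_comp = 3) {
  d   <- embed_series(y, lags)
  ica <- fastICA(d$X, n.comp = n_comp)    # ICA on the lagged inputs
  W_in <- ica$K %*% ica$W                 # input-to-hidden weights, no randomness
  Xc   <- scale(d$X, center = TRUE, scale = FALSE)
  H    <- sigmoid(Xc %*% W_in)            # hidden-layer output matrix
  beta <- ginv(H) %*% d$y                 # analytic output weights (pseudoinverse)
  list(W_in = W_in, beta = beta, center = colMeans(d$X), lags = lags)
}

ica_elm_forecast <- function(model, y) {
  x_new <- rev(tail(y, model$lags))       # most recent lagged values, newest first
  h <- sigmoid((x_new - model$center) %*% model$W_in)
  as.numeric(h %*% model$beta)
}

# Example: one-step-ahead forecast of the built-in AirPassengers series
fit <- ica_elm_fit(as.numeric(AirPassengers), lags = 12, n_comp = 4)
ica_elm_forecast(fit, as.numeric(AirPassengers))
```

A PCA-based variant in the spirit of Castano et al. (2013) would follow the same structure, with the ICA step replaced by a projection onto the leading principal components (e.g. prcomp) to form the input-to-hidden weights.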
