PhD Defense: 'New Neural Networks based on Extreme Learning Machine'

The extreme learning machine (ELM) is a popular neural network that has two major drawbacks for large-scale data sets: the need to tune the number of hidden neurons, and the pseudo-inversion of the hidden activation matrix. This thesis proposes algorithms that keep the simplicity and speed of the ELM network and: 1) avoid tuning and bound the size of the hidden activation matrix; 2) raise the ELM performance through a suitable choice of the random bias values; 3) speed up hyper-parameter tuning by reducing the number of training runs required.
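To make the two drawbacks concrete, the following is a minimal sketch of a standard single-hidden-layer ELM in Python/NumPy, not the algorithms proposed in the thesis. The hidden-layer size, activation function, and toy data are illustrative assumptions; the sketch highlights where the number of hidden neurons enters as a hyper-parameter and where the pseudo-inversion of the hidden activation matrix H occurs.

```python
# Minimal standard ELM sketch (illustrative, not the thesis's algorithms).
import numpy as np

def elm_train(X, T, n_hidden=100, rng=None):
    """Fit a basic ELM: random input weights/biases, closed-form output weights."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (not trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (not trained)
    H = np.tanh(X @ W + b)                           # hidden activation matrix, shape (N, n_hidden)
    beta = np.linalg.pinv(H) @ T                     # pseudo-inversion of H: costly for large N
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage on random regression data.
X = np.random.rand(500, 10)
T = np.sin(X.sum(axis=1, keepdims=True))
W, b, beta = elm_train(X, T, n_hidden=50, rng=0)   # n_hidden must normally be tuned
Y = elm_predict(X, W, b, beta)
```

In this baseline form, `n_hidden` must be selected by trial and error and `np.linalg.pinv(H)` grows expensive as the number of samples increases, which are exactly the two issues the thesis addresses.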

Supervisor: Manuel Fernández Delgado