Discussion of article "Deep Neural Networks (Part VI). Ensemble of neural network classifiers: bagging"
New article Deep Neural Networks (Part VI). Ensemble of neural network classifiers: bagging has been published:
The article discusses methods for building and training ensembles of neural networks with a bagging structure. It also examines the specifics of hyperparameter optimization for the individual neural network classifiers that make up the ensemble. The quality of the optimized neural network obtained in the previous article of the series is compared with the quality of the created ensemble of neural networks. Possibilities for further improving the ensemble's classification quality are considered.
Although the hyperparameters of the individual classifiers in the ensemble were chosen intuitively and are clearly not optimal, a high and stable classification quality was obtained with both averaging and simple majority voting.
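As a quick illustration of the two combiners mentioned above, here is a minimal Python sketch (not the article's own code) showing how averaging of class probabilities and simple majority voting produce the final ensemble label. The ensemble outputs below are random placeholders, assumed only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Suppose an ensemble of 7 classifiers each outputs P(class = 1)
# for 10 test samples (values here are synthetic stand-ins).
ensemble_probs = rng.uniform(size=(7, 10))

# Averaging combiner: mean probability across members, thresholded at 0.5.
avg_labels = (ensemble_probs.mean(axis=0) > 0.5).astype(int)

# Majority-voting combiner: each member casts a hard vote,
# the class with more votes wins.
member_votes = (ensemble_probs > 0.5).astype(int)
vote_labels = (member_votes.sum(axis=0) > member_votes.shape[0] / 2).astype(int)

print("averaging :", avg_labels)
print("voting    :", vote_labels)
```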
To summarize all of the above: schematically, the whole process of creating and testing an ensemble of neural networks can be divided into four stages:
Fig.3. Structure of training and testing the ensemble of neural networks with the averaging/voting combiner
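The sketch below mirrors the overall flow shown in the figure: bootstrap the training data, train several small neural network classifiers, and evaluate the combined ensemble on a held-out test set. It uses scikit-learn's BaggingClassifier with MLP base learners purely as a generic stand-in, not the article's actual toolchain; the dataset and all sizes are assumed for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Stage 1: prepare the data (synthetic here; the article uses market-derived features).
X, y = make_classification(n_samples=2000, n_features=20, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1
)

# Stages 2-3: build and train the bagging ensemble of neural network classifiers.
ensemble = BaggingClassifier(
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=1),
    n_estimators=10,   # number of base classifiers in the ensemble
    max_samples=0.7,   # each member is trained on a bootstrap subsample
    random_state=1,
)
ensemble.fit(X_train, y_train)

# Stage 4: test the combined prediction (mean predicted probability across members).
print("ensemble accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```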
Author: Vladimir Perervenko