Discussion of article "Deep Neural Networks (Part VI). Ensemble of neural network classifiers: bagging" - page 4

 
Vladimir Perervenko:

And how did you determine/calculate this figure? If it's no secret, of course.

Experimentally. Much depends on the particular time series, the timeframe, and the ZigZag parameters. For example, for EURUSD15 an accuracy of 0.84 was not enough.

 

I tried an ensemble of 10 darch DNNs, averaging the forecasts of the top 10. On data similar to yours, but from my broker.

No improvement: the averaged forecast (error = 33%) is only slightly worse than the best single network (error = 31%); the worst single network had an error of 34%.

The DNNs are well trained, for 100 epochs each.

Apparently, ensembles work well with a large number of undertrained or weak networks, such as ELM.
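For reference, here is a minimal sketch of this kind of forecast averaging. Simulated data and simple logistic models stand in for the darch DNNs; all names and numbers below are illustrative assumptions, not the code actually used in the experiment.

## Illustrative sketch: average the class-probability forecasts of several
## classifiers and compare the ensemble error with the best single model.
set.seed(1)
n  <- 1000
X  <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
y  <- as.integer(X$x1 + X$x2 + rnorm(n, sd = 1.5) > 0)   # binary target
tr <- 1:700; te <- 701:n                                  # train / test split
dat <- cbind(X, y = y)

## Train 10 models, each on its own bootstrap sample of the training rows
models <- lapply(1:10, function(i) {
  b <- sample(tr, length(tr), replace = TRUE)
  glm(y ~ x1 + x2, family = binomial, data = dat[b, ])
})

## Per-model test-set probabilities, then a simple average over the ensemble
probs   <- sapply(models, function(m) predict(m, newdata = X[te, ], type = "response"))
ens_err <- mean((rowMeans(probs) > 0.5) != y[te])                   # ensemble error
ind_err <- apply(probs, 2, function(p) mean((p > 0.5) != y[te]))    # each model's error
cat("ensemble error:", round(ens_err, 3),
    "| best single:", round(min(ind_err), 3),
    "| worst single:", round(max(ind_err), 3), "\n")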

 
elibrarius:

I tried an ensemble of 10 darch DNNs, averaging the forecasts of the top 10. On data similar to yours, but from my broker.

No improvement: the averaged forecast (error = 33%) is only slightly worse than the best single network (error = 31%); the worst single network had an error of 34%.

The DNNs are well trained, for 100 epochs each.

Apparently, ensembles work well with a large number of undertrained or weak networks, such as ELM.

Of course, it is better to use weak and unstable models in ensembles. You can also build ensembles from strong models, but the technique is somewhat different. If the article's length allows, I will show in the next article how to create an ensemble using TensorFlow. In general, the topic of ensembles is very large and interesting. For example, you can build a RandomForest with ELM neural networks, or any other weak models, as the nodes (see the gensemble package).
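As a rough illustration of a "different technique" for combining a few strong models, below is a hedged stacking sketch in R: a meta-model is trained on the base models' out-of-sample forecasts instead of simply averaging them. The toy data, glm base models, and all names are assumptions for illustration only; this is not the TensorFlow or gensemble approach mentioned above.

## Hedged sketch of stacking: a meta-model learns how to combine the base
## forecasts. Toy data and glm models stand in for "strong" base learners.
set.seed(2)
n  <- 1500
X  <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
y  <- as.integer(X$x1 - X$x2 + rnorm(n, sd = 1.5) > 0)
tr <- 1:800; ho <- 801:1150; te <- 1151:n        # base-train / meta-train / test

base <- lapply(1:5, function(i) {                # five base models on bootstrap samples
  b <- sample(tr, length(tr), replace = TRUE)
  glm(y ~ x1 + x2, family = binomial, data = cbind(X, y = y)[b, ])
})
pred_mat <- function(rows) {                     # base-model forecasts for given rows
  p <- as.data.frame(sapply(base, predict, newdata = X[rows, ], type = "response"))
  names(p) <- paste0("m", seq_along(base)); p
}

meta    <- glm(y[ho] ~ ., family = binomial, data = pred_mat(ho))   # the combiner
stack_p <- predict(meta, newdata = pred_mat(te), type = "response")
cat("stacked ensemble error:", round(mean((stack_p > 0.5) != y[te]), 3), "\n")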

Good luck

 
It has become a habit to check for your new articles on deep NNs. Bravo!
 


 

Discussion and questions about the code can be posted in this thread.

Good luck

 
Interesting article, thank you.