Discussion of article "Deep Neural Networks (Part VI). Ensemble of neural network classifiers: bagging" - page 4

And how did you determine/calculate this figure, if it's not a secret?
Experimentally. Much depends on the particular time series, the timeframe, and the ZigZag parameters. For example, for EURUSD M15, an accuracy of 0.84 was not enough.
I tried an ensemble of 10 darch DNNs, averaging the forecasts of all 10, on data similar to yours but from my broker.
No improvement: the averaged prediction (error = 33%) is slightly worse than the best single network (error = 31%); the worst network had an error of 34%.
The DNNs were trained thoroughly, for 100 epochs each.
Apparently, ensembles work well with a large number of undertrained or weak networks such as ELM.
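A minimal sketch of the averaging described above, not the poster's actual code: it assumes models is a list of already trained binary classifiers whose predict() method returns class probabilities in [0, 1], and X.test is the test-set matrix; all names here are hypothetical.

# Average the probability forecasts of an ensemble of trained networks
average_ensemble <- function(models, X.test, threshold = 0.5) {
  # collect one probability forecast per network, column by column
  probs <- sapply(models, function(m) predict(m, newdata = X.test))
  # average the forecasts across the ensemble
  avg <- rowMeans(probs)
  # turn the averaged probability into a 0/1 class label
  as.integer(avg > threshold)
}

# Classification error of the averaged forecast, comparable to the
# per-network errors quoted above:
# err <- mean(average_ensemble(models, X.test) != y.test)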
Of course, it is better to use weak and unstable models in ensembles. But you can also build ensembles from strong models; the technique is just slightly different. If the article size allows, I will show in the next article how to create an ensemble using TensorFlow. In general, the topic of ensembles is very broad and interesting. For example, you can build a random forest with ELM neural networks, or any other weak models, as nodes (see the gensemble package); a bagging sketch along these lines follows below.
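A hedged sketch of bagging with weak base models, in the spirit of the gensemble idea mentioned above but not its actual API. train_weak() is a hypothetical stand-in for any weak learner (e.g. an ELM); the returned models are assumed to produce 0/1 predictions via predict().

# Train B weak models, each on its own bootstrap sample of the training set
bag_train <- function(X, y, B = 50, train_weak) {
  lapply(seq_len(B), function(b) {
    idx <- sample(nrow(X), replace = TRUE)        # bootstrap resample
    train_weak(X[idx, , drop = FALSE], y[idx])
  })
}

# Combine the bagged models by majority vote over their 0/1 predictions
bag_predict <- function(models, X.new) {
  votes <- sapply(models, function(m) predict(m, newdata = X.new))
  as.integer(rowMeans(votes) > 0.5)
}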
Good luck.
Discussion and questions about the code can be posted in this thread.
Good luck