Discussion of article "Deep Neural Networks (Part VII). Ensemble of neural networks: stacking"

 

New article Deep Neural Networks (Part VII). Ensemble of neural networks: stacking has been published:

We continue to build ensembles. This time, the bagging ensemble created earlier will be supplemented with a trainable combiner, a deep neural network. One neural network combines the 7 best ensemble outputs after pruning. The second one takes all 500 outputs of the ensemble as input, prunes them and combines them. The neural networks will be built using the keras/TensorFlow package for Python, whose features will be briefly described. Testing will be performed, and the classification quality of the bagging and stacking ensembles will be compared.
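For readers who want a concrete picture of such a trainable combiner, here is a minimal sketch in keras/TensorFlow. It is not the author's exact model: the layer sizes, the stand-in data and the names X_train, y_train and n_inputs are illustrative assumptions; only the overall idea, a small DNN mapping the 7 pruned ensemble outputs to a class probability, comes from the article.

```python
# Minimal sketch of a trainable stacking combiner (assumptions noted above).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_inputs = 7  # outputs of the pruned bagging ensemble (illustrative)

model = keras.Sequential([
    keras.Input(shape=(n_inputs,)),
    layers.Dense(16, activation="relu"),
    layers.Dropout(0.2),
    layers.Dense(1, activation="sigmoid"),  # binary classification head
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Random stand-in data; replace with the real ensemble outputs and labels.
X_train = np.random.rand(1000, n_inputs)
y_train = np.random.randint(0, 2, size=(1000, 1))

history = model.fit(
    X_train, y_train,
    epochs=50, batch_size=32,
    validation_split=0.2,
    callbacks=[keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True)])
```

The same skeleton handles the 500-output variant by widening the input; dropout and early stopping then do the implicit pruning.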

Let us plot the history of training:


Fig. 11. The history of the DNN500 neural network training
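As a hedged illustration of how such a plot is produced with the Python keras API, the History object returned by model.fit can be drawn with matplotlib. The `history` variable below refers to the sketch above and is an assumption, not the article's history_stop_500 object.

```python
# Sketch: plot training/validation loss and accuracy from model.fit history.
import matplotlib.pyplot as plt

hist = history.history  # dict with "loss", "val_loss", "accuracy", ...

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(hist["loss"], label="train")
ax1.plot(hist["val_loss"], label="validation")
ax1.set_ylabel("loss")
ax1.legend()

ax2.plot(hist["accuracy"], label="train")
ax2.plot(hist["val_accuracy"], label="validation")
ax2.set_ylabel("accuracy")
ax2.set_xlabel("epoch")
ax2.legend()

plt.tight_layout()
plt.show()
```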

To improve the classification quality, numerous hyperparameters can be tuned: the neuron initialization method, regularization of the neuron activations and weights, and so on. The results obtained with almost intuitively selected parameters are promising, but they also reveal a disappointing ceiling: without optimization, Accuracy could not be raised above 0.82. Conclusion: the hyperparameters of the neural network need to be optimized. In the previous articles, we experimented with Bayesian optimization. It can be applied here as well, but that is a separate, difficult topic.
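As a brief illustration of where these hyperparameters live in keras, each Dense layer exposes the initialization method and the activity/weight regularizers directly. The particular choices below (he_normal, the L1/L2 factors) are arbitrary examples, not values from the article.

```python
# Sketch: the hyperparameters mentioned above, set on a single Dense layer.
from tensorflow.keras import layers, regularizers

dense = layers.Dense(
    16,
    activation="relu",
    kernel_initializer="he_normal",             # neuron initialization method
    kernel_regularizer=regularizers.l2(1e-4),   # weight regularization
    activity_regularizer=regularizers.l1(1e-5), # activation regularization
)
```

For the Bayesian optimization mentioned above, a tool such as KerasTuner's BayesianOptimization tuner can search these settings automatically, though that is beyond the scope of this article.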

Author: Vladimir Perervenko

 

Quick integration with Python for deep learning and MT5:

https://github.com/TheSnowGuru/PyTrader-python-mt5-trading-api-connector
