Self-trained MA cross! - page 9

 

Neural networks made easy (Part 12): Dropout

Since the beginning of this series of articles, we have made significant progress in studying various neural network models. However, the learning process has always run without our intervention. At the same time, it is natural to want to help the neural network improve its training results, which can also be referred to as improving its convergence. In this article we will consider one such method, called Dropout.

As the next step in studying neural networks, I suggest considering methods of increasing convergence during neural network training. There are several such methods. In this article we will consider one of them, called Dropout.
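To make the idea concrete, here is a minimal sketch of the standard (inverted) dropout technique in Python. This is not code from the article itself (the series is written in MQL5); the function name and parameters are illustrative assumptions.

```python
import random

def dropout_forward(inputs, keep_prob, training=True):
    """Inverted dropout sketch: during training, each activation is kept
    with probability keep_prob and scaled by 1/keep_prob so the expected
    output is unchanged; at inference, activations pass through untouched."""
    if not training:
        return list(inputs)  # inference: no neurons are dropped
    out = []
    for x in inputs:
        if random.random() < keep_prob:
            out.append(x / keep_prob)  # survivor, rescaled
        else:
            out.append(0.0)            # dropped neuron
    return out
```

Because the survivors are rescaled during training, no extra scaling is needed at inference time, which is the usual reason the inverted form is preferred.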
 

Neural networks made easy (Part 13): Batch Normalization

In the previous article, we began considering methods aimed at increasing the convergence of neural networks and got acquainted with the Dropout method, which is used to reduce the co-adaptation of features. Let us continue this topic and get acquainted with normalization methods.
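For reference, the core batch normalization computation can be sketched as follows. This is an illustrative Python sketch, not the article's MQL5 implementation; the function name and defaults are assumptions.

```python
import math

def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization sketch for one neuron's activations:
    normalize the batch to zero mean and unit variance, then apply
    a learnable scale (gamma) and shift (beta)."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n  # biased batch variance
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in batch]
```

In a real layer, gamma and beta are trained along with the weights, and running statistics collected during training replace the batch statistics at inference time.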
