Discussion of article "Experiments with neural networks (Part 5): Normalizing inputs for passing to a neural network"


New article Experiments with neural networks (Part 5): Normalizing inputs for passing to a neural network has been published:

Neural networks are considered the ultimate tool in a trader's toolkit. Let's check if this assumption is true. MetaTrader 5 is treated as a self-sufficient environment for using neural networks in trading. A simple explanation is provided.

Normalization methods vary depending on the type of data and the problem we are trying to solve. For example, the most common normalization methods for images are Z-normalization (standardization) and min-max normalization. For other types of data, however, such as audio signals or text, other normalization methods may be more effective.
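As a rough illustration (this snippet is not from the article, and the sample price values are made up), both methods can be sketched in Python with NumPy as follows; the small epsilon term is an added safeguard against division by zero:

import numpy as np

def z_normalize(x: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    # Z-normalization (standardization): zero mean, unit variance
    return (x - x.mean()) / (x.std() + eps)

def min_max_normalize(x: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    # Min-max normalization: rescales values into the [0, 1] range
    return (x - x.min()) / (x.max() - x.min() + eps)

# Hypothetical closing prices, used only to demonstrate the two transforms
prices = np.array([1.0731, 1.0748, 1.0725, 1.0760, 1.0742])
print(z_normalize(prices))
print(min_max_normalize(prices))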

For example, in the case of audio signals, maximum-amplitude normalization is often used, where all signal values are scaled to the range from -1 to 1. For text data, it can be useful to normalize by the number of words or characters in a sentence.
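For the audio case, a minimal sketch of maximum-amplitude normalization might look like this (the function name and the synthetic signal are assumptions for illustration only):

import numpy as np

def max_amplitude_normalize(signal: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    # Divide by the peak absolute amplitude so all values fall within [-1, 1]
    return signal / (np.max(np.abs(signal)) + eps)

signal = np.array([0.2, -0.5, 0.9, -1.3, 0.4])
print(max_amplitude_normalize(signal))  # the largest-magnitude sample maps to roughly -1.0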

In addition, in some cases it can be useful to normalize not only the input data, but also the target variables. For example, in regression problems where the target variable has a large range of values, normalizing it can improve training stability and prediction accuracy.
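A hedged sketch of that idea for a regression target: the target is min-max scaled before training, and predictions are mapped back to the original scale afterwards (the helper functions and sample values below are illustrative, not taken from the article):

import numpy as np

def scale_target(y: np.ndarray, eps: float = 1e-12):
    # Remember min and max so predictions can later be mapped back
    y_min, y_max = y.min(), y.max()
    y_scaled = (y - y_min) / (y_max - y_min + eps)
    return y_scaled, y_min, y_max

def unscale_target(y_scaled: np.ndarray, y_min: float, y_max: float, eps: float = 1e-12):
    # Inverse transform applied to the network's predictions
    return y_scaled * (y_max - y_min + eps) + y_min

y = np.array([120.0, 1500.0, 430.0, 9800.0, 75.0])    # target with a wide value range
y_scaled, y_min, y_max = scale_target(y)              # use y_scaled for training
y_restored = unscale_target(y_scaled, y_min, y_max)   # interpret predictions in original units
print(y_scaled)
print(y_restored)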

Normalization of inputs is an important step in preparing data for training neural networks. It brings the inputs into a defined range of values, which improves the stability and convergence speed of training. The appropriate method depends on the type of data and the problem being solved, and, as noted above, the target variables can sometimes benefit from normalization as well.

Author: Roman Poshtar