Discussion of the article "MQL5 Wizard Techniques You Should Know (Part 18): Neural Architecture Search with Eigen Vectors"

 

New article MQL5 Wizard Techniques You Should Know (Part 18): Neural Architecture Search with Eigen Vectors has been published:

Neural architecture search, an automated approach to determining the ideal neural network configuration, can be an advantage when facing many options and large test datasets. We examine how, when paired with eigenvectors, this process can be made even more efficient.

If we choose to use neural networks to define the relationship between a training dataset and its target, as is the case for this article, then we have to contend with the question of what settings the network should use. There are several types of networks, which implies that the applicable designs and settings are also many. For this article, we consider a very basic case, often referred to as a multi-layer perceptron (MLP). With this type, the only settings we'll dwell on are the number of hidden layers and the size of each hidden layer.
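As a rough illustration of what these two settings imply, the sketch below enumerates a hypothetical search space (the layer counts and sizes are illustrative values, not from the article) to show how many candidate architectures arise from just hidden-layer count and per-layer size:

```python
from itertools import product

# Hypothetical search space for the two settings discussed above.
hidden_layer_counts = [1, 2, 3]   # candidate numbers of hidden layers
hidden_layer_sizes = [4, 8, 16]   # candidate neurons per hidden layer

# Enumerate every candidate architecture as a tuple of layer sizes,
# allowing each layer to take any of the candidate sizes independently.
candidates = []
for n_layers in hidden_layer_counts:
    for sizes in product(hidden_layer_sizes, repeat=n_layers):
        candidates.append(sizes)

print(len(candidates))  # 3 + 9 + 27 = 39 candidate architectures
```

Even with three options per setting, the space already holds 39 architectures, which is why an automated search such as NAS becomes attractive.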

NAS can typically help identify these two settings and much more. For instance, even with simple MLPs, the choice of activation function, initial weights, and initial biases are all factors to which the network's performance and accuracy are sensitive. These are skimmed over here, though, because the search space is very extensive and the compute resources required for forward and backward propagation on even a moderate-sized dataset would be prohibitive.
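To make the combinatorial blow-up concrete, this sketch (all counts are hypothetical, chosen only for illustration) multiplies the architecture-only space by a few extra design axes, activation type and weight/bias initialization schemes, to show why the full search is skipped:

```python
# Illustrative count of how the NAS search space grows once activation
# type, weight init and bias init are included (all values hypothetical).
layer_counts = 3       # up to 3 hidden layers
sizes_per_layer = 3    # 3 candidate sizes per layer
activations = 4        # e.g. tanh, sigmoid, relu, linear
weight_inits = 5       # candidate weight-initialization schemes
bias_inits = 5         # candidate bias-initialization schemes

# Architectures from layer count and sizes alone: 3^1 + 3^2 + 3^3.
arch_only = sum(sizes_per_layer ** n for n in range(1, layer_counts + 1))
# Each architecture combined with every activation/init choice.
full_space = arch_only * activations * weight_inits * bias_inits

print(arch_only, full_space)  # 39 vs. 3900 configurations
```

Since each configuration would require full forward and backward passes over the training data to score, restricting the search to layer count and layer size keeps the problem tractable.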

Autor: Stephen Njuki