Discussing the article: "MQL5 Wizard Techniques you should know (Part 18): Neural Architecture Search with Eigen Vectors"


Check out the new article: MQL5 Wizard Techniques you should know (Part 18): Neural Architecture Search with Eigen Vectors.

Neural Architecture Search, an automated approach to determining the ideal neural network settings, can be a plus when facing many options and large test data sets. We examine how, when paired with eigenvectors, this process can be made even more efficient.

If we choose to use neural networks to define the relationship between a training dataset and its target, as is the case for this article, then we have to contend with the question of what settings the network should use. There are several types of networks, which implies that the applicable designs and settings are also many. For this article, we consider a very basic case that is often referred to as a multi-layer perceptron (MLP). With this type, the only settings we will dwell on are the number of hidden layers and the size of each hidden layer.
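To make that search space concrete, below is a minimal sketch of how such candidates could be enumerated in MQL5. It is an illustration rather than the article's listing: the SCandidate structure, the range limits, and the simplifying assumption of a uniform size across all hidden layers are mine.

#define MAX_HIDDEN_LAYERS 3
#define MAX_LAYER_SIZE    8

// Illustrative only: one candidate MLP architecture in the search space.
struct SCandidate
{
   int layers;   // number of hidden layers
   int size;     // neurons per hidden layer (kept uniform here for simplicity)
};

// Fill a dynamic array with every (layers, size) combination in the assumed ranges.
void EnumerateCandidates(SCandidate &candidates[])
{
   ArrayResize(candidates, MAX_HIDDEN_LAYERS * MAX_LAYER_SIZE);
   int i = 0;
   for(int l = 1; l <= MAX_HIDDEN_LAYERS; l++)
      for(int s = 1; s <= MAX_LAYER_SIZE; s++)
      {
         candidates[i].layers = l;
         candidates[i].size   = s;
         i++;
      }
}

Even with ranges this modest, the search space grows multiplicatively with each new setting considered, which is exactly why an automated search is attractive.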

NAS can typically help identify these two settings and much more. For instance, even with simple MLPs, the choice of activation type, the initial weights, and the initial biases are all factors to which the performance and accuracy of the network are sensitive. These, though, are skimmed over here because the search space is very extensive, and the compute resources required for forward and back propagation on even a moderately sized data set would be prohibitive.
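As a rough illustration of the eigenvector angle, if the test results of the candidates are gathered into a square matrix (a covariance of candidate scores, say), its dominant eigenvector can be extracted with a few lines of power iteration and read as a weighting of which settings carry the most variance. This sketch is an assumption about how such a pairing could look, not the article's own construction; the full text develops its particular matrix and interpretation.

// Illustrative sketch: power iteration for the dominant eigenvector of a
// square score matrix m. How that matrix is built is an assumption here.
void DominantEigenVector(matrix &m, vector &v, const int iterations = 50)
{
   v = vector::Ones(m.Rows());          // arbitrary non-zero starting guess
   for(int i = 0; i < iterations; i++)
   {
      v = m.MatMul(v);                  // project through the score matrix
      double norm = MathSqrt(v.Dot(v)); // L2 norm of the current estimate
      if(norm > 0.0)
         v /= norm;                     // re-normalise each step
   }
}

Power iteration is used only to keep the sketch self-contained; the built-in matrix type also exposes eigen routines (Eig, EigVals) that return the full spectrum if more than the dominant direction is needed.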


Author: Stephen Njuki
