Discussing the article: "A New Approach to Custom Criteria in Optimizations (Part 1): Examples of Activation Functions"


Check out the new article: A New Approach to Custom Criteria in Optimizations (Part 1): Examples of Activation Functions.

The first in a series of articles examining the mathematics of Custom Criteria, with a specific focus on non-linear functions used in neural networks, MQL5 code for their implementation, and the use of targeted and correctional offsets.

The ability to define a Custom Criterion, and even to use the built-in Complex Criterion with its opaque methodology, reduces the need to export results to Excel, Python, R, or proprietary software for analysis in order to find the best permutation of parameters.

The problem is that it is still not uncommon to see return(0) used in published Custom Criteria. This is fraught with real or potential dangers, including discarding results that are only marginally unwanted or, worse, diverting the genetic optimization process away from potentially productive paths.

In an attempt to return to first principles, after conducting some empirical experiments, I set out to find suitable curve equations. To do this, I looked at activation functions used in neural networks and adopted and modified several for use here. Having set these out, I also suggest some methods for putting them to practical use.
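For readers unfamiliar with the curves involved, the standard activation shapes that this kind of adaptation starts from can be sketched compactly. This is a generic C++ reminder of the textbook definitions, not the modified forms developed in the article:

```cpp
#include <algorithm>
#include <cmath>

// Logistic sigmoid: maps any real input smoothly into (0, 1).
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Hyperbolic tangent: like the sigmoid but centred on 0, range (-1, 1).
double tanh_act(double x) { return std::tanh(x); }

// Rectified linear unit (ReLU): zero for negative input, identity otherwise.
double relu(double x) { return std::max(0.0, x); }
```

Scaling, shifting, and weighting curves like these to score optimization passes is the subject the series develops.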

The plan for this article series is as follows:

  1. Introduction and standard Activation Functions with MQL5 code
  2. Modifications, scaling and weighting and real examples
  3. A tool to explore different curves, scaling and weighting
  4. Any other points that arise...


    Author: Andrew Thompson