Discussing the article: "Neural Networks in Trading: Contrastive Pattern Transformer (Final Part)"

 

Check out the new article: Neural Networks in Trading: Contrastive Pattern Transformer (Final Part).

In the previous article in this series, we looked at the Atom-Motif Contrastive Transformer (AMCT) framework, which uses contrastive learning to discover key patterns at all levels, from basic elements to complex structures. In this article, we continue implementing the AMCT approaches in MQL5.

As usual, model training is performed offline on a pre-collected training dataset covering the entire year 2023. Training proceeds iteratively: after several iterations, the training dataset is refreshed so that it reflects the Agent's current policy, which provides the most accurate evaluation of the Agent's actions.
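The iterative loop described above can be sketched as follows. This is a minimal, hypothetical illustration, not the article's MQL5 code: the names (`collect_dataset`, `train`, `TARGET`) and the toy one-parameter policy are assumptions introduced purely to show the structure of "train for several iterations, then refresh the dataset under the current policy".

```python
TARGET = 1.0  # hypothetical optimal action the policy should learn

def collect_dataset(theta, n=10):
    # Stand-in for running the Agent in the strategy tester with the
    # current policy parameter theta and recording the experience.
    # Here we simply return the target action n times.
    return [TARGET for _ in range(n)]

def train(theta=0.0, n_updates=3, iters=5, lr=0.5):
    for _ in range(n_updates):
        # Refresh the training dataset so it matches the current policy.
        data = collect_dataset(theta)
        for _ in range(iters):
            # Several training iterations on the fixed dataset:
            # a simple mean-error gradient step toward the recorded actions.
            grad = sum(a - theta for a in data) / len(data)
            theta += lr * grad
    return theta
```

With these toy numbers the parameter converges toward the target action after a few dataset refreshes; in the real framework the "dataset" is trading experience and the update is a full network training pass.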

During training, we obtained a model capable of generating profit on both the training and test datasets. But there is one caveat: the resulting model executes very few trades, so we extended the testing period to three months. The test results are presented below.

As can be seen from the results, over the three-month test interval the model executed only 21 trades, with just over half closed profitably. Examining the balance graph, we observe initial growth over the first month and a half, followed by sideways movement. This is expected behavior: the model only collects statistics from market states present in the training dataset, and, like any statistical model, it requires a representative training set. From the balance graph, we can conclude that a one-year training dataset remains representative for approximately 1.2 to 1.5 months forward.


Author: Dmitriy Gizlyk