Discussing the article: "Neural Networks in Trading: Models Using Wavelet Transform and Multi-Task Attention (Final Part)"

 

Check out the new article: Neural Networks in Trading: Models Using Wavelet Transform and Multi-Task Attention (Final Part).

In the previous article, we explored the theoretical foundations and began implementing the approaches of the Multitask-Stockformer framework, which combines the wavelet transform with a multi-task Self-Attention model. Here we complete the implementation of the framework's algorithms and evaluate their effectiveness on real historical data.
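To picture the general idea (this is a rough PyTorch sketch, not the article's MQL5 implementation; all layer sizes, names, and hyperparameters are illustrative assumptions), the input series can be split into low- and high-frequency components with a one-level Haar wavelet, each component encoded by its own Self-Attention branch, and two task heads served from the fused representation:

```python
# Minimal sketch of a wavelet + multi-task attention model (illustrative only).
import torch
import torch.nn as nn


def haar_dwt(x: torch.Tensor):
    """One-level Haar decomposition along the time axis of (batch, time, features); time must be even."""
    even, odd = x[:, 0::2], x[:, 1::2]
    low = (even + odd) / 2 ** 0.5   # approximation (trend) coefficients
    high = (even - odd) / 2 ** 0.5  # detail (fluctuation) coefficients
    return low, high


class MultiTaskWaveletAttention(nn.Module):
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = lambda: nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.low_branch = nn.TransformerEncoder(layer(), num_layers=2)
        self.high_branch = nn.TransformerEncoder(layer(), num_layers=2)
        self.return_head = nn.Linear(2 * d_model, 1)  # regression head: expected return
        self.trend_head = nn.Linear(2 * d_model, 2)   # classification head: up / down

    def forward(self, x: torch.Tensor):
        low, high = haar_dwt(x)
        low = self.low_branch(self.proj(low)).mean(dim=1)
        high = self.high_branch(self.proj(high)).mean(dim=1)
        z = torch.cat([low, high], dim=-1)
        return self.return_head(z), self.trend_head(z)


if __name__ == "__main__":
    batch = torch.randn(8, 48, 9)            # 8 samples, 48 H1 bars, 9 features (arbitrary)
    ret, trend_logits = MultiTaskWaveletAttention(n_features=9)(batch)
    print(ret.shape, trend_logits.shape)      # torch.Size([8, 1]) torch.Size([8, 2])
```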

For testing, the models were trained on EURUSD historical data for the entire year 2023 on the H1 timeframe. All analyzed indicators were used with their default parameter settings.
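The article gathers the training data inside MetaTrader 5 with its own MQL5 Expert Advisor. Purely as an illustration of the same data range, the EURUSD H1 bars for 2023 could also be pulled through the official MetaTrader5 Python package:

```python
# Hedged illustration only: load EURUSD H1 bars for 2023 via the MetaTrader5 Python API.
from datetime import datetime

import MetaTrader5 as mt5

if not mt5.initialize():
    raise RuntimeError(f"MetaTrader 5 initialization failed: {mt5.last_error()}")

rates = mt5.copy_rates_range(
    "EURUSD",
    mt5.TIMEFRAME_H1,
    datetime(2023, 1, 1),
    datetime(2023, 12, 31, 23, 59),
)
mt5.shutdown()

if rates is None:
    raise RuntimeError("No data returned for the requested range")

print(f"{len(rates)} H1 bars loaded for 2023")  # open/high/low/close/tick_volume per bar
```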

For the first stage of training, we used the training dataset collected in previous research. The training dataset was then periodically refreshed to adapt it to the current Actor policy. After several cycles of training and dataset updates, we obtained a policy that is profitable on both the training and the test data.
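Schematically, this train/refresh cycle can be pictured as in the sketch below; the helper functions are hypothetical stand-ins for the Strategy Tester passes and the training script, not the author's code:

```python
# Schematic of the iterative "train, then re-collect with the current Actor" loop (stubs only).
from typing import Callable, List

def collect_passes(policy: Callable, n_passes: int) -> List[dict]:
    """Stub: run the environment / Strategy Tester with `policy` and return trajectories."""
    return [{"pass": i, "policy": policy} for i in range(n_passes)]

def train(policy: Callable, dataset: List[dict], epochs: int) -> Callable:
    """Stub: update the Actor (and auxiliary models) on the replay dataset."""
    _ = (dataset, epochs)
    return policy

def initial_policy(state):
    return 0.0                                   # placeholder data-collection policy

actor = initial_policy
dataset = collect_passes(actor, n_passes=500)    # buffer inherited from earlier research

for cycle in range(5):                           # several train / refresh cycles
    actor = train(actor, dataset, epochs=10_000)
    dataset = collect_passes(actor, n_passes=500)  # refresh the buffer under the current Actor
```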

Testing of the trained policy was conducted on historical data for January 2024, with all other parameters unchanged. The results are presented below.


Author: Dmitriy Gizlyk