Discussing the article: "Neural Networks in Trading: Optimizing the Transformer for Time Series Forecasting (LSEAttention)"


Check out the new article: Neural Networks in Trading: Optimizing the Transformer for Time Series Forecasting (LSEAttention).

The LSEAttention framework introduces improvements to the Transformer architecture designed specifically for long-term multivariate time series forecasting. The approaches proposed by the method's authors address entropy collapse and training instability, problems often encountered with the vanilla Transformer.

In domains such as computer vision and natural language processing, attention matrices can suffer from entropy collapse or rank collapse. These problems are further exacerbated in time series forecasting by the frequent fluctuations inherent in time-based data, often resulting in substantial degradation of model performance. The underlying causes of entropy collapse remain poorly understood, highlighting the need for further investigation into its mechanisms and its effects on model generalization. These challenges are the focus of the paper "LSEAttention is All You Need for Time Series Forecasting".
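The stabilization idea behind LSEAttention can be illustrated with the log-sum-exp trick: subtracting the row maximum from the attention scores before exponentiating keeps the softmax numerically well-behaved even for extreme score magnitudes. The sketch below is a minimal NumPy illustration of that idea, not the article's MQL5 implementation; the function names and shapes are my own assumptions.

```python
import numpy as np

def stable_softmax(scores: np.ndarray) -> np.ndarray:
    """Softmax via the log-sum-exp trick: subtracting the row
    maximum before exponentiating keeps exp() in a safe range
    and avoids overflow for large attention scores."""
    m = scores.max(axis=-1, keepdims=True)
    z = np.exp(scores - m)
    return z / z.sum(axis=-1, keepdims=True)

def lse_attention(Q: np.ndarray, K: np.ndarray, V: np.ndarray) -> np.ndarray:
    """Scaled dot-product attention with an LSE-stabilized softmax.
    Q, K, V have shape (seq_len, d_model). Illustrative only."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    return stable_softmax(scores) @ V

# Tiny usage example with random data (seq_len=8, d_model=16)
rng = np.random.default_rng(0)
Q = rng.normal(size=(8, 16))
K = rng.normal(size=(8, 16))
V = rng.normal(size=(8, 16))
out = lse_attention(Q, K, V)  # finite output even for large scores
```

A naive `np.exp(scores)` would overflow once scores exceed roughly 709 in float64; the max-subtraction makes the largest exponent exactly 0, so the attention weights stay finite regardless of score scale.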

To ensure a fair comparison, we fully replicated the HypDiff model's training algorithm and used the same training dataset. This time, however, we did not perform iterative updates to the training set. While this may slightly degrade training performance, it allows an accurate comparison of the model before and after the algorithm optimization.

The models were evaluated using real historical data from Q1 of 2024. The test results are presented below.

It should be noted that model performance before and after the modification was quite similar. During the test period, the updated model executed 24 trades, deviating from the baseline model by only one trade, which falls within the margin of error. Both models made 13 profitable trades. The only visible improvement was the absence of a drawdown in February.

Author: Dmitriy Gizlyk