Discussing the article: "Neural Networks in Trading: Models Using Wavelet Transform and Multi-Task Attention"

 

Check out the new article: Neural Networks in Trading: Models Using Wavelet Transform and Multi-Task Attention.

We invite you to explore a framework that combines wavelet transforms with a multi-task self-attention model, aimed at improving the responsiveness and accuracy of forecasting in volatile market conditions. The wavelet transform decomposes asset returns into high- and low-frequency components, capturing short-term fluctuations and long-term market trends respectively.
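As a minimal sketch of the decomposition idea, the example below applies a one-level Haar wavelet transform to a toy series of returns, splitting it into a low-frequency (trend) band and a high-frequency (fluctuation) band. This is only an illustration of the general technique: the paper's implementation may use a different wavelet basis and more decomposition levels, and the return values here are hypothetical.

```python
import math

def haar_dwt(series):
    """One-level Haar DWT: split a series into low-frequency (approximation)
    and high-frequency (detail) coefficients. Assumes even length."""
    s = math.sqrt(2.0)
    approx = [(series[i] + series[i + 1]) / s for i in range(0, len(series) - 1, 2)]
    detail = [(series[i] - series[i + 1]) / s for i in range(0, len(series) - 1, 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: recombine the two bands into the original series."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)  # even-index sample
        out.append((a - d) / s)  # odd-index sample
    return out

# Toy daily returns (hypothetical values, for illustration only)
returns = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, 0.0, -0.005]
low, high = haar_dwt(returns)   # low: trend band, high: fluctuation band
rec = haar_idwt(low, high)      # reconstructs the input up to float error
```

Each band can then be fed to its own branch of the model, so trend and fluctuation dynamics are learned separately before being fused by the attention mechanism.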

In recent years, deep learning has become an indispensable tool in quantitative investing, particularly in refining the multifactor strategies that form the foundation for understanding financial asset price movements. By automating feature learning and capturing nonlinear relationships in financial market data, deep learning algorithms effectively uncover complex patterns, thereby improving prediction accuracy. The global research community recognizes the potential of deep neural networks, such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), for forecasting stock and futures prices. However, while RNNs and CNNs are widely used, deeper architectures that extract and combine sequential market and signal information remain underexplored. This leaves room for further advances in applying deep learning to stock markets.


Today, we introduce the Multitask-Stockformer framework, presented in the paper "Stockformer: A Price-Volume Factor Stock Selection Model Based on Wavelet Transform and Multi-Task Self-Attention Networks". Despite the similar name, it is unrelated to the previously discussed StockFormer framework, apart from their shared objective: constructing a profitable stock portfolio for trading in financial markets.


Author: Dmitriy Gizlyk

 
Is there anything planned about Gramian Angular Difference Field?