Discussing the article: "Neural Networks Made Easy (Part 96): Multi-Scale Feature Extraction (MSFformer)"

Check out the new article: Neural Networks Made Easy (Part 96): Multi-Scale Feature Extraction (MSFformer).
Efficiently extracting and integrating long-term dependencies and short-term features remains an important task in time series analysis. Understanding and combining them properly is necessary for building accurate and reliable predictive models.
The authors of the MSFformer model build their method around an innovative pyramidal attention mechanism that operates over different time intervals. In addition, to construct multi-level temporal information from the input data, they apply feature convolution inside the Coarser-Scale Construction Module (CSCM), which extracts temporal information at coarser resolutions.
The CSCM module builds a feature tree over the analyzed time series. The inputs are first passed through a fully connected layer that maps the feature dimensionality to a fixed size; the result then flows through several sequential, specially designed FCNN feature-convolution blocks.
In the FCNN block, feature vectors are first formed by sampling the input sequence at a given cross-step. These vectors are then concatenated, and the combined result is passed through convolution operations. The authors' visualization of the FCNN block is presented below.
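For concreteness, here is a minimal sketch of one such block. The article series implements this in MQL5; the PyTorch version below is purely illustrative, and the class name, parameter names, and the use of a single strided convolution to perform the extract-combine-convolve steps are assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class FCNNBlock(nn.Module):
    # One feature-convolution (FCNN) block, sketched from the description:
    # vectors sampled at a given cross-step are combined and convolved.
    # A Conv1d with kernel_size == stride == cross_step performs all three
    # steps (extract, combine, convolve) in a single operation.
    def __init__(self, dim: int, cross_step: int = 2):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size=cross_step, stride=cross_step)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim); Conv1d expects (batch, dim, seq_len)
        x = x.transpose(1, 2)
        x = self.act(self.conv(x))  # output length: seq_len // cross_step
        return x.transpose(1, 2)
```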
The CSCM module proposed by the authors chains several consecutive FCNN blocks. Each block takes the output of the previous one as input and extracts features at a progressively coarser scale; a sketch of the full module follows below.
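Continuing the sketch above, this is how the stacking could look. The feature width, number of levels, and cross-step values are illustrative assumptions; only the overall structure (a fully connected projection followed by sequential FCNN blocks) comes from the description.

```python
class CSCM(nn.Module):
    # Coarser-Scale Construction Module, sketched from the description:
    # a fully connected layer fixes the feature size, then stacked FCNN
    # blocks build progressively coarser temporal scales.
    def __init__(self, in_dim: int, dim: int = 64, levels: int = 3, cross_step: int = 2):
        super().__init__()
        self.proj = nn.Linear(in_dim, dim)
        self.blocks = nn.ModuleList(FCNNBlock(dim, cross_step) for _ in range(levels))

    def forward(self, x: torch.Tensor) -> list:
        # x: (batch, seq_len, in_dim); returns one tensor per scale,
        # from the finest (projected inputs) to the coarsest.
        scales = [self.proj(x)]
        for block in self.blocks:
            scales.append(block(scales[-1]))
        return scales

# Usage: 32 time steps are reduced to 16, 8, and 4 at the coarser scales.
x = torch.randn(8, 32, 16)                # batch=8, seq_len=32, features=16
pyramid = CSCM(in_dim=16)(x)
print([tuple(t.shape) for t in pyramid])  # [(8, 32, 64), (8, 16, 64), (8, 8, 64), (8, 4, 64)]
```

The resulting list of tensors is the multi-scale feature tree that the pyramidal attention mechanism then operates on.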
Author: Dmitriy Gizlyk