Dmitriy Gizlyk
Rating: 4.4 (50)
  • About
11+ years experience
0 products
0 demo versions
134 jobs
0 signals
0 subscribers
Professional programming of any complexity for MT4, MT5, C#.
Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 95): Reducing Memory Consumption in Transformer Models

Transformer architecture-based models demonstrate high efficiency, but their use is complicated by high resource costs both during training and in operation. In this article, I propose to get acquainted with algorithms that help reduce the memory usage of such models.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 94): Optimizing the Input Sequence

When working with time series, we always use the source data in their historical sequence. But is this the best option? There is an opinion that changing the sequence of the input data will improve the efficiency of the trained models. In this article I invite you to get acquainted with one of the methods for optimizing the input sequence.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 93): Adaptive Forecasting in Frequency and Time Domains (Final Part)

In this article, we continue implementing the approaches of the ATFNet model, which adaptively combines the results of two blocks (frequency and time) for time series forecasting.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 92): Adaptive Forecasting in Frequency and Time Domains

The authors of the FreDF method experimentally confirmed the advantage of combined forecasting in the frequency and time domains. However, using a fixed weight hyperparameter is not optimal for non-stationary time series. In this article, we will get acquainted with a method for adaptively combining forecasts in the frequency and time domains.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 91): Frequency Domain Forecasting (FreDF)

We continue to explore the analysis and forecasting of time series in the frequency domain. In this article, we will get acquainted with a new method to forecast data in the frequency domain, which can be added to many of the algorithms we have studied previously.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 90): Frequency Interpolation of Time Series (FITS)

By studying the FEDformer method, we opened the door to the frequency-domain representation of time series. In this new article, we will continue the topic we started and consider a method that allows us not only to analyze a time series, but also to predict its subsequent states in a given domain.

Dmitriy Gizlyk
Published the article Neural networks made easy (Part 89): Frequency Enhanced Decomposition Transformer (FEDformer)

All the models we have considered so far analyze the state of the environment as a time sequence. However, the time series can also be represented in the form of frequency features. In this article, I introduce you to an algorithm that uses frequency components of a time sequence to predict future states.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 88): Time-Series Dense Encoder (TiDE)

In an attempt to obtain the most accurate forecasts, researchers often complicate forecasting models, which in turn increases model training and maintenance costs. Is such an increase always justified? This article introduces an algorithm that exploits the simplicity and speed of linear models while demonstrating results on par with the best models of more complex architecture.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 87): Time Series Patching

Forecasting plays an important role in time series analysis. In the new article, we will talk about the benefits of time series patching.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 86): U-Shaped Transformer

We continue to study time series forecasting algorithms. In this article, we will discuss another method: the U-shaped Transformer.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 85): Multivariate Time Series Forecasting

In this article, I would like to introduce you to a new complex time series forecasting method, which harmoniously combines the advantages of linear models and transformers.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 84): Reversible Normalization (RevIN)

We already know that pre-processing of the input data plays a major role in the stability of model training. To process "raw" input data online, we often use a batch normalization layer. But sometimes we need a reverse procedure. In this article, we discuss one of the possible approaches to solving this problem.
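The reverse procedure the abstract refers to is the core of RevIN: normalize each input series with its own statistics, then reapply those same statistics to the model's output so forecasts return to the original scale. The series implements everything in MQL5; the snippet below is only a rough NumPy sketch of the idea, with the full method's learnable affine parameters omitted:

```python
import numpy as np

def revin_normalize(x, eps=1e-5):
    # Instance-normalize each series with its own mean and std,
    # keeping the statistics so they can be restored later.
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True) + eps
    return (x - mean) / std, (mean, std)

def revin_denormalize(y, stats):
    # Reverse procedure: reapply the saved scale and offset to the output.
    mean, std = stats
    return y * std + mean

# Round trip: denormalizing the normalized series recovers the input.
series = np.array([[1.0, 2.0, 3.0, 4.0]])
normed, stats = revin_normalize(series)
restored = revin_denormalize(normed, stats)
```

In the full method the model runs between the two calls: its input is `normed`, and `revin_denormalize` is applied to its forecast.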

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 83): The "Conformer" Spatio-Temporal Continuous Attention Transformer Algorithm

This article introduces the Conformer algorithm, originally developed for weather forecasting, a field whose variability and capriciousness can be compared to financial markets. Conformer is a complex method that combines the advantages of attention models and ordinary differential equations.

Look Mode 2024.03.30
Hello, how can I try (test) these Conformer files?
Dmitriy Gizlyk
Published the article Neural networks made easy (Part 82): Ordinary Differential Equation models (NeuralODE)

In this article, we will discuss another type of models that are aimed at studying the dynamics of the environmental state.

Dmitriy Gizlyk
Published the article Neural Networks Made Easy (Part 81): Context-Guided Motion Analysis (CCMR)

In previous works, we always assessed the current state of the environment, while the dynamics of changes in indicators remained "behind the scenes". In this article, I want to introduce you to an algorithm that can evaluate the direct change in data between two successive environment states.

Dmitriy Gizlyk
Published the article Neural networks made easy (Part 80): Graph Transformer Generative Adversarial Model (GTGAN)

In this article, we will get acquainted with the GTGAN algorithm, introduced in January 2024 to solve complex problems of generating architectural layouts with graph constraints.

Dmitriy Gizlyk
Published the article Neural networks made easy (Part 79): Feature Aggregated Queries (FAQ) in the context of state

In the previous article, we got acquainted with one of the methods for detecting objects in an image. However, processing a static image is somewhat different from working with dynamic time series, such as the dynamics of the prices we analyze. In this article, we will consider the method of detecting objects in video, which is somewhat closer to the problem we are solving.

Dmitriy Gizlyk
Published the article Neural networks made easy (Part 78): Decoder-free Object Detector with Transformer (DFFT)

In this article, I propose to look at the issue of building a trading strategy from a different angle. We will not predict future price movements, but will try to build a trading system based on the analysis of historical data.

Dmitriy Gizlyk
Published the article Neural networks made easy (Part 77): Cross-Covariance Transformer (XCiT)

In our models, we often use various attention algorithms, and most often we probably use Transformers. Their main disadvantage is their resource requirements. In this article, we will consider a new algorithm that can help reduce computing costs without losing quality.
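The cost saving in XCiT comes from cross-covariance attention: the attention map is built over the d feature channels rather than the n tokens, so it is d×d instead of n×n and cost grows linearly with sequence length. The series implements its attention blocks in MQL5; below is only a rough single-head NumPy sketch of that idea, with illustrative names and temperature parameter:

```python
import numpy as np

def cross_covariance_attention(x, wq, wk, wv, tau=1.0):
    # x: (n tokens, d channels). Attention is taken across channels,
    # so the map is (d, d) instead of the usual (n, n).
    q, k, v = x @ wq, x @ wk, x @ wv
    # L2-normalize Q and K along the token axis, as in XCiT.
    q = q / (np.linalg.norm(q, axis=0, keepdims=True) + 1e-8)
    k = k / (np.linalg.norm(k, axis=0, keepdims=True) + 1e-8)
    attn = q.T @ k / tau                               # (d, d) cross-covariance
    attn = np.exp(attn - attn.max(axis=-1, keepdims=True))
    attn = attn / attn.sum(axis=-1, keepdims=True)     # row-wise softmax
    return v @ attn.T                                  # back to (n, d)

# Demo: 16 tokens with 8 channels; output keeps the (tokens, channels) shape.
rng = np.random.default_rng(0)
x = rng.normal(size=(16, 8))
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = cross_covariance_attention(x, wq, wk, wv)
```

Doubling the number of tokens here only doubles the work, whereas token-wise attention would quadruple it.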

Dmitriy Gizlyk
Published the article Neural networks made easy (Part 76): Exploring diverse interaction patterns with Multi-future Transformer

This article continues the topic of predicting upcoming price movement. I invite you to get acquainted with the Multi-future Transformer architecture. Its main idea is to decompose the multimodal distribution of the future into several unimodal distributions, which makes it possible to effectively model various patterns of interaction between agents in a scene.
