Dmitriy Gizlyk
Rating: 4.4 (50)
  • Information
11+ years experience
0 products
0 demo versions
134 jobs
0 signals
0 subscribers
Professional programming of any complexity for MT4, MT5, C#.
Published article Neural Networks in Trading: Practical Results of the TEMPO Method

We continue exploring the TEMPO method. In this article, we evaluate the actual effectiveness of the proposed approaches on real historical data.

Published article Neural Networks in Trading: Using Language Models for Time Series Forecasting

We continue to study time series forecasting models. In this article, we get acquainted with a complex algorithm built around a pre-trained language model.

Published article Neural Networks in Trading: Lightweight Models for Time Series Forecasting

Lightweight time series forecasting models achieve high performance with a minimal number of parameters. This, in turn, reduces the consumption of computing resources and speeds up decision-making. Despite being lightweight, such models achieve forecast quality comparable to that of more complex ones.

Published article Neural Networks in Trading: Reducing Memory Consumption with Adam-mini Optimization

One way to improve the efficiency and convergence of model training is to refine the optimization method. Adam-mini is an adaptive optimization method designed to improve on the basic Adam algorithm while consuming less memory.
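To make the memory-saving idea concrete, here is a minimal NumPy sketch in the spirit of Adam-mini, not the article's actual implementation: the first moment is kept element-wise as in Adam, but the second moment is reduced to a single scalar per parameter block. The block partitioning and all hyperparameter values below are illustrative assumptions.

```python
import numpy as np

def adam_mini_step(params, grads, m, v_block, t, lr=1e-3,
                   beta1=0.9, beta2=0.999, eps=1e-8):
    """One update step in the spirit of Adam-mini (illustrative sketch).

    params, grads, m : lists of per-block parameter / gradient / first-moment arrays
    v_block          : list of scalar second-moment estimates, one per block
    t                : 1-based step counter used for bias correction
    """
    for i, (p, g) in enumerate(zip(params, grads)):
        # First moment is stored element-wise, exactly as in Adam.
        m[i] = beta1 * m[i] + (1.0 - beta1) * g
        # Second moment is a single scalar per block: the mean squared gradient.
        v_block[i] = beta2 * v_block[i] + (1.0 - beta2) * float(np.mean(g * g))
        # Bias correction, as in the original Adam formulation.
        m_hat = m[i] / (1.0 - beta1 ** t)
        v_hat = v_block[i] / (1.0 - beta2 ** t)
        # Update with the shared per-block learning-rate scaling.
        p -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v_block
```

Storing one scalar instead of a full array per block is where the memory saving comes from; in practice the blocks would follow the model's layer structure.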

Published article Neural Networks in Trading: Spatio-Temporal Neural Network (STNN)

In this article, we will talk about using spatio-temporal transformations to effectively predict upcoming price movements. To improve numerical prediction accuracy, STNN employs a continuous attention mechanism that allows the model to better account for important aspects of the data.

Published article Neural Networks in Trading: Dual-Attention-Based Trend Prediction Model

We continue the discussion of piecewise linear representation of time series that we started in the previous article. Today we will see how to combine this method with other approaches to time series analysis in order to improve the quality of price trend predictions.

Published article Neural Networks in Trading: Piecewise Linear Representation of Time Series

This article differs somewhat from my earlier publications. Here we will talk about an alternative representation of time series: piecewise linear representation, a method of approximating a time series with linear functions over small intervals.
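As a simple illustration of the idea (not the article's code), here is a NumPy sketch that fits a least-squares line to each fixed-length window of the series; the window length and the use of equal-sized segments are my own simplifying assumptions, since practical methods usually choose breakpoints adaptively.

```python
import numpy as np

def piecewise_linear(series, segment_len=16):
    """Approximate a 1-D series with one straight line per fixed-length segment."""
    series = np.asarray(series, dtype=float)
    approx = np.empty_like(series)
    segments = []                                 # (start, slope, intercept) per segment
    for start in range(0, len(series), segment_len):
        y = series[start:start + segment_len]
        x = np.arange(len(y))
        slope, intercept = np.polyfit(x, y, 1)    # least-squares linear fit
        approx[start:start + segment_len] = slope * x + intercept
        segments.append((start, slope, intercept))
    return approx, segments

# Example: 256 noisy points are compressed into 16 (slope, intercept) pairs.
t = np.arange(256)
prices = np.sin(t / 20.0) + 0.05 * np.random.randn(256)
approx, segments = piecewise_linear(prices, segment_len=16)
```

The compressed representation (a handful of slopes and intercepts) is what downstream models can analyze instead of the raw series.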

Published article Neural Networks Made Easy (Part 97): Training Models With MSFformer

When exploring various model architecture designs, we often devote insufficient attention to the process of model training. In this article, I aim to address this gap.

Published article Neural Networks Made Easy (Part 96): Multi-Scale Feature Extraction (MSFformer)

Efficient extraction and integration of long-term dependencies and short-term features remain an important task in time series analysis. Their proper understanding and integration are necessary to create accurate and reliable predictive models.

Published article Neural Networks Made Easy (Part 95): Reducing Memory Consumption in Transformer Models

Models based on the Transformer architecture are highly effective, but their use is complicated by high resource costs both during training and in operation. In this article, I introduce algorithms that help reduce the memory usage of such models.

Published article Neural Networks Made Easy (Part 94): Optimizing the Input Sequence

When working with time series, we always use the source data in its historical order. But is this the best option? There is an opinion that changing the order of the input data can improve the efficiency of trained models. In this article, I invite you to get acquainted with one of the methods for optimizing the input sequence.

Published article Neural Networks Made Easy (Part 93): Adaptive Forecasting in Frequency and Time Domains (Final Part)

In this article, we continue implementing the approaches of the ATFNet model, which adaptively combines the outputs of two blocks (frequency and time) for time series forecasting.

Published article Neural Networks Made Easy (Part 92): Adaptive Forecasting in Frequency and Time Domains

The authors of the FreDF method experimentally confirmed the advantage of combined forecasting in the frequency and time domains. However, a fixed weight hyperparameter is not optimal for non-stationary time series. In this article, we will get acquainted with a method for adaptively combining forecasts in the frequency and time domains.
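To illustrate what combining the two domains means, here is a hedged NumPy sketch of a loss that mixes a time-domain error with a frequency-domain error; the specific weighting scheme, the use of the real FFT, and the squared-magnitude spectral error are my assumptions, not the exact FreDF formulation.

```python
import numpy as np

def combined_forecast_loss(pred, target, alpha=0.5):
    """Weighted sum of a time-domain and a frequency-domain error term.

    pred, target : arrays of shape (horizon,) or (batch, horizon)
    alpha        : weight of the frequency-domain term; here a fixed
                   hyperparameter, which is exactly what the adaptive
                   method discussed in the article seeks to replace
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    # Ordinary mean squared error in the time domain.
    time_loss = np.mean((pred - target) ** 2)
    # Error between the complex spectra of forecast and target.
    freq_loss = np.mean(np.abs(np.fft.rfft(pred, axis=-1)
                               - np.fft.rfft(target, axis=-1)) ** 2)
    return (1.0 - alpha) * time_loss + alpha * freq_loss
```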

Published article Neural Networks Made Easy (Part 91): Frequency Domain Forecasting (FreDF)

We continue to explore the analysis and forecasting of time series in the frequency domain. In this article, we will get acquainted with a new method to forecast data in the frequency domain, which can be added to many of the algorithms we have studied previously.

Published article Neural Networks Made Easy (Part 90): Frequency Interpolation of Time Series (FITS)

By studying the FEDformer method, we opened the door to the frequency-domain representation of time series. In this new article, we continue that topic and consider a method that allows us not only to analyze a time series, but also to predict its subsequent states in the frequency domain.
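The core trick can be shown with a short NumPy sketch: transform the series to the frequency domain, keep only the low-frequency components, and reconstruct a longer output by interpolating the spectrum. The fixed zero-padding rule below is my simplifying assumption; the method discussed in the article performs the interpolation with a trainable complex-valued layer.

```python
import numpy as np

def frequency_interpolate(series, out_len, keep_bins=None):
    """Resample a 1-D series to out_len points via its spectrum (illustrative sketch)."""
    series = np.asarray(series, dtype=float)
    spec = np.fft.rfft(series)
    if keep_bins is not None:
        spec = spec[:keep_bins]              # low-pass filter: drop high-frequency bins
    out = np.fft.irfft(spec, n=out_len)      # zero-padded bins act as interpolation
    return out * (out_len / len(series))     # rescale to preserve the amplitude level

# Example: represent 128 observed points by 8 low-frequency bins,
# then reconstruct a smoothed version resampled to 192 points.
t = np.arange(128)
x = np.sin(t / 10.0) + 0.1 * np.random.randn(128)
x_interp = frequency_interpolate(x, out_len=192, keep_bins=8)
```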

Published article Neural networks made easy (Part 89): Frequency Enhanced Decomposition Transformer (FEDformer)

All the models we have considered so far analyze the state of the environment as a time sequence. However, a time series can also be represented in the form of frequency features. In this article, I introduce an algorithm that uses the frequency components of a time sequence to predict future states.

Published article Neural Networks Made Easy (Part 88): Time-Series Dense Encoder (TiDE)

In an attempt to obtain the most accurate forecasts, researchers often complicate forecasting models, which in turn leads to higher training and maintenance costs. Is this increase always justified? This article introduces an algorithm that exploits the simplicity and speed of linear models while demonstrating results on par with the best models of more complex architecture.

Published article Neural Networks Made Easy (Part 87): Time Series Patching

Forecasting plays an important role in time series analysis. In this new article, we will talk about the benefits of time series patching.
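To show what patching means in practice, here is a small NumPy sketch that slices a univariate series into overlapping fixed-length patches before they are fed to a model as tokens; the patch length and stride are arbitrary illustrative choices, not values from the article.

```python
import numpy as np

def make_patches(series, patch_len=16, stride=8):
    """Split a 1-D series into a (num_patches, patch_len) array of overlapping windows."""
    series = np.asarray(series, dtype=float)
    starts = range(0, len(series) - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# Example: 240 closing prices become 29 patches of 16 points each,
# so an attention mechanism works over 29 tokens instead of 240 time steps.
closes = np.cumsum(np.random.randn(240))
patches = make_patches(closes, patch_len=16, stride=8)
print(patches.shape)   # (29, 16)
```

Reducing the token count this way is the main reason patching cuts the cost of attention while preserving local structure within each patch.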

Published article Neural Networks Made Easy (Part 86): U-Shaped Transformer

We continue to study time series forecasting algorithms. In this article, we will discuss another method: the U-shaped Transformer.

Published article Neural Networks Made Easy (Part 85): Multivariate Time Series Forecasting

In this article, I would like to introduce you to a new complex time series forecasting method that harmoniously combines the advantages of linear models and transformers.