Dmitriy Gizlyk
4.4 (50)
  • Information
12+ years of experience
0 products
0 demo versions
134 jobs
0 signals
0 subscribers
Professional programming of any complexity for MT4, MT5, C#.
Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Multi-Agent System with Conceptual Reinforcement (Final Part)

We continue to implement the approaches proposed by the authors of the FinCon framework. FinCon is a multi-agent system based on Large Language Models (LLMs). Today, we will implement the necessary modules and conduct comprehensive testing of the model on real historical data.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Multi-Agent System with Conceptual Reinforcement (FinCon)

We invite you to explore the FinCon framework, a Large Language Model (LLM)-based multi-agent system. The framework uses conceptual verbal reinforcement to improve decision-making and risk management, enabling effective performance on a variety of financial tasks.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Multimodal, Tool-Augmented Agent for Financial Markets (Final Part)

We continue to develop the algorithms for FinAgent, a multimodal financial trading agent designed to analyze multimodal market dynamics data and historical trading patterns.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Multimodal, Tool-Augmented Agent for Financial Markets (FinAgent)

We invite you to explore FinAgent, a multimodal financial trading agent framework designed to analyze various types of data reflecting market dynamics and historical trading patterns.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: An Agent with Layered Memory (Final Part)

We continue our work on creating the FinMem framework, which uses layered memory approaches that mimic human cognitive processes. This allows the model not only to effectively process complex financial data but also to adapt to new signals, significantly improving the accuracy and effectiveness of investment decisions in dynamically changing markets.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: An Agent with Layered Memory

Layered memory approaches that mimic human cognitive processes enable the processing of complex financial data and adaptation to new signals, thereby improving the effectiveness of investment decisions in dynamic markets.
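As a rough illustration of the layered-memory idea (in Python for brevity; the article's implementation is in MQL5, and the layer names, decay constants, and scoring rule below are assumptions, not FinMem's actual design): events live in layers with different forgetting rates, and retrieval ranks them by importance weighted by recency.

```python
import math

class LayeredMemory:
    # Larger half-life -> slower forgetting; the deep layer keeps long-term facts.
    # These layer names and rates are illustrative assumptions.
    DECAY = {"shallow": 1.0, "intermediate": 5.0, "deep": 30.0}

    def __init__(self):
        self.events = []  # (layer, timestamp, importance, payload)

    def store(self, layer, importance, payload, t):
        self.events.append((layer, t, importance, payload))

    def recall(self, t_now, top_k=3):
        # Score = importance * exponential recency decay with a per-layer rate.
        scored = [(imp * math.exp(-(t_now - t) / self.DECAY[layer]), payload)
                  for layer, t, imp, payload in self.events]
        scored.sort(key=lambda s: s[0], reverse=True)
        return [payload for _, payload in scored[:top_k]]

mem = LayeredMemory()
mem.store("shallow", 0.9, "intraday spike", t=9.0)
mem.store("deep", 0.7, "long-term uptrend", t=0.0)
# The older but slowly-decaying deep memory outranks the recent noisy one.
print(mem.recall(t_now=10.0))
```

The point of the layered structure is exactly this interaction: a high-importance but fast-decaying signal can be overtaken by a durable long-term observation.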

Dmitriy Gizlyk
Article published
Neural Networks in Trading: Models Using Wavelet Transform and Multi-Task Attention (Final Part)

In the previous article, we explored the theoretical foundations and began implementing the approaches of the Multitask-Stockformer framework, which combines the wavelet transform and the Self-Attention multitask model. We continue to implement the algorithms of this framework and evaluate their effectiveness on real historical data.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: Models Using Wavelet Transform and Multi-Task Attention

We invite you to explore a framework that combines wavelet transforms and a multi-task self-attention model, aimed at improving the responsiveness and accuracy of forecasting in volatile market conditions. The wavelet transform decomposes asset returns into high- and low-frequency components, capturing both long-term market trends and short-term fluctuations.
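The high/low frequency split can be illustrated with a one-level Haar transform (a minimal Python sketch, not the framework's wavelet pipeline or the series' MQL5 code):

```python
import math

def haar_dwt(x):
    """One-level Haar DWT; len(x) must be even."""
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(x[::2], x[1::2])]  # low frequency (trend)
    detail = [(a - b) * s for a, b in zip(x[::2], x[1::2])]  # high frequency (fluctuations)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: perfect reconstruction from the two bands."""
    s = 1 / math.sqrt(2)
    x = []
    for a, d in zip(approx, detail):
        x += [(a + d) * s, (a - d) * s]
    return x

returns = [0.01, 0.012, -0.02, -0.018, 0.005, 0.004]
lo, hi = haar_dwt(returns)
rec = haar_idwt(lo, hi)  # reconstructs `returns` exactly
```

Because the transform is invertible, nothing is lost in the split: the model can attend to trend and noise components separately and still recover the original series.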

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Hybrid Trading Framework with Predictive Coding (Final Part)

We continue our examination of the StockFormer hybrid trading system, which combines predictive coding and reinforcement learning algorithms for financial time series analysis. The system is based on three Transformer branches with a Diversified Multi-Head Attention (DMH-Attn) mechanism that captures complex patterns and interdependencies between assets. Previously, we covered the theoretical aspects of the framework and implemented the DMH-Attn mechanism. Today, we will discuss the model architecture and training.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Hybrid Trading Framework with Predictive Coding (StockFormer)

In this article, we will discuss the hybrid trading system StockFormer, which combines predictive coding and reinforcement learning (RL) algorithms. The framework uses three Transformer branches with an integrated Diversified Multi-Head Attention (DMH-Attn) mechanism that improves on the vanilla attention module with a multi-headed Feed-Forward block, allowing it to capture diverse time series patterns across different subspaces.
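For readers new to the underlying mechanism, here is the head-splitting of plain scaled dot-product attention that DMH-Attn extends (illustrative Python; the per-head Feed-Forward block that distinguishes DMH-Attn is deliberately omitted, and all shapes are toy-sized assumptions):

```python
import math

def softmax(xs):
    m = max(xs)                      # shift for numerical stability
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of feature vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

def split_heads(X, n_heads):
    d = len(X[0]) // n_heads
    return [[row[h * d:(h + 1) * d] for row in X] for h in range(n_heads)]

def multi_head(Q, K, V, n_heads=2):
    heads = [attention(q, k, v) for q, k, v in
             zip(split_heads(Q, n_heads), split_heads(K, n_heads), split_heads(V, n_heads))]
    # Concatenate head outputs back along the feature axis.
    return [sum((h[i] for h in heads), []) for i in range(len(Q))]

X = [[1.0, 0.0, 0.5, 0.5], [0.0, 1.0, 0.5, 0.5]]
Y = multi_head(X, X, X)
```

Each head sees only its own slice of the feature space, which is what lets the mechanism specialize across subspaces; DMH-Attn pushes that idea further by also diversifying the Feed-Forward stage per head.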

Dmitriy Gizlyk
Article published
Neural Networks in Trading: An Ensemble of Agents with Attention Mechanisms (Final Part)

In the previous article, we introduced the multi-agent adaptive framework MASAAT, which uses an ensemble of agents to perform cross-analysis of multimodal time series at different data scales. Today we will continue implementing the approaches of this framework in MQL5 and bring this work to a logical conclusion.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: An Ensemble of Agents with Attention Mechanisms (MASAAT)

We introduce the Multi-Agent Self-Adaptive Portfolio Optimization Framework (MASAAT), which combines attention mechanisms and time series analysis. MASAAT generates a set of agents that analyze price series and directional changes, enabling the identification of significant fluctuations in asset prices at different levels of detail.
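The directional changes the agents analyze can be detected with the classic threshold algorithm, sketched here in Python for illustration (the threshold value and event encoding are assumptions, not MASAAT's settings): an event fires when price reverses by at least `threshold` (relative) from the last local extreme.

```python
def directional_changes(prices, threshold=0.02):
    """Return (direction, index) events where a reversal exceeds the threshold."""
    events = []
    up = True            # assume an initial up-run
    ext = prices[0]      # last local extreme
    for i, p in enumerate(prices[1:], start=1):
        if up:
            if p >= ext:
                ext = p
            elif (ext - p) / ext >= threshold:   # downturn confirmed
                events.append(("down", i))
                up, ext = False, p
        else:
            if p <= ext:
                ext = p
            elif (p - ext) / ext >= threshold:   # upturn confirmed
                events.append(("up", i))
                up, ext = True, p
    return events

prices = [100, 101, 103, 100, 99, 100, 102, 103]
print(directional_changes(prices, threshold=0.02))
```

Running the same detector with several thresholds gives the multiple levels of detail mentioned above: small thresholds surface short-term fluctuations, large ones only major swings.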

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Multi-Agent Self-Adaptive Model (Final Part)

In the previous article, we introduced the multi-agent self-adaptive framework MASA, which combines reinforcement learning approaches and self-adaptive strategies, providing a harmonious balance between profitability and risk in turbulent market conditions. We have built the functionality of individual agents within this framework. In this article, we will continue the work we started, bringing it to its logical conclusion.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Multi-Agent Self-Adaptive Model (MASA)

We invite you to explore the Multi-Agent Self-Adaptive (MASA) framework, which combines reinforcement learning and adaptive strategies to strike a harmonious balance between profitability and risk management in turbulent market conditions.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: Parameter-Efficient Transformer with Segmented Attention (Final Part)

In the previous article, we discussed the theoretical aspects of the PSformer framework, which introduces two major innovations into the classical Transformer architecture: the Parameter Sharing (PS) mechanism and attention over spatio-temporal segments (SegAtt). In this article, we continue implementing the proposed approaches in MQL5.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: A Parameter-Efficient Transformer with Segmented Attention (PSformer)

This article introduces the new PSformer framework, which adapts the vanilla Transformer architecture to multivariate time series forecasting. The framework is based on two key innovations: the Parameter Sharing (PS) mechanism and Segment Attention (SegAtt).
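The segmentation half of the idea can be sketched in a few lines (illustrative Python, not PSformer's MQL5 code; the segment size and flattening order are assumptions): a multivariate series is cut into spatio-temporal segments that then serve as attention tokens.

```python
def segment(series, seg_len):
    """series: list of per-step feature vectors -> list of flattened segment tokens."""
    assert len(series) % seg_len == 0, "series length must divide evenly into segments"
    segments = []
    for start in range(0, len(series), seg_len):
        # Flatten seg_len consecutive steps (all variables) into one token.
        flat = [v for step in series[start:start + seg_len] for v in step]
        segments.append(flat)
    return segments

# 6 time steps, 2 variables, segment length 3 -> 2 tokens of dimension 6.
series = [[1, 10], [2, 20], [3, 30], [4, 40], [5, 50], [6, 60]]
tokens = segment(series, seg_len=3)
```

Parameter Sharing is the complementary half (one weight set reused across the encoder's attention stages) and is not shown here; only the token construction that SegAtt attends over is illustrated.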

youwei_qing 2025.04.21

I noticed that the feedForward method with the second input parameter isn't working at all. Could this be an issue?

virtual bool feedForward(CNeuronBaseOCL *NeuronOCL);
virtual bool feedForward(CNeuronBaseOCL *NeuronOCL, CBufferFloat *SecondInput) { return feedForward(NeuronOCL); }
Dmitriy Gizlyk
Article published
Neural Networks in Trading: Enhancing Transformer Efficiency by Reducing Sharpness (Final Part)

SAMformer offers a solution to the key drawbacks of Transformer models in long-term time series forecasting, such as training complexity and poor generalization on small datasets. Its shallow architecture and sharpness-aware optimization help avoid suboptimal local minima. In this article, we will continue to implement approaches using MQL5 and evaluate their practical value.

Dmitriy Gizlyk
Article published
Neural Networks in Trading: Enhancing Transformer Efficiency by Reducing Sharpness (SAMformer)

Training Transformer models requires large amounts of data and is often difficult since the models are not good at generalizing to small datasets. The SAMformer framework helps solve this problem by avoiding poor local minima. This improves the efficiency of models even on limited training datasets.
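The sharpness-aware optimization SAMformer relies on can be illustrated on a toy one-dimensional loss (a Python sketch under assumptions; the framework itself trains Transformer weights in MQL5). Each step makes two gradient evaluations: one to find the locally worst-case weight perturbation, and one to update from that perturbed point, which steers training toward flat minima.

```python
def loss(w):
    """Toy quadratic loss with minimum at w = 2."""
    return (w - 2.0) ** 2

def grad(w):
    return 2.0 * (w - 2.0)

def sam_step(w, lr=0.1, rho=0.05):
    g = grad(w)
    # Perturb weights in the ascent direction (normalized), radius rho.
    eps = rho * g / (abs(g) + 1e-12)
    # Gradient at the perturbed "worst-case" point drives the update.
    g_adv = grad(w + eps)
    return w - lr * g_adv

w = 0.0
for _ in range(100):
    w = sam_step(w)
# w is now close to the minimizer 2.0, so loss(w) is near zero.
```

On this convex toy the result matches plain gradient descent; the behavior differs on sharp, non-convex losses, where the adversarial perturbation biases the trajectory away from narrow minima.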

Dmitriy Gizlyk
Article published
Neural Networks in Trading: Optimizing the Transformer for Time Series Forecasting (LSEAttention)

The LSEAttention framework offers improvements to the Transformer architecture, designed specifically for long-term multivariate time series forecasting. The approaches proposed by its authors address entropy collapse and training instability, problems often encountered with the vanilla Transformer.
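The numerical idea at the heart of this kind of stabilization can be shown in isolation (a hedged Python sketch, not the framework's actual attention code): computing softmax through the log-sum-exp trick, so large attention scores do not overflow and the weights stay a well-formed distribution.

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x))) via the max-shift trick."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def stable_softmax(scores):
    lse = logsumexp(scores)
    return [math.exp(s - lse) for s in scores]

# A naive exp(1000.0) overflows a float; the shifted version is exact.
w = stable_softmax([1000.0, 999.0, 998.0])
```

Subtracting the log-sum-exp is mathematically identical to ordinary softmax, but every exponent is at most zero, so the computation stays in range regardless of the score magnitudes.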

Dmitriy Gizlyk
Article published
Neural Networks in Trading: Hyperbolic Latent Diffusion Model (Final Part)

The use of anisotropic diffusion processes for encoding the initial data in a hyperbolic latent space, as proposed in the HypDiff framework, helps preserve the topological features of the current market situation and improves the quality of its analysis. In the previous article, we began implementing the proposed approaches in MQL5. Today, we will continue that work and bring it to its logical conclusion.
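As background for the hyperbolic latent space, the standard exponential map at the origin of the Poincaré ball shows how a Euclidean tangent vector is carried into hyperbolic space. This is textbook hyperbolic geometry (curvature 1 assumed) sketched in Python, not the article's MQL5 code.

```python
import math

def exp_map_zero(v):
    """Exponential map at the origin of the unit Poincare ball:
    exp_0(v) = tanh(||v||) * v / ||v||, so the image is strictly inside the ball."""
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0.0:
        return list(v)
    scale = math.tanh(norm) / norm
    return [scale * x for x in v]

# A tangent vector of norm 5 lands inside the unit ball, direction preserved.
z = exp_map_zero([3.0, 4.0])
```

Because tanh is bounded by 1, arbitrarily long tangent vectors map to points ever closer to (but never on) the boundary, which is what gives hyperbolic latents their capacity for hierarchical, tree-like structure.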
