Other classes in DoEasy library (Part 66): MQL5.com Signals collection class
In this article, I will create the signal collection class of the MQL5.com Signals service with functions for managing signals. I will also improve the Depth of Market snapshot object class so that it displays the total DOM buy and sell volumes.
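As a rough illustration of the second point, the sketch below sums the total buy and sell volumes of the current Depth of Market using the built-in MarketBookGet() function. It is a minimal example rather than the library's DOM snapshot class, and it assumes the symbol has already been subscribed to DOM updates via MarketBookAdd().

    //--- Sum the total buy and sell volumes of the current DOM (minimal sketch)
    void PrintDOMVolumes(const string symbol)
      {
       MqlBookInfo book[];
       if(!MarketBookGet(symbol, book))               // request the current DOM snapshot
         {
          Print("MarketBookGet() failed: ", GetLastError());
          return;
         }
       long buy_volume=0, sell_volume=0;
       for(int i=0; i<ArraySize(book); i++)
         {
          if(book[i].type==BOOK_TYPE_BUY  || book[i].type==BOOK_TYPE_BUY_MARKET)
             buy_volume+=book[i].volume;
          else
          if(book[i].type==BOOK_TYPE_SELL || book[i].type==BOOK_TYPE_SELL_MARKET)
             sell_volume+=book[i].volume;
         }
       PrintFormat("%s DOM: total buy volume=%I64d, total sell volume=%I64d",
                   symbol, buy_volume, sell_volume);
      }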
Neural networks made easy (Part 12): Dropout
As the next step in studying neural networks, I suggest considering methods of improving convergence during neural network training. There are several such methods; in this article, we will consider one of them, called Dropout.
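For orientation, here is a minimal sketch of the inverted-dropout idea applied to a plain array of neuron outputs. It illustrates the general technique only and is not the article's CNet implementation; the drop_prob parameter and the use of MathRand() are assumptions made for the example.

    //--- Zero out each neuron with probability drop_prob and rescale the rest (training only)
    void ApplyDropout(double &neurons[], const double drop_prob, const bool training)
      {
       if(!training || drop_prob<=0.0)
          return;                                     // dropout is switched off at inference time
       const double scale=1.0/(1.0-drop_prob);        // compensate for the dropped activations
       for(int i=0; i<ArraySize(neurons); i++)
         {
          double u=(double)MathRand()/32768.0;        // uniform random number in [0;1)
          neurons[i]=(u<drop_prob ? 0.0 : neurons[i]*scale);
         }
      }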
Prices and Signals in DoEasy library (Part 65): Depth of Market collection and the class for working with MQL5.com Signals
In this article, I will create a collection class for the Depths of Market of all symbols and start developing the functionality for working with the MQL5.com Signals service by creating the signal object class.
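The signal object class itself is built in the article; the sketch below only shows the built-in terminal functions it relies on for reading the MQL5.com Signals database. The set of printed properties is an arbitrary illustration.

    //--- List the signals available in the terminal's Signals database (minimal sketch)
    void ListSignals(void)
      {
       int total=SignalBaseTotal();                   // number of signals in the database
       for(int i=0; i<total; i++)
         {
          if(!SignalBaseSelect(i))                    // select a signal for further requests
             continue;
          long   id  =SignalBaseGetInteger(SIGNAL_BASE_ID);
          string name=SignalBaseGetString(SIGNAL_BASE_NAME);
          double gain=SignalBaseGetDouble(SIGNAL_BASE_GAIN);
          double dd  =SignalBaseGetDouble(SIGNAL_BASE_MAX_DRAWDOWN);
          PrintFormat("Signal %I64d \"%s\": growth %.2f%%, max drawdown %.2f%%", id, name, gain, dd);
         }
      }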
Prices in DoEasy library (Part 64): Depth of Market, classes of DOM snapshot and snapshot series objects
In this article, I will create two classes (the class of DOM snapshot object and the class of DOM snapshot series object) and test creation of the DOM data series.
Machine learning in Grid and Martingale trading systems. Would you bet on it?
This article describes a machine learning technique applied to grid and martingale trading. Surprisingly, this approach has received little to no coverage on the Internet. After reading the article, you will be able to create your own trading bots.
Self-adapting algorithm (Part IV): Additional functionality and tests
I continue filling the algorithm with the minimum necessary functionality and testing the results. The profitability is quite low, but the articles demonstrate a model of fully automated, profitable trading on completely different instruments traded on fundamentally different markets.
Prices in DoEasy library (part 63): Depth of Market and its abstract request class
In the article, I will start developing the functionality for working with the Depth of Market. I will also create the class of the Depth of Market abstract order object and its descendants.
Neural networks made easy (Part 11): A take on GPT
Perhaps one of the most advanced models among currently existing language neural networks is GPT-3, the largest variant of which contains 175 billion parameters. Of course, we are not going to create such a monster on our home PCs. However, we can look at which of its architectural solutions can be used in our work and how we can benefit from them.
Prices in DoEasy library (part 62): Updating tick series in real time, preparation for working with Depth of Market
In this article, I will implement updating tick data in real time and prepare the symbol object class for working with Depth of Market (DOM itself is to be implemented in the next article).
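For context, the real-time update mechanics can be sketched with the built-in CopyTicksRange() function: request everything newer than the last processed tick and handle the new ticks. In the library the ticks go into tick objects; here they are simply printed, and from_msc is assumed to hold the millisecond time of the last processed tick.

    //--- Fetch all ticks received since from_msc and process them (minimal sketch)
    void UpdateTicks(const string symbol, const ulong from_msc)
      {
       MqlTick ticks[];
       int copied=CopyTicksRange(symbol, ticks, COPY_TICKS_ALL, from_msc);
       if(copied<0)
         {
          Print("CopyTicksRange() failed: ", GetLastError());
          return;
         }
       for(int i=0; i<copied; i++)
          PrintFormat("%s: bid=%.5f ask=%.5f time_msc=%I64d", symbol, ticks[i].bid, ticks[i].ask, ticks[i].time_msc);
      }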
Prices in DoEasy library (part 61): Collection of symbol tick series
Since a program may use different symbols in its work, a separate list should be created for each of them. In this article, I will combine such lists into a tick data collection. In fact, this will be a regular list based on the Standard Library class of a dynamic array of pointers to instances of the CObject class and its descendants.
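As a rough illustration of that underlying mechanism, here is a minimal sketch of a per-symbol tick list built on the Standard Library's CArrayObj (a dynamic array of CObject pointers). The CTickObj class is a hypothetical simplification of the library's tick object.

    #include <Arrays\ArrayObj.mqh>
    //--- Simplified tick object stored in the list
    class CTickObj : public CObject
      {
    public:
       MqlTick           tick;                        // stored tick data
                         CTickObj(const MqlTick &t) { tick=t; }
      };
    //--- Fill the list with the last 100 ticks of a symbol (minimal sketch)
    void FillTickList(const string symbol, CArrayObj &list)
      {
       MqlTick ticks[];
       int copied=CopyTicks(symbol, ticks, COPY_TICKS_ALL, 0, 100);
       for(int i=0; i<copied; i++)
          list.Add(new CTickObj(ticks[i]));           // the list owns the pointers and frees them
      }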
Self-adapting algorithm (Part III): Abandoning optimization
It is impossible to get a truly stable algorithm if we use optimization based on historical data to select parameters. A stable algorithm should be aware of what parameters are needed when working on any trading instrument at any time. It should not forecast or guess; it should know for sure.
Practical application of neural networks in trading (Part 2). Computer vision
The use of computer vision allows training neural networks on the visual representation of the price chart and indicators. This method enables broader use of the whole set of technical indicators, since they do not need to be fed into the neural network numerically.
Neural networks made easy (Part 10): Multi-Head Attention
We have previously considered the mechanism of self-attention in neural networks. In practice, modern neural network architectures use several parallel self-attention heads to find various dependencies between the elements of a sequence. Let us consider the implementation of such an approach and evaluate its impact on the overall network performance.
Developing a self-adapting algorithm (Part II): Improving efficiency
In this article, I will continue developing the topic by improving the flexibility of the previously created algorithm. The algorithm became more stable as the number of candles in the analysis window grew or as the threshold percentage of the prevalence of falling or growing candles was raised. I had to make a compromise and set either a larger sample size for analysis or a larger threshold percentage for the prevailing candle type.
Finding seasonal patterns in the forex market using the CatBoost algorithm
The article considers the creation of machine learning models with time filters and discusses the effectiveness of this approach. The human factor can now be eliminated by simply instructing the model to trade at a certain hour on a certain day of the week. Pattern search can be handled by a separate algorithm.
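As a side note, the time filter itself reduces to a trivial check in MQL5; the sketch below is a minimal illustration, and allowed_day / allowed_hour are assumed parameter names.

    //--- Allow trading only at a given hour on a given day of the week (minimal sketch)
    bool IsTradingTime(const int allowed_day, const int allowed_hour)
      {
       MqlDateTime now;
       TimeToStruct(TimeCurrent(), now);              // decompose the current server time
       return(now.day_of_week==allowed_day && now.hour==allowed_hour);
      }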
The market and the physics of its global patterns
In this article, I will try to test the assumption that even a modest understanding of the market allows a system to operate on a global scale. I will not invent any theories or patterns, but will only use known facts, gradually translating these facts into the language of mathematical analysis.
Developing a self-adapting algorithm (Part I): Finding a basic pattern
In the upcoming series of articles, I will demonstrate the development of self-adapting algorithms that take most market factors into account, show how to systematize these situations, describe them in logic and account for them in your trading activity. I will start with a very simple algorithm that will gradually acquire theory and evolve into a very complex project.
Prices in DoEasy library (part 59): Object to store data of one tick
Starting with this article, I will create library functionality for working with price data. Today, I will create an object class that stores all the price data arriving with each new tick.
Manual charting and trading toolkit (Part II). Chart graphics drawing tools
This is the next article within the series in which I show how I created a convenient library for manually applying chart graphics using keyboard shortcuts. The tools used include straight lines and their combinations. In this part, we will see how the drawing tools are applied using the functions described in the first part. The library can be connected to any Expert Advisor or indicator, which greatly simplifies charting tasks. This solution DOES NOT use external DLLs; all the commands are implemented using built-in MQL tools.
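To illustrate the general approach (not the toolkit's actual key bindings), the minimal sketch below catches a keystroke in OnChartEvent() and drops a horizontal line at the current Bid; the 'H' shortcut is an assumed example.

    //--- Draw a horizontal line at the Bid price when the 'H' key is pressed (minimal sketch)
    void OnChartEvent(const int id, const long &lparam, const double &dparam, const string &sparam)
      {
       if(id==CHARTEVENT_KEYDOWN && lparam=='H')
         {
          string name="HLine_"+TimeToString(TimeCurrent(), TIME_SECONDS);
          double price=SymbolInfoDouble(_Symbol, SYMBOL_BID);
          if(ObjectCreate(0, name, OBJ_HLINE, 0, 0, price))
             ObjectSetInteger(0, name, OBJPROP_COLOR, clrDodgerBlue);
          ChartRedraw();
         }
      }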
Neural networks made easy (Part 7): Adaptive optimization methods
In previous articles, we used stochastic gradient descent to train a neural network with the same learning rate for all neurons within the network. In this article, I propose looking at adaptive learning methods which allow the learning rate to change for each neuron. We will also consider the pros and cons of this approach.
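As a reference point, here is a minimal sketch of an Adam-style update for a single weight, showing how the effective step size becomes individual for each parameter. It illustrates the general technique rather than the article's CNet code; the names and default values follow the standard Adam formulation.

    //--- Adam-style update of one weight; m and v are the running first and second moments
    void AdamUpdate(double &weight, double &m, double &v, const double grad, const int t,
                    const double lr=0.001, const double b1=0.9, const double b2=0.999, const double eps=1e-8)
      {
       m=b1*m+(1.0-b1)*grad;                          // exponential average of gradients
       v=b2*v+(1.0-b2)*grad*grad;                     // exponential average of squared gradients
       double m_hat=m/(1.0-MathPow(b1, t));           // bias correction, t is the update step (from 1)
       double v_hat=v/(1.0-MathPow(b2, t));
       weight-=lr*m_hat/(MathSqrt(v_hat)+eps);        // per-weight effective learning rate
      }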
Analyzing charts using DeMark Sequential and Murray-Gann levels
Thomas DeMark's Sequential is good at showing balance changes in the price movement. This is especially evident if we combine its signals with a level indicator, for example, Murray levels. The article is intended mostly for beginners and those who still cannot find their "Grail". I will also show some features of building levels that I have not seen on other forums, so the article will probably be useful for advanced traders as well. Suggestions and reasonable criticism are welcome.
Gradient boosting in transductive and active machine learning
In this article, we will consider active machine learning methods utilizing real data, as well as discuss their pros and cons. Perhaps you will find these methods useful and will include them in your arsenal of machine learning models. Transduction was introduced by Vladimir Vapnik, co-inventor of the support-vector machine (SVM).
Optimal approach to the development and analysis of trading systems
In this article, I will show the criteria to be used when selecting a system or a signal for investing your funds, as well as describe the optimal approach to the development of trading systems and highlight the importance of this matter in Forex trading.
Timeseries in DoEasy library (part 56): Custom indicator object, get data from indicator objects in the collection
The article considers the creation of a custom indicator object for use in EAs. We will slightly improve the library classes and add methods for getting data from indicator objects in EAs.
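For context, the underlying terminal mechanics look like the minimal sketch below: an EA creates a custom indicator handle and reads its buffer. "MyIndicator" and its single input parameter are placeholders, not part of the library.

    //--- Create a custom indicator handle in OnInit() and read its buffer on every tick (minimal sketch)
    int handle=INVALID_HANDLE;

    int OnInit(void)
      {
       handle=iCustom(_Symbol, PERIOD_H1, "MyIndicator", 14);   // hypothetical indicator and input
       return(handle!=INVALID_HANDLE ? INIT_SUCCEEDED : INIT_FAILED);
      }

    void OnTick(void)
      {
       double buffer[];
       if(CopyBuffer(handle, 0, 0, 3, buffer)==3)     // last three values of buffer 0
          Print("Indicator values: ", buffer[0], ", ", buffer[1], ", ", buffer[2]);
      }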
Practical application of neural networks in trading. Python (Part I)
In this article, we will analyze the step-by-step implementation of a trading system based on the programming of deep neural networks in Python. This will be performed using the TensorFlow machine learning library developed by Google. We will also use the Keras library for describing neural networks.
Neural networks made easy (Part 5): Multithreaded calculations in OpenCL
We have earlier discussed some types of neural network implementations. In the considered networks, the same operations are repeated for each neuron. A logical further step is to utilize multithreaded computing capabilities provided by modern technology in an effort to speed up the neural network learning process. One of the possible implementations is described in this article.
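To give a flavour of the approach, here is a minimal, self-contained script sketch in which each OpenCL work item computes the weighted sum of one output neuron. The article's actual kernels, including backpropagation, are considerably more involved; the layer sizes and weights here are arbitrary example values.

    //--- Script sketch: a tiny feed-forward layer computed in parallel on OpenCL
    void OnStart(void)
      {
       //--- kernel source: each work item computes one output neuron's weighted sum
       string cl_src;
       cl_src ="__kernel void FeedForward(__global const float *in, __global const float *w,";
       cl_src+="                          __global float *out, const int n_in)";
       cl_src+="{ int i=get_global_id(0); float sum=0.0f;";
       cl_src+="  for(int j=0; j<n_in; j++) sum+=in[j]*w[i*n_in+j];";
       cl_src+="  out[i]=sum; }";

       int ctx=CLContextCreate(CL_USE_ANY);
       if(ctx==INVALID_HANDLE) { Print("OpenCL is not available"); return; }
       int prg=CLProgramCreate(ctx, cl_src);
       int krn=CLKernelCreate(prg, "FeedForward");

       float in[4]={1.0f, 2.0f, 3.0f, 4.0f};                     // layer input
       float w[8] ={0.1f,0.2f,0.3f,0.4f, 0.5f,0.6f,0.7f,0.8f};   // 2 neurons x 4 weights
       float out[2];
       int   n_in=4;

       int buf_in =CLBufferCreate(ctx, sizeof(float)*4, CL_MEM_READ_ONLY);
       int buf_w  =CLBufferCreate(ctx, sizeof(float)*8, CL_MEM_READ_ONLY);
       int buf_out=CLBufferCreate(ctx, sizeof(float)*2, CL_MEM_WRITE_ONLY);
       CLBufferWrite(buf_in, in);
       CLBufferWrite(buf_w, w);
       CLSetKernelArgMem(krn, 0, buf_in);
       CLSetKernelArgMem(krn, 1, buf_w);
       CLSetKernelArgMem(krn, 2, buf_out);
       CLSetKernelArg(krn, 3, n_in);

       uint offset[1]={0};
       uint work[1]  ={2};                                       // two output neurons in parallel
       CLExecute(krn, 1, offset, work);
       CLBufferRead(buf_out, out);
       Print("Neuron outputs: ", out[0], ", ", out[1]);

       CLBufferFree(buf_in); CLBufferFree(buf_w); CLBufferFree(buf_out);
       CLKernelFree(krn); CLProgramFree(prg); CLContextFree(ctx);
      }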
Timeseries in DoEasy library (part 55): Indicator collection class
The article continues developing indicator object classes and their collections. For each indicator object, we will create a description and correct the collection class for error-free storage and retrieval of indicator objects from the collection list.
Neural networks made easy (Part 4): Recurrent networks
We continue studying the world of neural networks. In this article, we will consider another type of neural network: recurrent networks. This type is proposed for use with time series, which are represented in the MetaTrader 5 trading platform by price charts.
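The defining feature of such networks is that the new hidden state depends on both the current input and the previous state, which gives the network a memory of the series. A minimal single-neuron sketch of that idea, with hypothetical weight names, is shown below.

    //--- One step of a simple recurrent neuron: new state = f(current input, previous state)
    double RecurrentStep(const double x, const double h_prev,
                         const double w_x, const double w_h, const double bias)
      {
       return(MathTanh(w_x*x + w_h*h_prev + bias));   // tanh activation keeps the state in [-1;1]
      }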
Timeseries in DoEasy library (part 54): Descendant classes of abstract base indicator
The article considers the creation of classes of descendant objects of the base abstract indicator. Such objects provide access to features for creating indicator-based EAs and for collecting and obtaining value statistics of various indicators and prices. We will also create an indicator object collection from which it will be possible to access the properties and data of each indicator created in the program.
Grid and martingale: what are they and how to use them?
In this article, I will try to explain in detail what grid and martingale are, as well as what they have in common. I will also try to analyze how viable these strategies really are. The article features mathematical and practical sections.
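To anchor the terminology, here is a minimal sketch of the classic martingale position-sizing rule: the lot grows by a multiplier after each consecutive loss and is reset to the base lot after a win. The function and parameter names are illustrative only.

    //--- Martingale lot size after a given number of consecutive losses (minimal sketch)
    double MartingaleLot(const double base_lot, const double multiplier, const int losses_in_a_row)
      {
       double lot=base_lot*MathPow(multiplier, losses_in_a_row);
       //--- normalize to the symbol's volume step and cap by the maximum allowed volume
       double step   =SymbolInfoDouble(_Symbol, SYMBOL_VOLUME_STEP);
       double max_lot=SymbolInfoDouble(_Symbol, SYMBOL_VOLUME_MAX);
       lot=MathFloor(lot/step)*step;
       return(MathMin(lot, max_lot));
      }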
Brute force approach to pattern search
In this article, we will search for market patterns, create Expert Advisors based on the identified patterns, and check how long these patterns remain valid, if they retain their validity at all.
Timeseries in DoEasy library (part 53): Abstract base indicator class
The article considers the creation of an abstract indicator which will subsequently be used as the base class for creating objects of the library's standard and custom indicators.
Timeseries in DoEasy library (part 52): Cross-platform nature of multi-period multi-symbol single-buffer standard indicators
In this article, we will consider the creation of the multi-symbol, multi-period Accumulation/Distribution standard indicator. We will also slightly improve the library's indicator-related classes so that programs developed for the outdated MetaTrader 4 platform on the basis of this library work normally when switched over to MetaTrader 5.
Neural networks made easy (Part 3): Convolutional networks
As a continuation of the neural network topic, I propose considering convolutional neural networks. This type of neural network is usually applied to analyzing visual imagery. In this article, we will consider the application of these networks in the financial markets.
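The core operation of such a network, applied to a price series rather than an image, can be sketched as a one-dimensional "valid" convolution with stride 1; activation and pooling layers are omitted from this minimal example.

    //--- Slide a filter over the input series and store the dot products (minimal sketch)
    void Convolve1D(const double &input[], const double &filter[], double &output[])
      {
       int n=ArraySize(input), k=ArraySize(filter);
       int out_size=n-k+1;                            // "valid" convolution, stride 1
       ArrayResize(output, out_size);
       for(int i=0; i<out_size; i++)
         {
          double sum=0.0;
          for(int j=0; j<k; j++)
             sum+=input[i+j]*filter[j];               // dot product of the window and the filter
          output[i]=sum;
         }
      }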
Basic math behind Forex trading
The article aims to describe the main features of Forex trading as simply and quickly as possible, as well as share some basic ideas with beginners. It also attempts to answer the most tantalizing questions in the trading community along with showcasing the development of a simple indicator.
Advanced resampling and selection of CatBoost models by brute-force method
This article describes one of the possible approaches to data transformation aimed at improving the generalizability of the model, and also discusses sampling and selection of CatBoost models.
Timeseries in DoEasy library (part 51): Composite multi-period multi-symbol standard indicators
In this article, we will complete the development of multi-period, multi-symbol standard indicator objects. Using the Ichimoku Kinko Hyo standard indicator as an example, we will analyze the creation of compound custom indicators that have auxiliary drawn buffers for displaying data on the chart.
A scientific approach to the development of trading algorithms
The article considers a methodology for developing trading algorithms in which a consistent scientific approach is used to analyze possible price patterns and to build trading algorithms based on these patterns. Development ideas are demonstrated using examples.
CatBoost machine learning algorithm from Yandex with no Python or R knowledge required
The article provides the code and the description of the main stages of the machine learning process using a specific example. To obtain the model, you do not need Python or R knowledge; basic MQL5 knowledge is enough, and that is exactly my level. Therefore, I hope that the article will serve as a good tutorial for a broad audience, assisting those interested in evaluating machine learning capabilities and implementing them in their programs.
Neural networks made easy (Part 2): Network training and testing
In this second article, we will continue to study neural networks and will consider an example of using our created CNet class in Expert Advisors. We will work with two neural network models, which show similar results both in terms of training time and prediction accuracy.