All (not yet) about Strategy Tester, Optimization and Cloud - page 12

 

Population optimization algorithms: Stochastic Diffusion Search (SDS)

The Stochastic Diffusion Search (SDS) algorithm was proposed in 1989 by J. Bishop and was actively developed by Bishop and S. Nasuto. A distinctive feature of this algorithm is its rigorous mathematical grounding compared with other population algorithms. SDS was originally developed for discrete optimization. In 2011, its modification for global continuous optimization was proposed.
The article discusses Stochastic Diffusion Search (SDS), which is a very powerful and efficient optimization algorithm based on the principles of random walk. The algorithm allows finding optimal solutions in complex multidimensional spaces, while featuring a high speed of convergence and the ability to avoid local extrema.
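The test-and-diffusion cycle at the heart of SDS can be sketched in a few lines. The following is a minimal Python illustration, not the article's MQL5 implementation: the test phase marks an agent active when its hypothesis beats a randomly chosen peer's, and the diffusion phase lets inactive agents copy active hypotheses with a small perturbation. The sphere objective and all parameter values are arbitrary choices for demonstration.

```python
import random

def sds_minimize(f, bounds, dim=2, n_agents=50, iters=200):
    """Continuous SDS sketch: each agent holds a hypothesis (a point in the
    search space). Test phase: an agent is active if its hypothesis beats a
    randomly chosen peer's. Diffusion phase: inactive agents copy an active
    peer's hypothesis with a small perturbation, or restart at random."""
    lo, hi = bounds
    hyp = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_agents)]
    active = [False] * n_agents
    best = list(min(hyp, key=f))
    for _ in range(iters):
        # Test phase (a cheap partial test in real SDS; full evaluation here).
        for i in range(n_agents):
            j = random.choice([k for k in range(n_agents) if k != i])
            active[i] = f(hyp[i]) < f(hyp[j])
        # Diffusion phase: information about good hypotheses spreads.
        for i in range(n_agents):
            if not active[i]:
                j = random.randrange(n_agents)
                if active[j]:
                    hyp[i] = [x + random.gauss(0.0, 0.1) for x in hyp[j]]
                else:
                    hyp[i] = [random.uniform(lo, hi) for _ in range(dim)]
        cand = min(hyp, key=f)
        if f(cand) < f(best):
            best = list(cand)
    return best

best = sds_minimize(lambda p: sum(x * x for x in p), (-5.0, 5.0))
```

The perturbation width (0.1 here) is what turns the original discrete SDS into a continuous search; agents cluster around promising regions instead of sharing exact hypotheses.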
 

Integrating ML models with the Strategy Tester (Conclusion): Implementing a regression model for price prediction

In the previous article, we completed the implementation of a CSV file management class for storing and retrieving data related to financial markets. Having created the infrastructure, we are now ready to use this data to build and train a machine learning model.

Our task in this article is to implement a regression model that can predict the closing price of a financial asset within a week. This forecast will allow us to analyze market behavior and make informed decisions when trading financial assets.

This article describes the implementation of a regression model based on a decision tree. The model should predict prices of financial assets. We have already prepared the data, trained and evaluated the model, as well as adjusted and optimized it. However, it is important to note that this model is intended for study purposes only and should not be used in real trading.
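The core operation of such a regression tree, choosing the split threshold that minimizes squared error, can be shown in isolation. The sketch below is a hypothetical, stripped-down illustration of a single split search (the article's actual model and data pipeline are built differently); the toy feature/target arrays are invented for demonstration.

```python
def best_split(xs, ys):
    """Search every threshold on a single feature and return the one that
    minimizes the summed squared error of predicting each side by its mean.
    A regression tree repeats exactly this step recursively at every node."""
    def sse(vals):
        m = sum(vals) / len(vals)
        return sum((y - m) ** 2 for y in vals)
    pairs = sorted(zip(xs, ys))
    best_thr, best_err = None, float("inf")
    for k in range(1, len(pairs)):
        err = sse([y for _, y in pairs[:k]]) + sse([y for _, y in pairs[k:]])
        if err < best_err:
            best_thr = (pairs[k - 1][0] + pairs[k][0]) / 2
            best_err = err
    return best_thr, best_err

# Invented toy data: feature = this week's close, target = next week's close.
xs = [1.0, 1.1, 1.2, 2.0, 2.1, 2.2]
ys = [1.1, 1.2, 1.1, 2.2, 2.1, 2.3]
thr, err = best_split(xs, ys)
```

On this toy data the split lands between the two price clusters (threshold 1.6), which is exactly how a tree partitions feature space into regions predicted by their mean.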
 

Population optimization algorithms: Charged System Search (CSS) algorithm

Charged System Search (CSS) was first proposed by A. Kaveh and S. Talatahari in 2010.

Optimization is an important and integral part of solving problems of mathematical modeling and machine learning. Metaheuristic algorithms are an effective and popular class of optimization methods. A metaheuristic can be understood as an algorithm that stochastically searches for near-optimal solutions to a problem until a stopping condition is met or a given number of iterations is reached.

In the scientific literature, metaheuristics are considered to combine basic heuristic methods into higher-level algorithmic schemes that allow more efficient exploration of search spaces and decision making. This usually requires less work than developing new specialized heuristics. The challenge is to adapt general metaheuristic schemes to solve difficult optimization problems. In addition, an effective implementation of metaheuristics can ensure that a solution close to the optimal one is found in an acceptable time. Various approaches to understanding metaheuristics make it possible to formulate some fundamental properties that characterize them. In recent years, the use of metaheuristic methods has increased, and efforts have been made to increase the power of algorithms and reduce optimization time.

In this article, we will consider another optimization algorithm inspired by inanimate nature - Charged System Search (CSS) algorithm. The purpose of this article is to present a new optimization algorithm based on the principles of physics and mechanics.
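The physics behind CSS can be conveyed with a toy sketch. The Python code below is a simplified 1-D illustration, not Kaveh and Talatahari's full formulation (it omits the charged memory and uses a bounded, hand-picked force law and coefficients): charge grows with fitness, and each particle moves under the summed attraction of the others.

```python
import random

def css_minimize(f, bounds, n=20, iters=150):
    """1-D Charged System Search sketch: each candidate is a charged particle;
    charge grows with fitness, particles attract one another with a bounded
    Coulomb-style force, and positions follow simple velocity updates."""
    lo, hi = bounds
    pos = [random.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    best = min(pos, key=f)
    for _ in range(iters):
        fit = [f(x) for x in pos]
        worst_f, best_f = max(fit), min(fit)
        span = (worst_f - best_f) or 1e-12
        q = [(worst_f - fi) / span for fi in fit]  # better fitness -> bigger charge
        for i in range(n):
            # Attraction toward every other particle, weighted by its charge;
            # magnitude grows ~linearly at small distance, decays at large.
            force = sum(
                q[j] * (pos[j] - pos[i]) / (1.0 + abs(pos[j] - pos[i])) ** 2
                for j in range(n) if j != i
            )
            vel[i] = 0.5 * vel[i] + 0.1 * force   # damped Newtonian update
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
        cand = min(pos, key=f)
        if f(cand) < f(best):
            best = cand
    return best

best = css_minimize(lambda x: x * x, (-10.0, 10.0))
```

The real algorithm distinguishes forces inside and outside a charged sphere's radius; the bounded force above stands in for both regimes to keep the sketch stable.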
 

Optimisation allows you to see profit where there is none.

A good optimiser lets you switch to a live account, spend more money and enter new optimisations faster.

Brute-force optimisation algorithms improve brute-force skills :-) In extreme cases, they promote dieting.



 

Population optimization algorithms: Intelligent Water Drops (IWD) algorithm

In 2007, the Iranian scientist Hamed Shah-Hosseini developed an algorithm modeling the behavior of intelligent water drops. The IWD algorithm features several artificial water drops, which, through their interaction, are able to change their environment in such a way that they find the optimal route along the path of least resistance. The IWD algorithm is a constructive, population-oriented optimization algorithm.
The article considers an interesting algorithm derived from inanimate nature - intelligent water drops (IWD) simulating the process of river bed formation. The ideas of this algorithm made it possible to significantly improve the previous leader of the rating - SDS. As usual, the new leader (modified SDSm) can be found in the attachment.
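The erosion mechanism can be sketched on a tiny routing problem. The Python toy below applies an IWD-style scheme to a 4-city tour: a drop prefers edges carrying less soil, gains velocity on clean edges, and erodes soil as it passes. The soil/velocity formulas and all constants are simplified stand-ins loosely following the IWD scheme, not the published parameterization, and the square of cities is invented for demonstration.

```python
import math, random

def iwd_tour(dist, iters=60):
    """Toy IWD for a tiny TSP: drops probabilistically prefer low-soil edges,
    speed up on them, and erode their soil, so short edges become
    progressively more attractive to later drops."""
    n = len(dist)
    soil = [[1000.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        start = random.randrange(n)
        tour, vel = [start], 100.0
        while len(tour) < n:
            i = tour[-1]
            cand = [j for j in range(n) if j not in tour]
            weights = [1.0 / (0.01 + soil[i][j]) for j in cand]  # less soil -> likelier
            j = random.choices(cand, weights=weights)[0]
            vel += 100.0 / (1.0 + soil[i][j])          # speed up on clean edges
            dsoil = 1.0 / (0.01 + dist[i][j] / vel)    # short travel time erodes more
            soil[i][j] = max(0.0, 0.9 * soil[i][j] - 0.1 * dsoil)
            tour.append(j)
        length = sum(dist[tour[k]][tour[k + 1]] for k in range(n - 1)) + dist[tour[-1]][tour[0]]
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# Four invented cities on a unit square; the optimal tour is the perimeter (length 4).
pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
dist = [[math.hypot(a[0] - b[0], a[1] - b[1]) for b in pts] for a in pts]
best_tour, best_len = iwd_tour(dist)
```

This constructive character, building a solution edge by edge rather than mutating complete solutions, is what separates IWD from the other population algorithms in the series.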
 

Population optimization algorithms: Spiral Dynamics Optimization (SDO) algorithm

Spiral Dynamics Optimization (SDO) is one of the simplest physics-based algorithms. It was proposed by Tamura and Yasuda in 2011 and was inspired by the logarithmic spiral phenomenon in nature. The algorithm is simple and has few control parameters. Moreover, it offers high computation speed, local search capability, diversification at an early stage and intensification at a later stage.
The article presents an optimization algorithm based on the patterns of constructing spiral trajectories in nature, such as mollusk shells - the spiral dynamics optimization (SDO) algorithm. I have thoroughly revised and modified the algorithm proposed by the authors. The article will consider the necessity of these changes.
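The spiral update is compact enough to show directly. In the Python sketch below (a minimal 2-D illustration, not the article's revised version), every point rotates by an angle theta around the current best and contracts toward it by a factor r < 1, i.e. x <- x* + r * R(theta) * (x - x*); the sphere objective and parameter values are arbitrary demonstration choices.

```python
import math, random

def sdo_minimize(f, bounds, n=30, iters=120, r=0.95, theta=math.pi / 4):
    """Spiral Dynamics sketch: every point rotates by theta around the current
    best and contracts toward it by factor r < 1, tracing a logarithmic
    spiral; the best point is updated whenever a spiral pass improves on it."""
    lo, hi = bounds
    pts = [[random.uniform(lo, hi), random.uniform(lo, hi)] for _ in range(n)]
    best = list(min(pts, key=f))
    c, s = math.cos(theta), math.sin(theta)
    for _ in range(iters):
        for p in pts:
            dx, dy = p[0] - best[0], p[1] - best[1]
            p[0] = best[0] + r * (c * dx - s * dy)   # rotate and contract
            p[1] = best[1] + r * (s * dx + c * dy)
        cand = min(pts, key=f)
        if f(cand) < f(best):
            best = list(cand)
    return best

best = sdo_minimize(lambda p: p[0] ** 2 + p[1] ** 2, (-5.0, 5.0))
```

Early on, the wide spirals provide the diversification mentioned above; as distances shrink by r per step, the same update naturally turns into intensification around the best point.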
 

Population optimization algorithms: Differential Evolution (DE) 

1. Introduction
2. Algorithm
3. Test results

Differential evolution (DE) is one of the metaheuristic optimization methods. It differs from other methods in its simplicity and efficiency. DE uses a population of vectors that mutate and cross over to create new solutions. It does not require knowledge of the gradient and is capable of finding global optima.

The DE algorithm was developed in the 1990s by Storn and Price (published in "Differential Evolution - A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces") and has since become one of the most popular optimization methods that use a population of parameter vectors to find the optimal solution.

In this article, we will consider the algorithm that demonstrates the most controversial results of all those discussed previously - the differential evolution (DE) algorithm.
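The mutation and crossover just described are easy to make concrete. Below is a minimal Python sketch of the classic DE/rand/1/bin scheme: mutant v = a + F*(b - c) from three distinct peers, binomial crossover at rate CR with one guaranteed mutant gene, and greedy selection. The sphere objective and the (typical) values F=0.8, CR=0.9 are demonstration choices, not the article's settings.

```python
import random

def de_minimize(f, bounds, dim=3, pop_size=20, F=0.8, CR=0.9, gens=100):
    """Classic DE/rand/1/bin sketch: mutate with a scaled difference of two
    random vectors, cross over gene-by-gene, keep the trial only if it is
    no worse than its target."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([x for k, x in enumerate(pop) if k != i], 3)
            jrand = random.randrange(dim)  # guarantees at least one mutated gene
            trial = [
                min(hi, max(lo, a[j] + F * (b[j] - c[j])))
                if (j == jrand or random.random() < CR) else pop[i][j]
                for j in range(dim)
            ]
            trial_cost = f(trial)
            if trial_cost <= cost[i]:      # greedy selection
                pop[i], cost[i] = trial, trial_cost
    return pop[cost.index(min(cost))]

best = de_minimize(lambda x: sum(v * v for v in x), (-10.0, 10.0))
```

Note that the step size is not a fixed parameter: it is the difference b - c between population members, so the search self-scales as the population contracts.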
 

Population optimization algorithms: Nelder–Mead, or simplex search (NM) method

1. Introduction
2. Algorithm
3. Test results

The Nelder-Mead method, also known as the "simplex method", was published in the article "A Simplex Method for Function Minimization" in The Computer Journal in 1965. The method was accepted by the scientific community and has become widely used in various fields requiring function optimization.

A simplex is a set of n+1 points in n-dimensional space forming a polyhedron, where each point (vertex) is a set of parameter values of the function being optimized. The idea is to change and move the simplex in the parameter space to find the optimal value of the function.

The Nelder-Mead method (Nelder-Mead simplex method) belongs to the class of unconstrained optimization algorithms. It is a deterministic algorithm that does not require function derivatives and can work with functions that have multiple local minima.

The article presents a complete exploration of the Nelder-Mead method, explaining how the simplex (function parameter space) is modified and rearranged at each iteration to achieve an optimal solution, and describes how the method can be improved.
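The reflect/expand/contract/shrink cycle can be condensed into a short deterministic routine. The Python sketch below is a simplified textbook variant with the standard coefficients (reflection 1, expansion 2, contraction 0.5, shrink 0.5) and a single contraction case; the quadratic objective is an invented test case, not anything from the article.

```python
def nelder_mead(f, x0, step=1.0, iters=200, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Compact Nelder-Mead sketch: repeatedly reflect the worst vertex through
    the centroid of the rest, expanding, contracting, or shrinking the whole
    simplex depending on how the reflected point compares."""
    dim = len(x0)
    simplex = [list(x0)]
    for i in range(dim):                 # initial simplex: x0 plus one offset per axis
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[j] for p in simplex[:-1]) / dim for j in range(dim)]
        refl = [centroid[j] + alpha * (centroid[j] - worst[j]) for j in range(dim)]
        if f(refl) < f(best):            # great direction: try expanding further
            exp = [centroid[j] + gamma * (refl[j] - centroid[j]) for j in range(dim)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):   # decent: accept the reflection
            simplex[-1] = refl
        else:                            # poor: contract toward the simplex
            contr = [centroid[j] + rho * (worst[j] - centroid[j]) for j in range(dim)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                        # last resort: shrink everything toward best
                simplex = [best] + [
                    [best[j] + sigma * (p[j] - best[j]) for j in range(dim)]
                    for p in simplex[1:]
                ]
    return min(simplex, key=f)

best = nelder_mead(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, [0.0, 0.0])
```

Because every move is defined purely by function comparisons at vertices, no derivatives are ever needed, which is the property the paragraph above highlights.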
 

Population optimization algorithms: Simulated Annealing (SA) algorithm. Part I

The Simulated Annealing algorithm was developed by Scott Kirkpatrick, Charles Gelatt and Mario Vecchi in 1983. When studying the properties of liquids and solids at high temperatures, it was found that a metal transforms into a liquid state in which its particles are distributed randomly, and that the state with minimum energy is achieved provided the initial temperature is sufficiently high and the cooling time is sufficiently long. If this condition is not fulfilled, the material ends up in a metastable state with non-minimum energy; this is called hardening, which consists of sharp cooling of the material. In this case, the atomic structure has no symmetry (an anisotropic state, i.e. uneven properties of the material inside the crystal lattice).

Thus, the main idea of the algorithm is based on a mathematical analogue of the metal annealing process. During the annealing process, in order to evenly distribute its internal energy, the metal is heated to a high temperature and then slowly cooled, allowing the metal molecules to move and order into more stable states, while internal stresses in the metal are relieved and intercrystalline defects are removed. The term "annealing" is also associated with thermodynamic free energy, which is an attribute of the material and depends on its state.
The Simulated Annealing algorithm is a metaheuristic inspired by the metal annealing process. In the article, we will conduct a thorough analysis of the algorithm and debunk a number of common beliefs and myths surrounding this widely known optimization method. The second part of the article will consider the custom Simulated Isotropic Annealing (SIA) algorithm.
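The mathematical analogue of annealing boils down to one acceptance rule. The Python sketch below shows the textbook single-solution form: always accept improvements, accept a worsening with probability exp(-dE / T), and cool T geometrically so bad moves become rare over time. The 1-D objective, step size, starting temperature and cooling rate are illustrative choices, not the article's tuned values.

```python
import math, random

def simulated_annealing(f, x0, t0=10.0, cooling=0.99, iters=3000):
    """Textbook SA sketch: propose a random neighbor, always accept
    improvements, accept worsenings with probability exp(-dE / T),
    and cool the temperature geometrically."""
    x, fx, t = x0, f(x0), t0
    best, fbest = x, fx
    for _ in range(iters):
        cand = x + random.gauss(0.0, 1.0)   # random neighbor of the current point
        fc = f(cand)
        # Metropolis criterion: dE <= 0 always passes; dE > 0 passes rarely at low T.
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best

best = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0)
```

At high temperature the walk is nearly free (broad exploration); as T decays the same rule degenerates into hill climbing, which is the slow-cooling analogy the paragraphs above describe.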
 

Population optimization algorithms: Simulated Isotropic Annealing (SIA) algorithm. Part II

In the first part, we considered the conventional version of the Simulated Annealing (SA) algorithm. The algorithm is based on three main concepts: applying randomness, accepting worse solutions and gradually reducing the likelihood of accepting them. Applying randomness allows exploring different regions of the search space and avoiding getting stuck in local optima. Accepting worse solutions with some probability allows the algorithm to temporarily "jump" out of local optima and look for better solutions elsewhere in the search space, letting it first explore the search space broadly and then focus on refining the solution.
The first part was devoted to the well-known and popular algorithm - simulated annealing. We have thoroughly considered its pros and cons. The second part of the article is devoted to the radical transformation of the algorithm, which turns it into a new optimization algorithm - Simulated Isotropic Annealing (SIA).