Fractal-Based Algorithm (FBA)
Introduction
Metaheuristic algorithms have proven themselves to be powerful tools for solving complex optimization problems due to their ability to find acceptable solutions in a reasonable time. Over the past decades, many metaheuristic methods have been developed, inspired by various natural and social processes: genetic algorithms, particle swarm optimization, differential evolution, ant colony optimization, and many others, which we have already discussed in previous articles.
In this article, we will consider a new metaheuristic algorithm for solving continuous optimization problems: the Fractal-Based Algorithm (FBA), proposed by Marjan Kaedi in 2017. The approach is based on the geometric properties of fractals and uses the concept of self-similarity to adaptively explore the search space. At its core lies a heuristic that evaluates the potential of different search areas based on the density of high-quality solutions within them.
The key aspect of the proposed method is the iterative partitioning of the search space into subspaces with the identification of the most promising zones, which are then examined more thoroughly. During the algorithm execution, self-similar fractal structures are formed, directed towards the optimal solution, ensuring a balance between global exploration and local refinement of the solutions found. In this article, we will examine in detail the theoretical foundations of the algorithm, the details of its implementation, the configuration of key parameters, and present the results of a comparative analysis with other popular optimization methods on standard test functions.
Implementation of the algorithm
Imagine that you are looking for treasure in a huge field. How would you go about it? You probably would not dig up every centimeter of ground, since that would take far too much time. This is precisely the problem that the fractal algorithm (FBA) solves — it helps find the "treasure" (the optimal solution) in a vast space of possibilities without checking every point.
First, we divide the entire search area into equal squares, like a chessboard. Then we throw "seekers" (random points) across the entire field to get an initial idea of the terrain. Each seeker reports the quality of the spot it has found. The algorithm selects the most successful ones (the best 60% of points) and looks at the squares in which they are concentrated. These squares are marked as "promising zones" (the top 30% of squares).
Now we focus our attention on the promising areas. Each such area is divided into smaller squares, creating a fractal structure. Most new treasure hunters head to these promising areas, and to avoid missing treasure outside the explored zones, the algorithm forces some seekers (5%) to act a little erratically: they wander off in random directions from their positions, exploring unexpected places.
The process repeats itself again and again. With each step, the algorithm determines more accurately where the treasure might be located and sends more seekers there. Gradually, a hierarchical structure of squares of different sizes is created, reminiscent of a fractal, hence the name of the algorithm.

Figure 1. FBA algorithm operation visualization
The basic idea of FBA, shown in the figure above, is to gradually partition the search space into smaller and smaller regions in a fractal manner, focusing computing resources on the most promising regions. This creates a self-similar structure that explores the solution space. Let's move on to writing the pseudocode for the FBA algorithm.
- Divide the entire search space into equal subspaces
- Generate an initial population uniformly throughout the space
- Evaluate the fitness of each point
- Identify promising points: select the P1% of points with the best fitness
- Calculate subspace ranks: count how many promising points fall into each subspace
- Select promising subspaces: take the P2% of subspaces with the highest ranks
- Subdivide promising subspaces: divide the promising areas into smaller subspaces
- Generate a new population: create new points with higher density in promising regions
- Apply mutation: add Gaussian noise to P3% of points (exploration mechanism)
- Integrate populations: merge the current and new populations while preserving the best points
- Continue until a stopping criterion is reached (maximum number of iterations, fitness threshold, etc.)
- Return the best solution found
Now that we have understood the concept of the algorithm, we can move on to writing the code. The C_AO_FBA class is a specialized implementation of an optimization algorithm based on fractal principles and is derived from the general C_AO class.
The class has public methods and parameters responsible for the configuration and core actions of the algorithm. The constructor sets the initial parameters: population size, percentages of promising points and subspaces, the share of points subject to random variation, and the number of intervals each dimension is split into. The SetParams method allows updating parameters on the fly from external sources. The Init method and the subsequent Moving and Revision functions control the stages and iterations of the algorithm.
The class also declares the S_Subspace internal structural representation used to describe the search subspace. Each subspace is characterized by minimum and maximum boundaries for each dimension, its potential, the level in the hierarchy, and the connection with the parent subspace.
The main methods within the class include functionality for:
- Checking whether a point lies inside a given subspace.
- Creating the initial partition of the space and dividing it further.
- Identifying promising points, calculating subspace ranks, and selecting the best subspaces for further division.
- Generating a new population by combining, mutating, and sorting points according to fitness.
Thus, the C_AO_FBA class implements an adaptive, hierarchical, fractal-based algorithm for searching for optimal solutions in complex multidimensional problems, using dynamic space partitioning, evaluation of promising areas, and heuristic operators to improve search efficiency.
//——————————————————————————————————————————————————————————————————————————————
class C_AO_FBA : public C_AO
{
  public: //--------------------------------------------------------------------
  ~C_AO_FBA () { }
  C_AO_FBA ()
  {
    ao_name = "FBA";
    ao_desc = "Fractal-Based Algorithm";
    ao_link = "https://www.mql5.com/en/articles/17458";

    popSize = 50;   // population size
    P1      = 60;   // percentage of promising points
    P2      = 30;   // percentage of promising subspaces
    P3      = 0.8;  // probability of random modification of a coordinate
    m_value = 10;   // number of intervals to split each dimension into

    ArrayResize (params, 5);
    params [0].name = "popSize"; params [0].val = popSize;
    params [1].name = "P1";      params [1].val = P1;
    params [2].name = "P2";      params [2].val = P2;
    params [3].name = "P3";      params [3].val = P3;
    params [4].name = "m_value"; params [4].val = m_value;
  }

  void SetParams ()
  {
    popSize = (int)params [0].val;
    P1      = (int)params [1].val;
    P2      = (int)params [2].val;
    P3      = params [3].val;
    m_value = (int)params [4].val;
  }

  bool Init (const double &rangeMinP  [],  // minimum values
             const double &rangeMaxP  [],  // maximum values
             const double &rangeStepP [],  // step change
             const int     epochsP = 0);   // number of epochs

  void Moving   ();
  void Revision ();

  //----------------------------------------------------------------------------
  int    P1;      // percentage of promising points
  int    P2;      // percentage of promising subspaces
  double P3;      // probability of random modification of a coordinate
  int    m_value; // number of intervals to split each dimension into

  private: //-------------------------------------------------------------------
  // Structure for representing a subspace
  struct S_Subspace
  {
      double min [];        // minimum boundaries of the subspace
      double max [];        // maximum boundaries of the subspace
      double promisingRank; // potential rank (normalized value)
      bool   isPromising;   // flag of potential
      int    parentIndex;   // index of the parent subspace (-1 for root ones)
      int    level;         // level in hierarchy (0 for original space)

      void Init (int coords)
      {
        ArrayResize (min, coords);
        ArrayResize (max, coords);
        promisingRank = 0.0;
        isPromising   = false;
        parentIndex   = -1;
        level         = 0;
      }
  };

  S_Subspace subspaces []; // array of subspaces

  // Auxiliary methods
  bool IsPointInSubspace              (const double &point [], const S_Subspace &subspace);
  void CreateInitialSpacePartitioning ();
  void DivideSubspace                 (int subspaceIndex);
  void IdentifyPromisingPoints        (int &promisingIndices []);
  void CalculateSubspaceRanks         (const int &promisingIndices []);
  void SelectPromisingSubspaces       ();
  void DividePromisingSubspaces       ();
  void GenerateNewPopulation          ();
  void MutatePoints                   ();
  void SortByFitness                  (double &values [], int &indices [], int size, bool ascending = false);
  void QuickSort                      (double &values [], int &indices [], int low, int high, bool ascending);
  int  Partition                      (double &values [], int &indices [], int low, int high, bool ascending);
};
//——————————————————————————————————————————————————————————————————————————————
The Init initialization method sets the initial conditions for the algorithm to operate. It accepts arrays with the minimum and maximum variable bounds and the step sizes for each parameter, as well as the number of epochs. Inside the method, the basic initialization is called first, after which the initial division of the search space is created, that is, the initial structure of subspaces is formed based on the specified ranges and steps. If all operations are successful, the method returns 'true', otherwise 'false'.
The basic Moving method performs a solution search loop through successive iterations. The first iteration is the initialization of the initial population of points uniformly throughout the entire search space: each value of the variable is selected randomly within the range and adjusted to a given step.
Next, during the main loop, several steps occur. First, the most promising points are identified: a small share of the best ones according to the fitness function value. Then the rank of each subspace is computed from the number of promising points that fall inside it. Based on these ranks, the most promising subspaces are selected for further division into smaller areas. These promising areas are then divided, creating finer search regions. Next, a new population of points is formed taking the ranks into account, which concentrates the effort in the most promising areas. Finally, random modification of points (mutation) is performed to increase variety and avoid getting stuck.
//——————————————————————————————————————————————————————————————————————————————
bool C_AO_FBA::Init (const double &rangeMinP  [],  // minimum values
                     const double &rangeMaxP  [],  // maximum values
                     const double &rangeStepP [],  // step change
                     const int     epochsP = 0)    // number of epochs
{
  if (!StandardInit (rangeMinP, rangeMaxP, rangeStepP)) return false;

  //----------------------------------------------------------------------------
  // Create an initial partition of the search space
  CreateInitialSpacePartitioning ();

  return true;
}
//——————————————————————————————————————————————————————————————————————————————

//+----------------------------------------------------------------------------+
//| Basic optimization method                                                  |
//+----------------------------------------------------------------------------+
void C_AO_FBA::Moving ()
{
  // First iteration - initialization of the initial population
  if (!revision)
  {
    // Initialize the initial population uniformly throughout the space
    for (int i = 0; i < popSize; i++)
    {
      for (int c = 0; c < coords; c++)
      {
        a [i].c [c] = u.RNDfromCI (rangeMin [c], rangeMax [c]);
        a [i].c [c] = u.SeInDiSp  (a [i].c [c], rangeMin [c], rangeMax [c], rangeStep [c]);
      }
    }

    revision = true;
    return;
  }

  // Main optimization
  // 1. Identifying promising points (P1% of points with the best function values)
  int promisingIndices [];
  IdentifyPromisingPoints (promisingIndices);

  // 2. Calculation of potential ranks for each subspace
  CalculateSubspaceRanks (promisingIndices);

  // 3. Selecting the P2% most promising subspaces
  SelectPromisingSubspaces ();

  // 4. Dividing promising subspaces into smaller ones
  DividePromisingSubspaces ();

  // 5. Generating new points taking into account potential ranks
  GenerateNewPopulation ();

  // 6. Random modification (mutation)
  MutatePoints ();
}
//——————————————————————————————————————————————————————————————————————————————
The Revision method is responsible for updating the current best solution found during the algorithm execution. It iterates over all elements of the current population and compares the objective function value of each element with the current best value. If some element has a better result, then the variable storing the best result is updated, and the array of variables (coordinates) associated with it is copied to record this optimal solution. Due to this, the method ensures continuous tracking and storage of the best result found at the given moment.
//+----------------------------------------------------------------------------+
//| Update the best solution                                                   |
//+----------------------------------------------------------------------------+
void C_AO_FBA::Revision ()
{
  // Search for the best solution
  for (int i = 0; i < popSize; i++)
  {
    // Update the best solution
    if (a [i].f > fB)
    {
      fB = a [i].f;
      ArrayCopy (cB, a [i].c, 0, 0, WHOLE_ARRAY);
    }
  }
}
//——————————————————————————————————————————————————————————————————————————————
This method creates the initial subspace structure used to localize the search for optimal solutions. It determines the total number of subspaces based on a given degree of division and dimension. In case of very high dimensionality, it is limited to a pre-set maximum to avoid excessive resource consumption.
Then, an array of subspaces is initialized, each of which is assigned initial level and boundary parameters for each coordinate. Depending on the dimensions of the space, a suitable division method is selected:
- For a one-dimensional space, it is divided into uniform intervals, each created subspace occupies a certain section of the range.
- For two-dimensional space, it is divided along two axes, forming a grid of rectangular areas.
- In the case of higher dimensions, an iterative approach is used in which, much like an odometer, combinations of indices are generated for each coordinate, and for each created region the boundaries are set based on the corresponding intervals.
The calculation of boundaries occurs by dividing the ranges for each dimension into equal parts, and for each subspace, minimum and maximum boundaries are set in accordance with the current indices. The iteration continues until all required subspaces have been created, or until the limit on their number is reached. As a result, a structure is formed that represents the initial partitioning of the search space, ready for further refinement and the search for solutions.
//+----------------------------------------------------------------------------+
//| Create the initial partition of the search space                           |
//+----------------------------------------------------------------------------+
void C_AO_FBA::CreateInitialSpacePartitioning ()
{
  // Create an initial partition of space
  int totalSubspaces = (int)MathPow (m_value, coords);

  // For very large dimensions, limit the number of subspaces
  if (totalSubspaces > 10000) totalSubspaces = 10000;

  ArrayResize (subspaces, totalSubspaces);

  // Initialize all subspaces
  for (int i = 0; i < totalSubspaces; i++)
  {
    subspaces [i].Init (coords);
    subspaces [i].level = 0; // Initial level
  }

  // Divide the initial space into equal subspaces
  int index = 0;

  // Select the division method depending on the dimensionality of the space
  if (coords == 1)
  {
    // One-dimensional case
    double intervalSize = (rangeMax [0] - rangeMin [0]) / m_value;

    for (int i = 0; i < m_value && index < totalSubspaces; i++)
    {
      subspaces [index].min [0] = rangeMin [0] + i * intervalSize;
      subspaces [index].max [0] = rangeMin [0] + (i + 1) * intervalSize;
      index++;
    }
  }
  else if (coords == 2)
  {
    // Two-dimensional case
    double intervalSize0 = (rangeMax [0] - rangeMin [0]) / m_value;
    double intervalSize1 = (rangeMax [1] - rangeMin [1]) / m_value;

    for (int i = 0; i < m_value && index < totalSubspaces; i++)
    {
      for (int j = 0; j < m_value && index < totalSubspaces; j++)
      {
        subspaces [index].min [0] = rangeMin [0] + i * intervalSize0;
        subspaces [index].max [0] = rangeMin [0] + (i + 1) * intervalSize0;
        subspaces [index].min [1] = rangeMin [1] + j * intervalSize1;
        subspaces [index].max [1] = rangeMin [1] + (j + 1) * intervalSize1;
        index++;
      }
    }
  }
  else
  {
    // Multidimensional case - use an iterative approach
    int indices [];
    ArrayResize (indices, coords);
    for (int i = 0; i < coords; i++) indices [i] = 0;

    while (index < totalSubspaces)
    {
      // Calculate the boundaries of the current subspace
      for (int c = 0; c < coords; c++)
      {
        double intervalSize = (rangeMax [c] - rangeMin [c]) / m_value;
        subspaces [index].min [c] = rangeMin [c] + indices [c] * intervalSize;
        subspaces [index].max [c] = rangeMin [c] + (indices [c] + 1) * intervalSize;
      }

      // Move on to the next subspace
      int c = coords - 1;
      while (c >= 0)
      {
        indices [c]++;
        if (indices [c] < m_value) break;
        indices [c] = 0;
        c--;
      }

      // If the full loop completed, exit
      if (c < 0) break;

      index++;
    }
  }
}
//——————————————————————————————————————————————————————————————————————————————
The following method determines whether a given point belongs to a particular subspace. It sequentially checks each coordinate of the point against the boundaries of the subspace. If at least one coordinate falls outside the allowed half-open interval (less than the minimum, or greater than or equal to the maximum), the method returns 'false': the point is not contained in the subspace. If all coordinates satisfy the conditions, the method returns 'true'. Note that with this half-open convention, a point lying exactly on the global upper bound of a coordinate falls outside every subspace of the grid.
//+----------------------------------------------------------------------------+
//| Determine whether a point belongs to a subspace                            |
//+----------------------------------------------------------------------------+
bool C_AO_FBA::IsPointInSubspace (const double &point [], const S_Subspace &subspace)
{
  // Check if the point is in the specified subspace
  for (int c = 0; c < coords; c++)
  {
    if (point [c] < subspace.min [c] || point [c] >= subspace.max [c])
    {
      return false;
    }
  }
  return true;
}
//——————————————————————————————————————————————————————————————————————————————
The method for identifying promising points is designed to select the most "promising" solutions from the current population. It starts by creating temporary arrays to store the fitness function values of each element and their indices. Then, these arrays are filled with the values and indices of the corresponding elements from the population.
Next, the elements are sorted by the value of the fitness function in descending order. After sorting, the P1% best solutions are selected, and from these a list of indices representing promising points is formed. The number of selected points is guaranteed to be at least one and does not exceed the total population size. As a result, the function returns an array of indices of promising solutions.
//+----------------------------------------------------------------------------+
//| Identify promising points                                                  |
//+----------------------------------------------------------------------------+
void C_AO_FBA::IdentifyPromisingPoints (int &promisingIndices [])
{
  // Select P1% points with the best function values
  // Create arrays for sorting
  double values  [];
  int    indices [];
  ArrayResize (values,  popSize);
  ArrayResize (indices, popSize);

  // Fill in the arrays
  for (int i = 0; i < popSize; i++)
  {
    values  [i] = a [i].f;
    indices [i] = i;
  }

  // Sort in descending order (for the maximization problem)
  SortByFitness (values, indices, popSize);

  // Select P1% best points
  int numPromisingPoints = (int)MathRound (popSize * P1 / 100.0);
  numPromisingPoints = MathMax (1, MathMin (numPromisingPoints, popSize));

  ArrayResize (promisingIndices, numPromisingPoints);
  for (int i = 0; i < numPromisingPoints; i++)
  {
    promisingIndices [i] = indices [i];
  }
}
//——————————————————————————————————————————————————————————————————————————————
Further, the method is designed to evaluate and rank subspaces according to their potential. It starts by resetting the current rating values for all subspaces. It then counts how many promising points fall into each subspace by comparing the coordinates of each promising point to the boundaries of each subspace and incrementing the corresponding counter when there is a match.
It is important that each point is counted in only one subspace. After counting, the rank of each subspace is normalized by dividing by the total number of promising points: promisingRank = (promising points in the subspace) / (total promising points). The resulting value lies between 0 and 1 and represents the share of all promising points that fall inside the subspace.
//+----------------------------------------------------------------------------+
//| Calculate potential ranks of subspaces                                     |
//+----------------------------------------------------------------------------+
void C_AO_FBA::CalculateSubspaceRanks (const int &promisingIndices [])
{
  // Reset the ranks of subspaces
  for (int i = 0; i < ArraySize (subspaces); i++)
  {
    subspaces [i].promisingRank = 0.0;
  }

  // Count promising points in each subspace
  for (int i = 0; i < ArraySize (promisingIndices); i++)
  {
    int pointIndex = promisingIndices [i];

    for (int j = 0; j < ArraySize (subspaces); j++)
    {
      if (IsPointInSubspace (a [pointIndex].c, subspaces [j]))
      {
        subspaces [j].promisingRank++;
        break; // A point can only be in one subspace
      }
    }
  }

  // Normalize the potential ranks according to the article
  // PromisingRank = Number of promising points in s / Total promising points
  int totalPromisingPoints = ArraySize (promisingIndices);
  if (totalPromisingPoints > 0)
  {
    for (int i = 0; i < ArraySize (subspaces); i++)
    {
      subspaces [i].promisingRank = subspaces [i].promisingRank / totalPromisingPoints;
    }
  }
}
//——————————————————————————————————————————————————————————————————————————————
The SelectPromisingSubspaces method determines which subspaces should be considered promising based on their ranks. Temporary "ranks" and "indices" arrays are created to store the subspace ranks and their corresponding indices, and are filled from the subspace array. For each subspace, the isPromising flag is reset to 'false', so that results of previous iterations are not carried over. The "ranks" array is sorted in descending order, and the "indices" array is rearranged in parallel to maintain the correspondence between indices and ranks.
Thus, after sorting, "indices" contains the indices of subspaces in descending order of their ranks. The number of subspaces that will be considered promising is calculated from the P2 parameter (percentage of best subspaces) and is clamped to the range from 1 to the total number of subspaces. We then iterate through the first numPromisingSubspaces elements of the "indices" array and set the isPromising flag to 'true' for each corresponding subspace, marking it as promising.
There is also a check for the index being within the allowed range to avoid errors when accessing the "subspaces" array. As a result of executing the method, the isPromising flag is set to 'true' for P2% subspaces with the highest ranks.
//+----------------------------------------------------------------------------+
//| Select promising subspaces                                                 |
//+----------------------------------------------------------------------------+
void C_AO_FBA::SelectPromisingSubspaces ()
{
  // Select P2% subspaces with the highest ranks as promising ones
  // Create arrays for sorting
  double ranks   [];
  int    indices [];
  int numSubspaces = ArraySize (subspaces);
  ArrayResize (ranks,   numSubspaces);
  ArrayResize (indices, numSubspaces);

  // Fill in the arrays
  for (int i = 0; i < numSubspaces; i++)
  {
    ranks   [i] = subspaces [i].promisingRank;
    indices [i] = i;

    // Reset the potential flag
    subspaces [i].isPromising = false;
  }

  // Sort by descending ranks
  SortByFitness (ranks, indices, numSubspaces);

  // Select P2% most promising subspaces
  int numPromisingSubspaces = (int)MathRound (numSubspaces * P2 / 100.0);
  numPromisingSubspaces = MathMax (1, MathMin (numPromisingSubspaces, numSubspaces));

  // Mark promising subspaces
  for (int i = 0; i < numPromisingSubspaces && i < ArraySize (indices); i++)
  {
    // Protection against exceeding the array size
    if (indices [i] >= 0 && indices [i] < ArraySize (subspaces))
    {
      subspaces [indices [i]].isPromising = true;
    }
  }
}
//——————————————————————————————————————————————————————————————————————————————
The DividePromisingSubspaces method divides promising subspaces into smaller parts. It first identifies all subspaces marked as promising by checking the isPromising flag. The indices of these subspaces are collected into the promisingIndices temporary array. Then, for each subspace whose index is in the promisingIndices array, the DivideSubspace function is called, passing in the index of that subspace.
Thus, the method iterates through all the found promising subspaces and sequentially applies division to each of them using the DivideSubspace function.
//+----------------------------------------------------------------------------+
//| Divide promising subspaces                                                 |
//+----------------------------------------------------------------------------+
void C_AO_FBA::DividePromisingSubspaces ()
{
  // Collect indices of promising subspaces
  int promisingIndices [];
  int numPromising = 0;

  for (int i = 0; i < ArraySize (subspaces); i++)
  {
    if (subspaces [i].isPromising)
    {
      numPromising++;
      ArrayResize (promisingIndices, numPromising);
      promisingIndices [numPromising - 1] = i;
    }
  }

  // Divide each promising subspace
  for (int i = 0; i < numPromising; i++)
  {
    DivideSubspace (promisingIndices [i]);
  }
}
//——————————————————————————————————————————————————————————————————————————————
The DivideSubspace method divides a specific subspace into smaller parts. First, the parent subspace is selected based on the passed index, and we make sure that the total number of subspaces does not exceed the set limit (10,000). The total number of new subspaces produced by dividing each dimension into m_value parts is then determined: m_value raised to the power of the number of dimensions.
The array storing all subspaces is increased by the number of new subspaces created. For each new subspace, initial parameters are specified, such as the level, the index of the parent subspace, and the boundaries for each coordinate, which are calculated based on the boundaries of the parent subspace and division into equal parts.
For each dimension, an interval size is determined, and the boundaries of the current subspace are set accordingly based on the current indices. After each new subspace is created, the indices in the dimensions are incremented to move to the next partition according to the multi-index system. When the index in a dimension reaches m_value, it is reset to zero and the index of the next dimension is incremented, so that all combinations are tried.
The process continues until all new subspaces are created or until the limit is reached. When all combinations are exhaustively searched using counters, termination occurs naturally. As a result of this method, the parent subspace is split into many smaller ones, each covering a corresponding portion of the original range across all dimensions.
//+----------------------------------------------------------------------------+
//| Partition a specific subspace                                              |
//+----------------------------------------------------------------------------+
void C_AO_FBA::DivideSubspace (int subspaceIndex)
{
  // Divide the specified subspace into m_value^coords subspaces
  S_Subspace parent = subspaces [subspaceIndex];

  // Limit the maximum number of subspaces
  if (ArraySize (subspaces) > 10000) return;

  // For each dimension, divide by m_value parts
  int totalNewSubspaces = (int)MathPow (m_value, coords);
  int currentSize = ArraySize (subspaces);
  ArrayResize (subspaces, currentSize + totalNewSubspaces);

  // Create new subspaces
  int newIndex = currentSize;
  int indices [];
  ArrayResize (indices, coords);
  for (int i = 0; i < coords; i++) indices [i] = 0;

  for (int idx = 0; idx < totalNewSubspaces && newIndex < ArraySize (subspaces); idx++)
  {
    subspaces [newIndex].Init (coords);
    subspaces [newIndex].level       = parent.level + 1;
    subspaces [newIndex].parentIndex = subspaceIndex;

    // Calculate the boundaries of the current subspace
    for (int c = 0; c < coords; c++)
    {
      double intervalSize = (parent.max [c] - parent.min [c]) / m_value;
      subspaces [newIndex].min [c] = parent.min [c] + indices [c] * intervalSize;
      subspaces [newIndex].max [c] = parent.min [c] + (indices [c] + 1) * intervalSize;
    }

    // Move on to the next subspace
    int c = coords - 1;
    while (c >= 0)
    {
      indices [c]++;
      if (indices [c] < m_value) break;
      indices [c] = 0;
      c--;
    }

    // If the full loop completed, exit
    if (c < 0) break;

    newIndex++;
  }
}
//——————————————————————————————————————————————————————————————————————————————
The GenerateNewPopulation method creates a new population of points, distributing them across subspaces according to their promisingRank value.
First, the total promisingRank sum over all subspaces is calculated. This value determines the proportional number of points to generate in each subspace. If the sum of the ranks is close to zero (less than 0.0001), which can happen when all subspaces have rank zero, then every subspace is assigned an equal rank (1.0) to ensure a uniform distribution of points. Then, for each subspace, the number of points to generate in it is calculated, proportional to its promisingRank relative to the total sum of ranks.
We use the equation (subspaces[i].promisingRank / totalRank) * popSize, where popSize is the required population size. The result is rounded to the nearest integer and capped so that it does not exceed popSize. Within each subspace, the computed number of points is generated; for each point, 'coords' coordinates are drawn from a uniform distribution within the subspace boundaries. For each dimension, the coordinate value is constrained to the specified minimum and maximum and snapped to a grid with the step rangeStep[c].
If, after distributing the points across the subspaces, the total number of generated points is less than popSize, the remaining points are generated uniformly across the entire search space, using rangeMin[c] and rangeMax[c] as bounds and snapping to the grid with the step rangeStep[c]. This ensures that the population always has exactly popSize points.
//+----------------------------------------------------------------------------+
//| Generate a new population                                                  |
//+----------------------------------------------------------------------------+
void C_AO_FBA::GenerateNewPopulation ()
{
  // Calculate the sum of the ranks of all subspaces
  double totalRank = 0.0;
  for (int i = 0; i < ArraySize (subspaces); i++)
  {
    totalRank += subspaces [i].promisingRank;
  }

  // If all ranks are 0, set the uniform distribution
  if (totalRank <= 0.0001) // Check for approximate equality to zero
  {
    for (int i = 0; i < ArraySize (subspaces); i++)
    {
      subspaces [i].promisingRank = 1.0;
    }
    totalRank = ArraySize (subspaces);
  }

  int points = 0;

  for (int i = 0; i < ArraySize (subspaces) && points < popSize; i++)
  {
    // Calculate the number of points for this subspace according to the equation
    int pointsToGenerate = (int)MathRound ((subspaces [i].promisingRank / totalRank) * popSize);

    // Limitation against exceeding the limits
    pointsToGenerate = MathMin (pointsToGenerate, popSize - points);

    // Generate points in this subspace
    for (int j = 0; j < pointsToGenerate; j++)
    {
      // Create a new point uniformly within the subspace
      for (int c = 0; c < coords; c++)
      {
        a [points].c [c] = u.RNDfromCI (subspaces [i].min [c], subspaces [i].max [c]);
        a [points].c [c] = u.SeInDiSp  (a [points].c [c], rangeMin [c], rangeMax [c], rangeStep [c]);
      }

      points++;
      if (points >= popSize) break;
    }
  }

  // If not all points were generated, fill the remaining ones uniformly throughout the space
  while (points < popSize)
  {
    for (int c = 0; c < coords; c++)
    {
      a [points].c [c] = u.RNDfromCI (rangeMin [c], rangeMax [c]);
      a [points].c [c] = u.SeInDiSp  (a [points].c [c], rangeMin [c], rangeMax [c], rangeStep [c]);
    }
    points++;
  }
}
//——————————————————————————————————————————————————————————————————————————————
The MutatePoints method of the C_AO_FBA class mutates points in the population; it increases the variability of solutions and helps the algorithm avoid getting stuck in local traps.
The commented-out block follows the original FBA description: Gaussian noise is added to the coordinates of randomly selected points. With such a mutation, however, the algorithm performs poorly, practically indistinguishable from random search, so this block is disabled in favor of a better implementation that I found.
The working part of the method iterates over the entire population. For each agent and each coordinate, a check against the mutation probability is performed. If it passes, the coordinate is redrawn from a power-law distribution centered on the best solution found so far, with an exponent that controls the degree of variation. The resulting value is then refined and limited by a function that keeps it within the specified ranges and respects the sampling steps.
As a result, the method provides random mutations of individual points of the population, promoting the diversity of solutions and improving the ability to find a global optimum.
```mql5
//+----------------------------------------------------------------------------+
//| Mutation of points in the population                                        |
//+----------------------------------------------------------------------------+
void C_AO_FBA::MutatePoints ()
{
  // Add Gaussian noise to P3% of randomly selected points from the new population
  /*
  int numToMutate = (int)MathRound (popSize * P3 / 100.0);
  numToMutate = MathMax (1, MathMin (numToMutate, popSize));

  for (int i = 0; i < numToMutate; i++)
  {
    int index = u.RNDminusOne (popSize);

    // Add noise to each coordinate
    for (int c = 0; c < coords; c++)
    {
      // Standard deviation of 10% of the range
      //double stdDev = (rangeMax [c] - rangeMin [c]) * 0.1;

      // Gaussian noise using the Box-Muller method
      //double noise = NormalRandom (0.0, stdDev);

      // Add noise and limit the value
      a [index].c [c] += noise;
      a [index].c [c] = u.SeInDiSp (a [index].c [c], rangeMin [c], rangeMax [c], rangeStep [c]);
    }
  }
  */

  for (int p = 0; p < popSize; p++)
  {
    for (int c = 0; c < coords; c++)
    {
      if (u.RNDprobab () < P3)
      {
        a [p].c [c] = u.PowerDistribution (cB [c], rangeMin [c], rangeMax [c], 20);
        a [p].c [c] = u.SeInDiSp (a [p].c [c], rangeMin [c], rangeMax [c], rangeStep [c]);
      }
    }
  }
}
//——————————————————————————————————————————————————————————————————————————————
```
The SortByFitness method sorts two arrays in tandem: one containing the fitness function values and one containing the corresponding element indices. Its parameters are the array of values, the array of indices, the array size, and an optional flag specifying the sort order.
If the array size is greater than one, the method calls the internal QuickSort function, which performs a quick sort of the elements. In this case, both arrays are sorted simultaneously so that the correspondence between values and indices is maintained. As a result, after executing the method, the elements will be ordered by the value of the fitness function according to the selected order.
```mql5
//+----------------------------------------------------------------------------+
//| Sort by fitness function value                                              |
//+----------------------------------------------------------------------------+
void C_AO_FBA::SortByFitness (double &values [], int &indices [], int size, bool ascending = false)
{
  if (size > 1) QuickSort (values, indices, 0, size - 1, ascending);
}
//——————————————————————————————————————————————————————————————————————————————
```
The QuickSort method implements a quick sort algorithm for two related arrays: "values" and "indices". It recursively divides and sorts arrays in such a way that the correspondence between a value and its original index is preserved. Parameters: arrays to sort, boundaries of the sorted area (low and high), and the "ascending" flag that determines the sorting order.
```mql5
//+----------------------------------------------------------------------------+
//| Quick sort algorithm                                                        |
//+----------------------------------------------------------------------------+
void C_AO_FBA::QuickSort (double &values [], int &indices [], int low, int high, bool ascending)
{
  if (low < high)
  {
    int pi = Partition (values, indices, low, high, ascending);

    QuickSort (values, indices, low, pi - 1, ascending);
    QuickSort (values, indices, pi + 1, high, ascending);
  }
}
//——————————————————————————————————————————————————————————————————————————————
```
The Partition method is a key part of the quick sort algorithm. Its task is to select a pivot element and redistribute the elements of the "indices" array in such a way that all elements "smaller" than the pivot (or "larger", depending on the sorting order) are to the left of it, and "larger" ("smaller") ones are to the right. "Smaller" and "larger" are defined relative to the values in the "values" array pointed to by the indices in "indices".
```mql5
//+----------------------------------------------------------------------------+
//| Partition function for QuickSort                                            |
//+----------------------------------------------------------------------------+
int C_AO_FBA::Partition (double &values [], int &indices [], int low, int high, bool ascending)
{
  double pivot = values [indices [high]];
  int i = low - 1;

  for (int j = low; j < high; j++)
  {
    bool condition = ascending ? (values [indices [j]] < pivot) : (values [indices [j]] > pivot);

    if (condition)
    {
      i++;
      // Exchange values
      int temp    = indices [i];
      indices [i] = indices [j];
      indices [j] = temp;
    }
  }

  // Exchange values
  int temp        = indices [i + 1];
  indices [i + 1] = indices [high];
  indices [high]  = temp;

  return i + 1;
}
//——————————————————————————————————————————————————————————————————————————————
```
Test results
The algorithm performs well overall.

FBA|Fractal-Based Algorithm|50.0|60.0|30.0|0.8|10.0|
=============================
5 Hilly's; Func runs: 10000; result: 0.7900044352090774
25 Hilly's; Func runs: 10000; result: 0.6513385092404853
500 Hilly's; Func runs: 10000; result: 0.2896548031738138
=============================
5 Forest's; Func runs: 10000; result: 0.8715768614282637
25 Forest's; Func runs: 10000; result: 0.5682316842631675
500 Forest's; Func runs: 10000; result: 0.18876552479611478
=============================
5 Megacity's; Func runs: 10000; result: 0.6107692307692306
25 Megacity's; Func runs: 10000; result: 0.46061538461538465
500 Megacity's; Func runs: 10000; result: 0.12398461538461655
=============================
All score: 4.55494 (50.61%)
The visualization shows how the algorithm breaks down the search space into smaller subspaces. The results also show high variability on low-dimensional functions.

FBA on the Hilly test function

FBA on the Forest test function

FBA on the Megacity test function
Based on the test results, the FBA algorithm ranks 29th in our ranking table.
| # | AO | Description | Hilly: 10p (5F) | Hilly: 50p (25F) | Hilly: 1000p (500F) | Hilly Final | Forest: 10p (5F) | Forest: 50p (25F) | Forest: 1000p (500F) | Forest Final | Megacity (discrete): 10p (5F) | Megacity: 50p (25F) | Megacity: 1000p (500F) | Megacity Final | Final Result | % of MAX |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | ANS | across neighbourhood search | 0.94948 | 0.84776 | 0.43857 | 2.23581 | 1.00000 | 0.92334 | 0.39988 | 2.32323 | 0.70923 | 0.63477 | 0.23091 | 1.57491 | 6.134 | 68.15 |
| 2 | CLA | code lock algorithm (joo) | 0.95345 | 0.87107 | 0.37590 | 2.20042 | 0.98942 | 0.91709 | 0.31642 | 2.22294 | 0.79692 | 0.69385 | 0.19303 | 1.68380 | 6.107 | 67.86 |
| 3 | AMOm | animal migration optimization M | 0.90358 | 0.84317 | 0.46284 | 2.20959 | 0.99001 | 0.92436 | 0.46598 | 2.38034 | 0.56769 | 0.59132 | 0.23773 | 1.39675 | 5.987 | 66.52 |
| 4 | (P+O)ES | (P+O) evolution strategies | 0.92256 | 0.88101 | 0.40021 | 2.20379 | 0.97750 | 0.87490 | 0.31945 | 2.17185 | 0.67385 | 0.62985 | 0.18634 | 1.49003 | 5.866 | 65.17 |
| 5 | CTA | comet tail algorithm (joo) | 0.95346 | 0.86319 | 0.27770 | 2.09435 | 0.99794 | 0.85740 | 0.33949 | 2.19484 | 0.88769 | 0.56431 | 0.10512 | 1.55712 | 5.846 | 64.96 |
| 6 | TETA | time evolution travel algorithm (joo) | 0.91362 | 0.82349 | 0.31990 | 2.05701 | 0.97096 | 0.89532 | 0.29324 | 2.15952 | 0.73462 | 0.68569 | 0.16021 | 1.58052 | 5.797 | 64.41 |
| 7 | SDSm | stochastic diffusion search M | 0.93066 | 0.85445 | 0.39476 | 2.17988 | 0.99983 | 0.89244 | 0.19619 | 2.08846 | 0.72333 | 0.61100 | 0.10670 | 1.44103 | 5.709 | 63.44 |
| 8 | BOAm | billiards optimization algorithm M | 0.95757 | 0.82599 | 0.25235 | 2.03590 | 1.00000 | 0.90036 | 0.30502 | 2.20538 | 0.73538 | 0.52523 | 0.09563 | 1.35625 | 5.598 | 62.19 |
| 9 | AAm | archery algorithm M | 0.91744 | 0.70876 | 0.42160 | 2.04780 | 0.92527 | 0.75802 | 0.35328 | 2.03657 | 0.67385 | 0.55200 | 0.23738 | 1.46323 | 5.548 | 61.64 |
| 10 | ESG | evolution of social groups (joo) | 0.99906 | 0.79654 | 0.35056 | 2.14616 | 1.00000 | 0.82863 | 0.13102 | 1.95965 | 0.82333 | 0.55300 | 0.04725 | 1.42358 | 5.529 | 61.44 |
| 11 | SIA | simulated isotropic annealing (joo) | 0.95784 | 0.84264 | 0.41465 | 2.21513 | 0.98239 | 0.79586 | 0.20507 | 1.98332 | 0.68667 | 0.49300 | 0.09053 | 1.27020 | 5.469 | 60.76 |
| 12 | ACS | artificial cooperative search | 0.75547 | 0.74744 | 0.30407 | 1.80698 | 1.00000 | 0.88861 | 0.22413 | 2.11274 | 0.69077 | 0.48185 | 0.13322 | 1.30583 | 5.226 | 58.06 |
| 13 | DA | dialectical algorithm | 0.86183 | 0.70033 | 0.33724 | 1.89940 | 0.98163 | 0.72772 | 0.28718 | 1.99653 | 0.70308 | 0.45292 | 0.16367 | 1.31967 | 5.216 | 57.95 |
| 14 | BHAm | black hole algorithm M | 0.75236 | 0.76675 | 0.34583 | 1.86493 | 0.93593 | 0.80152 | 0.27177 | 2.00923 | 0.65077 | 0.51646 | 0.15472 | 1.32195 | 5.196 | 57.73 |
| 15 | ASO | anarchy society optimization | 0.84872 | 0.74646 | 0.31465 | 1.90983 | 0.96148 | 0.79150 | 0.23803 | 1.99101 | 0.57077 | 0.54062 | 0.16614 | 1.27752 | 5.178 | 57.54 |
| 16 | RFO | royal flush optimization (joo) | 0.83361 | 0.73742 | 0.34629 | 1.91733 | 0.89424 | 0.73824 | 0.24098 | 1.87346 | 0.63154 | 0.50292 | 0.16421 | 1.29867 | 5.089 | 56.55 |
| 17 | AOSm | atomic orbital search M | 0.80232 | 0.70449 | 0.31021 | 1.81702 | 0.85660 | 0.69451 | 0.21996 | 1.77107 | 0.74615 | 0.52862 | 0.14358 | 1.41835 | 5.006 | 55.63 |
| 18 | TSEA | turtle shell evolution algorithm (joo) | 0.96798 | 0.64480 | 0.29672 | 1.90949 | 0.99449 | 0.61981 | 0.22708 | 1.84139 | 0.69077 | 0.42646 | 0.13598 | 1.25322 | 5.004 | 55.60 |
| 19 | DE | differential evolution | 0.95044 | 0.61674 | 0.30308 | 1.87026 | 0.95317 | 0.78896 | 0.16652 | 1.90865 | 0.78667 | 0.36033 | 0.02953 | 1.17653 | 4.955 | 55.06 |
| 20 | SRA | successful restaurateur algorithm (joo) | 0.96883 | 0.63455 | 0.29217 | 1.89555 | 0.94637 | 0.55506 | 0.19124 | 1.69267 | 0.74923 | 0.44031 | 0.12526 | 1.31480 | 4.903 | 54.48 |
| 21 | CRO | chemical reaction optimization | 0.94629 | 0.66112 | 0.29853 | 1.90593 | 0.87906 | 0.58422 | 0.21146 | 1.67473 | 0.75846 | 0.42646 | 0.12686 | 1.31178 | 4.892 | 54.36 |
| 22 | BIO | blood inheritance optimization (joo) | 0.81568 | 0.65336 | 0.30877 | 1.77781 | 0.89937 | 0.65319 | 0.21760 | 1.77016 | 0.67846 | 0.47631 | 0.13902 | 1.29378 | 4.842 | 53.80 |
| 23 | BSA | bird swarm algorithm | 0.89306 | 0.64900 | 0.26250 | 1.80455 | 0.92420 | 0.71121 | 0.24939 | 1.88479 | 0.69385 | 0.32615 | 0.10012 | 1.12012 | 4.809 | 53.44 |
| 24 | HS | harmony search | 0.86509 | 0.68782 | 0.32527 | 1.87818 | 0.99999 | 0.68002 | 0.09590 | 1.77592 | 0.62000 | 0.42267 | 0.05458 | 1.09725 | 4.751 | 52.79 |
| 25 | SSG | saplings sowing and growing | 0.77839 | 0.64925 | 0.39543 | 1.82308 | 0.85973 | 0.62467 | 0.17429 | 1.65869 | 0.64667 | 0.44133 | 0.10598 | 1.19398 | 4.676 | 51.95 |
| 26 | BCOm | bacterial chemotaxis optimization M | 0.75953 | 0.62268 | 0.31483 | 1.69704 | 0.89378 | 0.61339 | 0.22542 | 1.73259 | 0.65385 | 0.42092 | 0.14435 | 1.21912 | 4.649 | 51.65 |
| 27 | ABO | african buffalo optimization | 0.83337 | 0.62247 | 0.29964 | 1.75548 | 0.92170 | 0.58618 | 0.19723 | 1.70511 | 0.61000 | 0.43154 | 0.13225 | 1.17378 | 4.634 | 51.49 |
| 28 | (PO)ES | (PO) evolution strategies | 0.79025 | 0.62647 | 0.42935 | 1.84606 | 0.87616 | 0.60943 | 0.19591 | 1.68151 | 0.59000 | 0.37933 | 0.11322 | 1.08255 | 4.610 | 51.22 |
| 29 | FBA | Fractal-Based Algorithm | 0.79000 | 0.65134 | 0.28965 | 1.73099 | 0.87158 | 0.56823 | 0.18877 | 1.62858 | 0.61077 | 0.46062 | 0.12398 | 1.19537 | 4.555 | 50.61 |
| 30 | TSm | tabu search M | 0.87795 | 0.61431 | 0.29104 | 1.78330 | 0.92885 | 0.51844 | 0.19054 | 1.63783 | 0.61077 | 0.38215 | 0.12157 | 1.11449 | 4.536 | 50.40 |
| 31 | BSO | brain storm optimization | 0.93736 | 0.57616 | 0.29688 | 1.81041 | 0.93131 | 0.55866 | 0.23537 | 1.72534 | 0.55231 | 0.29077 | 0.11914 | 0.96222 | 4.498 | 49.98 |
| 32 | WOAm | whale optimization algorithm M | 0.84521 | 0.56298 | 0.26263 | 1.67081 | 0.93100 | 0.52278 | 0.16365 | 1.61743 | 0.66308 | 0.41138 | 0.11357 | 1.18803 | 4.476 | 49.74 |
| 33 | AEFA | artificial electric field algorithm | 0.87700 | 0.61753 | 0.25235 | 1.74688 | 0.92729 | 0.72698 | 0.18064 | 1.83490 | 0.66615 | 0.11631 | 0.09508 | 0.87754 | 4.459 | 49.55 |
| 34 | AEO | artificial ecosystem-based optimization algorithm | 0.91380 | 0.46713 | 0.26470 | 1.64563 | 0.90223 | 0.43705 | 0.21400 | 1.55327 | 0.66154 | 0.30800 | 0.28563 | 1.25517 | 4.454 | 49.49 |
| 35 | ACOm | ant colony optimization M | 0.88190 | 0.66127 | 0.30377 | 1.84693 | 0.85873 | 0.58680 | 0.15051 | 1.59604 | 0.59667 | 0.37333 | 0.02472 | 0.99472 | 4.438 | 49.31 |
| 36 | BFO-GA | bacterial foraging optimization - ga | 0.89150 | 0.55111 | 0.31529 | 1.75790 | 0.96982 | 0.39612 | 0.06305 | 1.42899 | 0.72667 | 0.27500 | 0.03525 | 1.03692 | 4.224 | 46.93 |
| 37 | SOA | simple optimization algorithm | 0.91520 | 0.46976 | 0.27089 | 1.65585 | 0.89675 | 0.37401 | 0.16984 | 1.44060 | 0.69538 | 0.28031 | 0.10852 | 1.08422 | 4.181 | 46.45 |
| 38 | ABHA | artificial bee hive algorithm | 0.84131 | 0.54227 | 0.26304 | 1.64663 | 0.87858 | 0.47779 | 0.17181 | 1.52818 | 0.50923 | 0.33877 | 0.10397 | 0.95197 | 4.127 | 45.85 |
| 39 | ACMO | atmospheric cloud model optimization | 0.90321 | 0.48546 | 0.30403 | 1.69270 | 0.80268 | 0.37857 | 0.19178 | 1.37303 | 0.62308 | 0.24400 | 0.10795 | 0.97503 | 4.041 | 44.90 |
| 40 | ADAMm | adaptive moment estimation M | 0.88635 | 0.44766 | 0.26613 | 1.60014 | 0.84497 | 0.38493 | 0.16889 | 1.39880 | 0.66154 | 0.27046 | 0.10594 | 1.03794 | 4.037 | 44.85 |
| 41 | CGO | chaos game optimization | 0.57256 | 0.37158 | 0.32018 | 1.26432 | 0.61176 | 0.61931 | 0.62161 | 1.85267 | 0.37538 | 0.21923 | 0.19028 | 0.78490 | 3.902 | 43.35 |
| 42 | ATAm | artificial tribe algorithm M | 0.71771 | 0.55304 | 0.25235 | 1.52310 | 0.82491 | 0.55904 | 0.20473 | 1.58867 | 0.44000 | 0.18615 | 0.09411 | 0.72026 | 3.832 | 42.58 |
| 43 | CROm | coral reefs optimization M | 0.78512 | 0.46032 | 0.25958 | 1.50502 | 0.86688 | 0.35297 | 0.16267 | 1.38252 | 0.63231 | 0.26738 | 0.10734 | 1.00703 | 3.895 | 43.27 |
| 44 | CFO | central force optimization | 0.60961 | 0.54958 | 0.27831 | 1.43750 | 0.63418 | 0.46833 | 0.22541 | 1.32792 | 0.57231 | 0.23477 | 0.09586 | 0.90294 | 3.668 | 40.76 |
| 45 | ASHA | artificial showering algorithm | 0.89686 | 0.40433 | 0.25617 | 1.55737 | 0.80360 | 0.35526 | 0.19160 | 1.35046 | 0.47692 | 0.18123 | 0.09774 | 0.75589 | 3.664 | 40.71 |
| 46 | RW | random walk | 0.48754 | 0.32159 | 0.25781 | 1.06694 | 0.37554 | 0.21944 | 0.15877 | 0.75375 | 0.27969 | 0.14917 | 0.09847 | 0.52734 | 2.348 | 26.09 |
Summary
The mutation-modified version of the FBA algorithm, scoring 50.61%, is roughly twice as efficient as the original version, whose results are listed below:
FBA|Fractal-Based Algorithm|50.0|60.0|30.0|5.0|
=============================
5 Hilly's; Func runs: 10000; result: 0.4780639253626462
25 Hilly's; Func runs: 10000; result: 0.3222029113692554
500 Hilly's; Func runs: 10000; result: 0.25720991988937014
=============================
5 Forest's; Func runs: 10000; result: 0.36567166223346415
25 Forest's; Func runs: 10000; result: 0.22043205527307377
500 Forest's; Func runs: 10000; result: 0.15869146061036057
=============================
5 Megacity's; Func runs: 10000; result: 0.2861538461538461
25 Megacity's; Func runs: 10000; result: 0.15015384615384622
500 Megacity's; Func runs: 10000; result: 0.09838461538461622
=============================
All score: 2.33696 (25.97%)
Due to fundamental changes in the mutation mechanism, the new approach provides a more reasonable balance between global exploration of the search space and local exploitation of discovered promising solutions.
The key achievement is that the algorithm now uses accumulated knowledge about the search landscape to guide mutation, instead of completely random changes. This is more consistent with natural optimization processes, where randomness and determinism coexist in a balanced way. This approach allows the algorithm to converge to the global optimum faster, especially in complex multidimensional spaces with numerous local extrema, which explains the significant performance improvement.

Figure 2. Color gradation of algorithms according to the corresponding tests

Figure 3. Histogram of algorithm testing results (scale from 0 to 100; the higher, the better, where 100 is the maximum possible theoretical result; a script for calculating the rating table is included in the archive)
FBA pros and cons:
Pros:
- Stable results on medium and high dimension functions.
Cons:
- Large scatter of results on low-dimension functions.
- An interesting, but rather "weak" basic idea of the algorithm.
The article is accompanied by an archive with the current versions of the algorithm codes. The author of the article is not responsible for the absolute accuracy in the description of canonical algorithms. Changes have been made to many of them to improve search capabilities. The conclusions and judgments presented in the articles are based on the results of the experiments.
Programs used in the article
| # | Name | Type | Description |
|---|---|---|---|
| 1 | #C_AO.mqh | Include | Parent class of population optimization algorithms |
| 2 | #C_AO_enum.mqh | Include | Enumeration of population optimization algorithms |
| 3 | TestFunctions.mqh | Include | Library of test functions |
| 4 | TestStandFunctions.mqh | Include | Test stand function library |
| 5 | Utilities.mqh | Include | Library of auxiliary functions |
| 6 | CalculationTestResults.mqh | Include | Script for calculating results in the comparison table |
| 7 | Testing AOs.mq5 | Script | The unified test stand for all population optimization algorithms |
| 8 | Simple use of population optimization algorithms.mq5 | Script | A simple example of using population optimization algorithms without visualization |
| 9 | Test_FBA.mq5 | Script | FBA test stand |
Translated from Russian by MetaQuotes Ltd.
Original article: https://www.mql5.com/ru/articles/17458
Warning: All rights to these materials are reserved by MetaQuotes Ltd. Copying or reprinting of these materials in whole or in part is prohibited.
This article was written by a user of the site and reflects their personal views. MetaQuotes Ltd is not responsible for the accuracy of the information presented, nor for any consequences resulting from the use of the solutions, strategies or recommendations described.