Population optimization algorithms: Changing shape, shifting probability distributions and testing on Smart Cephalopod (SC)
Population optimization algorithms: Evolution Strategies, (μ,λ)-ES and (μ+λ)-ES
The name "Evolution Strategies" may be misleading, as one might take it for a general name for the whole class of evolutionary algorithms. That is not the case. It is, in fact, a specific family of algorithms that apply ideas of evolution to optimization problems.
Contents
1. Introduction
2. Algorithm
3. Replacing the test function
4. Test results
5. Conclusions
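The difference between the two selection schemes named in the title is small but important: in (μ+λ)-ES parents compete with their offspring for survival, while in (μ,λ)-ES parents are discarded every generation. A minimal illustrative Python sketch (not the article's implementation, and with arbitrary parameter defaults):

```python
import random

def evolution_strategy(f, dim, mu=5, lam=20, sigma=0.3, iters=100, plus=True):
    """Minimal (mu,lambda)/(mu+lambda)-ES sketch for minimizing f.
    plus=True  -> (mu+lambda)-ES: parents compete with offspring.
    plus=False -> (mu,lambda)-ES: parents are discarded each generation."""
    parents = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(iters):
        offspring = []
        for _ in range(lam):
            p = random.choice(parents)
            # Gaussian mutation with a fixed step size (no self-adaptation here)
            offspring.append([x + random.gauss(0, sigma) for x in p])
        pool = parents + offspring if plus else offspring
        pool.sort(key=f)                      # best-first selection
        parents = pool[:mu]
    return parents[0]

# toy run: minimize the 3-D sphere function
random.seed(1)
best = evolution_strategy(lambda x: sum(v * v for v in x), dim=3)
```

Note that (μ+λ) is elitist (the best solution can never be lost), while (μ,λ) can forget good parents, which helps escape local optima on noisy or changing landscapes.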
Population optimization algorithms: Bacterial Foraging Optimization - Genetic Algorithm (BFO-GA)
The BFO-GA hybrid optimization algorithm combines two different optimization algorithms: the Bacterial Foraging Optimization (BFO) algorithm and the Genetic Algorithm (GA). This hybrid was created to improve optimization efficiency and to overcome some of the shortcomings of each individual algorithm.
BFO (Bacterial Foraging Optimization) is an optimization algorithm inspired by the foraging behavior of bacteria. It was proposed in 2002 by Kevin M. Passino. BFO models bacterial movement using three main mechanisms: chemotaxis, reproduction, and elimination-dispersal. Each bacterium in the algorithm represents a candidate solution to the optimization problem, and food corresponds to the optimal solution. The bacteria move through the search space to find the best food.
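The chemotaxis mechanism (tumble in a random direction, then swim while fitness keeps improving) can be sketched as follows; the function name and parameters are illustrative assumptions, and reproduction and elimination-dispersal are left out for brevity:

```python
import random

def chemotaxis_step(pos, f, step_size=0.1, swim_len=4):
    """One chemotactic move of a bacterium (minimization): pick a random
    unit tumble direction, then swim along it while fitness improves.
    This is a simplified sketch of BFO's chemotaxis only."""
    d = [random.gauss(0, 1) for _ in pos]
    norm = sum(x * x for x in d) ** 0.5
    d = [x / norm for x in d]                 # unit tumble direction
    best, best_fit = pos, f(pos)
    for _ in range(swim_len):                 # swim while improving
        cand = [p + step_size * u for p, u in zip(best, d)]
        if f(cand) < best_fit:
            best, best_fit = cand, f(cand)
        else:
            break
    return best
```

A full BFO run would nest many such steps inside reproduction loops (the healthier half of the population splits) and occasional elimination-dispersal events (random relocation of some bacteria).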
Genetic algorithm (GA) is an optimization algorithm inspired by the principles of natural selection and genetics. It was developed by John Holland in the 1970s. GA works with a population of individuals representing solutions to an optimization problem. Individuals undergo crossover (combining genetic information) and mutation (random changes in genetic information) to create new generations. Over successive generations, the GA converges toward the optimal solution.
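The selection, crossover, and mutation cycle described above can be sketched in a few lines of Python. This is a minimal illustration, not the BFO-GA implementation; the elitism step and all parameter values are assumptions added for the example:

```python
import random

def ga_generation(pop, f, mut_rate=0.1, mut_sigma=0.2):
    """One GA generation for minimization: binary tournament selection,
    one-point crossover, Gaussian mutation, plus simple elitism."""
    def pick():
        a, b = random.sample(pop, 2)          # binary tournament
        return a if f(a) < f(b) else b
    nxt = [min(pop, key=f)]                   # elitism: keep the best individual
    while len(nxt) < len(pop):
        p1, p2 = pick(), pick()
        cut = random.randrange(1, len(p1))
        child = p1[:cut] + p2[cut:]           # one-point crossover
        child = [g + random.gauss(0, mut_sigma) if random.random() < mut_rate
                 else g for g in child]       # per-gene mutation
        nxt.append(child)
    return nxt

# toy run: evolve 20 three-gene individuals toward the sphere minimum
random.seed(3)
f = lambda x: sum(v * v for v in x)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
best0 = f(min(pop, key=f))
for _ in range(30):
    pop = ga_generation(pop, f)
```

With elitism the best fitness in the population can never get worse from one generation to the next, which makes the sketch's behavior easy to verify.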
Population optimization algorithms: Micro Artificial immune system (Micro-AIS)
The immune system is an amazing mechanism that plays an important role in protecting our body from external threats. Like an invisible shield, it fights bacteria, viruses and fungi, keeping our body healthy. But what if we could use this powerful mechanism to solve complex optimization and learning problems? This is exactly the approach used in the Artificial Immune System (AIS) optimization method.
The Artificial Immune System (AIS) optimization method was proposed in the 1990s. Early research on this method dates back to the mid-1980s, with significant contributions by Farmer, Packard and Perelson (1986) and by Bersini and Varela (1990).
A Generic Optimization Formulation (GOF) to Implement Custom Max with Constraints
In general terms, there are two main types of optimization algorithms. The first, more classical type is based on computing gradients of all the functions involved in the optimization problem and dates back to Isaac Newton's time. The second, more recent type (since roughly the 1970s) uses no gradient information at all. In between, there are algorithms that combine the two approaches, but we do not need to address them here. The MetaTrader 5 algorithm called "Fast Genetic based Algorithm" (found in the Settings tab of the MetaTrader 5 terminal) belongs to the second type. This lets us skip the computation of gradients for the objective and constraint functions. Moreover, thanks to the gradient-free nature of the MetaTrader 5 algorithm, we can account for constraint functions that would not have been suitable for gradient-based algorithms. More on this will be discussed below.
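One common way a gradient-free optimizer accommodates awkward constraints is a penalty wrapper: infeasible points simply receive a large fitness penalty, and no derivatives are ever needed, so the constraint may be non-smooth or even discontinuous. The following is a hedged Python illustration of that idea, not the GOF implementation discussed in the article; the names and the random-search stand-in are assumptions:

```python
import random

def penalized(f, constraints, penalty=1e6):
    """Wrap an objective so a gradient-free optimizer can handle
    constraints of the form g(x) <= 0: infeasible points pay a large
    penalty proportional to the violation. No gradients required."""
    def wrapped(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + penalty * violation
    return wrapped

# toy example: minimize x^2 subject to the non-smooth constraint |x| >= 1,
# written as g(x) = 1 - |x| <= 0; plain random search stands in for the optimizer
random.seed(7)
obj = penalized(lambda x: x[0] ** 2, [lambda x: 1.0 - abs(x[0])])
best = min(([random.uniform(-3, 3)] for _ in range(5000)), key=obj)
```

A gradient-based method would struggle here: the constraint is non-differentiable at 0 and the feasible region is disconnected, yet the penalized objective is perfectly usable by any sampling-based search.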
One important point is that the MetaTrader 5 algorithm called "Slow Complete Algorithm" is not actually an optimization algorithm but a brute-force, exhaustive evaluation of every possible combination of values of the input variables within the side constraints.
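Such an exhaustive sweep amounts to enumerating a full grid of parameter combinations and keeping the best. The sketch below only illustrates that idea in Python; it is not MetaTrader 5 code, and the (start, stop, step) interface is an assumption for the example:

```python
from itertools import product

def exhaustive_search(f, ranges):
    """Brute-force sweep: evaluate f at every combination of grid values
    defined by (start, stop, step) per variable and return the maximizer.
    No search heuristic is involved; every point is visited."""
    grids = []
    for start, stop, step in ranges:
        vals, v = [], start
        while v <= stop + 1e-12:              # tolerance for float rounding
            vals.append(v)
            v += step
        grids.append(vals)
    return max(product(*grids), key=f)        # maximize, e.g. a profit metric

# toy objective peaking at (1, -2); the 0.5-spaced grid lands exactly on it
best = exhaustive_search(lambda x: -(x[0] - 1) ** 2 - (x[1] + 2) ** 2,
                         [(-3, 3, 0.5), (-3, 3, 0.5)])
```

The cost grows multiplicatively with each added variable (here 13 × 13 = 169 evaluations), which is why the exhaustive mode quickly becomes impractical compared with the genetic mode.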
Population optimization algorithms: Evolution of Social Groups (ESG)
Population optimization algorithms: Artificial Multi-Social Search Objects (MSO)
In the previous article, we considered the evolution of social groups where they moved freely in the search space. However, here I propose that we change this concept and assume that groups move between sectors, jumping from one to another. All groups have their own centers, which are updated at each iteration of the algorithm. In addition, we introduce the concept of memory both for the group as a whole and for each individual particle in it. Using these changes, our algorithm now allows groups to move from sector to sector based on information about the best solutions.
In this article, we will conduct a series of experiments to explore how these new concepts affect the search performance of an algorithm. We will analyze the interaction between groups, their ability to cooperate and coordinate, and their ability to learn and adapt. Our findings may shed light on the evolution of social systems and help better understand how groups form, evolve and adapt to changing environments.
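The ingredients described above (group centers, memory for both the group and each particle, and jumps between sectors guided by the best known solutions) can be sketched roughly as follows. Everything in this snippet, from the function name to the specific update rules, is an assumption made for illustration; the article develops the actual algorithm later:

```python
import random

def update_group(particles, f, group_best, sectors):
    """Illustrative MSO-style update (minimization): each particle keeps
    a personal-best memory, the group keeps a shared best, and the whole
    group relocates toward the sector center nearest that best."""
    for p in particles:                        # particle-level memory
        if f(p["pos"]) < f(p["best"]):
            p["best"] = list(p["pos"])
    cand = min((p["best"] for p in particles), key=f)
    if f(cand) < f(group_best):                # group-level memory
        group_best = list(cand)
    # sector jump: move the group around the most promising sector center
    target = min(sectors,
                 key=lambda s: sum((a - b) ** 2 for a, b in zip(s, group_best)))
    for p in particles:
        p["pos"] = [t + random.gauss(0, 0.1) for t in target]
    return group_best

# toy run: five particles choosing between two sector centers in 2-D
random.seed(5)
f = lambda x: sum(v * v for v in x)
sectors = [(-2.0, -2.0), (0.0, 0.0)]
particles = [{"pos": [random.uniform(-3, 3) for _ in range(2)]} for _ in range(5)]
for p in particles:
    p["best"] = list(p["pos"])
gb = list(particles[0]["best"])
gb0 = f(gb)
for _ in range(10):
    gb = update_group(particles, f, gb, sectors)
```

Because both memories only ever replace a stored solution with a better one, the group-best fitness is non-increasing across iterations, which is the property the experiments in the article set out to exploit.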