But the optimization already runs through them, so where else are they supposed to be deviated?
The idea is to introduce small errors during optimization: the values of the input parameters are randomly distorted by a small amount, which should keep the genetic algorithm from locking onto false extrema with steep slopes. For extrema with gentle slopes the same small distortions will have only a small effect on the optimization.
For example, suppose an extremum has coordinates given by the input parameter values x, y, z and a fitness function profit = f(x, y, z). If we evaluate at x ± delta, y ± delta, z ± delta, where delta is a small distortion of the input parameter values, then for false extrema the resulting deviation of the fitness function will noticeably affect the final result, while for extrema with gentle slopes the deviation will be insignificant.
It is like road traffic: if the road is narrow and slippery, it is hard to drive along it without sliding off the roadside - a false, i.e. unstable, extremum. If the road is wide and well paved, it is easy to drive on - a truer, i.e. stable, extremum. Introduce small deviations into the steering and it is no longer possible to stay on the narrow, slippery road and make the corners - we cut off the false extrema. Since finding the extrema of a multidimensional function is a lot like climbing uphill along different tracks, the analogy is quite apt.
So, in theory, the genetic algorithm will try to avoid false extrema, since their descendants (chromosomes) will not fit into the turns, and will instead aim to maximize on stable ones.
After optimization, distortions of input parameters should be disabled.
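A minimal sketch of the idea in Python (all names here are illustrative assumptions, not the tester's real API): the fitness wrapper randomly distorts each input parameter by a small delta before evaluation, so sharp unstable peaks score erratically while flat stable plateaus are barely affected.

import random

def jittered_fitness(f, params, delta=0.02, trials=3):
    # f      - fitness function, e.g. backtest profit = f(x, y, z)
    # params - dict of input parameter values proposed by the GA
    # delta  - relative size of the random distortion (here 2%)
    # trials - average several distorted runs to smooth the noise
    total = 0.0
    for _ in range(trials):
        noisy = {k: v * (1.0 + random.uniform(-delta, delta))
                 for k, v in params.items()}
        total += f(**noisy)
    return total / trials

# Toy fitness standing in for a backtest:
def profit(x, y, z):
    return -(x - 1.0) ** 2 - (y - 2.0) ** 2 - (z + 0.5) ** 2

print(jittered_fitness(profit, {"x": 1.0, "y": 2.0, "z": -0.5}))
# After optimization, call f(**params) directly - the distortion is switched off.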
Maybe you shouldn't have excluded them. They could give you confidence bounds.
Why do you need the distortions in trading? You can switch them on if you want a drawdown of plus or minus a kilometre, but I don't need that kind of pleasure. The task is only to cut off a large share of the unstable extrema during optimization.
Mr. Yuri, what about the article? When will it be published?
Great news! I would love to read your work by the weekend!
Good luck!
It's all geeky stuff - some formulas, some clever words, pictures to make it look more important.
I'd rather read a detective story.
There is no need to distort anything deliberately - the GA already does that through its mutation mechanism. Optimization is needed to check each individual parameter for robustness, not to search for global extrema. If a parameter does not meet the requirements, then that filter, or that element of the set (depending on what the parameter is), should be revised or discarded altogether.
The GA is only needed for an initial reference point: to pick parameter values that more or less work, fix them, and then check each parameter separately.
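A rough sketch of that per-parameter check in Python (the fitness function and reference values are assumptions for illustration): fix the values found by the GA, sweep one parameter at a time around its reference value, and flag any parameter whose profit collapses inside the sweep.

def one_at_a_time_check(f, ref, rel_range=0.2, steps=9, tol=0.3):
    # f   - fitness (e.g. backtest profit) taking keyword parameters
    # ref - dict of reference values fixed after the GA run
    # Flags parameters whose profit drops by more than tol (a fraction)
    # anywhere within +-rel_range around the reference value.
    # Assumes the reference profit is positive.
    base = f(**ref)
    fragile = []
    for name, value in ref.items():
        for i in range(steps):
            factor = 1.0 - rel_range + 2.0 * rel_range * i / (steps - 1)
            trial = dict(ref, **{name: value * factor})
            if f(**trial) < base * (1.0 - tol):
                fragile.append(name)   # candidate to revise or discard
                break
    return fragile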
A forward test is necessary in any case. How else can you evaluate it?
You don't need a forward test at all if you analyse the optimized parameters correctly.
The point of a forward test is to evaluate whether the extrema of the optimized parameters drift over time, i.e. to cut off variants where there are several local extrema over the whole testing period (optimization plus out-of-sample). That can be done much better by analysing, for each parameter separately, whether its extremum is unique and the profile around it monotonic - which already guarantees that the parameter does not "drift" in time. A forward test also has a serious drawback: it looks only at individual points on the optimization surface, not at the surface as a whole. Coupled with an arbitrary split of the data into optimization and out-of-sample segments, that drags the statistical reliability of such an analysis through the floor)) It is just one realization: you may be lucky with the out-of-sample segment and a poor parameter set will pass, or the opposite - the out-of-sample segment falls in a period of temporary drawdown of a "good" parameter set.
But in any case, I repeat: the task of optimization is to evaluate the robustness of each parameter of the system. If in doubt, it is better to discard or modify it. Keep only what is 100% supported by statistics and trading logic.
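As a sketch of the single-extremum and monotonicity check described above (again a Python illustration, not anyone's actual tool): compute the profit profile for one parameter over the whole test period, with the others fixed, and verify that the discrete slope never turns from falling back to rising.

def is_unimodal(profits, eps=1e-9):
    # profits - profit values along a grid of one parameter,
    #           computed over the whole testing period
    # Returns True if the profile rises to at most one peak and then
    # falls, i.e. the slope never changes from negative back to positive.
    prev = 0
    for a, b in zip(profits, profits[1:]):
        d = b - a
        s = 1 if d > eps else (-1 if d < -eps else 0)
        if s == 0:
            continue                 # ignore flat plateaus
        if prev == -1 and s == 1:
            return False             # down and then up again: several extrema
        prev = s
    return True

# Example: profile = [profit at each grid value of one parameter]
# is_unimodal(profile) -> True means a single, stable extremum on that axis.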