Optimization

 

Hi,

I have an EA where I can turn one indicator on/off, and it also has parameters for this indicator.

When I mark both the on/off option and the indicator's parameters for optimization, it generates some redundant passes (as expected).

 

Note: "Não" means "No" and "Sim" means "Yes".

But it makes no sense to optimize the indicator's parameters while it is not being used, so when the indicator is off I would like to force all of its parameters to zero, to avoid reprocessing.

I feel that it is possible to achieve this with "OnTesterInit" or maybe "OnTesterPass", but I'm not quite sure.

Any ideas?

Regards.

 
Henrique Vilela:

Hi,

I have an EA where I can turn one indicator on/off, and it also has parameters for this indicator.

When I mark both the on/off option and the indicator's parameters for optimization, it generates some redundant passes (as expected).

 

Note: "Não" means "No" and "Sim" means "Yes".

But it makes no sense to optimize the indicator's parameters while it is not being used, so when the indicator is off I would like to force all of its parameters to zero, to avoid reprocessing.

I feel that it is possible to achieve this with "OnTesterInit" or maybe "OnTesterPass", but I'm not quite sure.

Any ideas?

Regards.

Hi Henrique Vilela,

Have you seen the functions ParameterSetRange and ParameterGetRange?

Regards,
Malacarne 
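For reference, a minimal sketch of how these two functions could be combined in OnTesterInit. The input names "UseMA" and "MAPeriod" are hypothetical, as is the freeze value 14:

```mql5
void OnTesterInit()
  {
   bool enabled;
   long value,start,step,stop;
   // Read the optimizer settings of the on/off switch...
   if(ParameterGetRange("UseMA",enabled,value,start,step,stop))
     {
      // ...and if it is fixed to false, freeze the period to a single
      // value so the optimizer does not iterate over it.
      if(!enabled && value==0)
         ParameterSetRange("MAPeriod",false,14,14,1,14);
     }
  }

void OnTesterDeinit()
  {
  }
```

Note that ParameterSetRange can only be called from OnTesterInit; it cannot react to a switch that is itself being optimized, which is the limitation discussed below.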

 
Rodrigo Malacarne:

Hi Henrique Vilela,

Have you seen the functions ParameterSetRange and ParameterGetRange?

Regards,
Malacarne 

Thank you for the reply.

Yes, I did. But with ParameterSetRange I can turn off optimization for a specific parameter, not contextually (e.g. do not try different MA periods when the MA is off), given that I am also optimizing whether or not the MA is used.

 

Another (kind of) related question:

In the OnInit function, is it possible to somehow get the whole list of parameters?

for (each EA param)
{
    // do something 
} 
 
Henrique Vilela:

Thank you for the reply.

Yes, I did. But with ParameterSetRange I can turn off optimization for a specific parameter, not contextually (e.g. do not try different MA periods when the MA is off), given that I am also optimizing whether or not the MA is used.

1) You can check whether you are optimizing with IsOptimization().

2) In OnInit() you can check whether the optimizer has set the parameters to a ridiculous combination and then just do:

return(INIT_PARAMETERS_INCORRECT);

Now, according to the reference: "Testing for the given set of parameters of the Expert Advisor will not be executed."

If that is what you are asking about.
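A sketch of that check with hypothetical inputs "UseMA" and "MAPeriod" (in MQL5, MQLInfoInteger(MQL_OPTIMIZATION) plays the role of IsOptimization(); the canonical value 14 is an assumption):

```mql5
input bool UseMA    = true;
input int  MAPeriod = 14;

int OnInit()
  {
   if(MQLInfoInteger(MQL_OPTIMIZATION))
     {
      // MAPeriod only matters when UseMA is true; for any other
      // combination, tell the tester to skip this pass entirely.
      if(!UseMA && MAPeriod!=14)
         return(INIT_PARAMETERS_INCORRECT);
     }
   return(INIT_SUCCEEDED);
  }
```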
 

It's a bit more complex than that.

Let's say I have four params: A1, A2, B1 and B2. A1 and B1 are booleans; A2 and B2 are integers. A2 is used only when A1 is true, and B1 and B2 have the same relationship.

So, let's say I want to optimize all four variables: A1 and B1 can be true/false and A2 and B2 can be 1/2. That gives me 16 combinations, but 7 are redundant (marked "same as" below, because the differing value is ignored by the EA), leaving only 9 real results.

   A1    A2  B1    B2 
1  true  1   true  1
2  true  1   true  2
3  true  1   false 1
4  true  1   false 2 (same as 3)
5  true  2   true  1
6  true  2   true  2
7  true  2   false 1
8  true  2   false 2 (same as 7)
9  false 1   true  1
10 false 1   true  2
11 false 1   false 1
12 false 1   false 2 (same as 11)
13 false 2   true  1 (same as 9)
14 false 2   true  2 (same as 10)
15 false 2   false 1 (same as 11 and 12)
16 false 2   false 2 (same as 11, 12 and 15)

Since I have many variables like that, most of the combinations are redundant, so my real problem is even worse than this example.

So, what I want to do is return INIT_PARAMETERS_INCORRECT for all redundant combinations, as you suggest. But here's the problem: which ones should be rejected?

My idea would be to reject the pass if A1 is false and A2 is different from the "start" value (and the same for B1 and B2), which would reject exactly the redundant lines above. That would be perfect, except for one problem: ParameterGetRange is inaccessible from OnInit, so I cannot know the range being tested (start, step and stop) from that function.

I could save this information to a file from OnTesterInit and load it in OnInit, but that kills the cloud/remote optimization feature (which is a big deal here).

Any ideas?
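One possible workaround, sketched with the hypothetical A1/A2/B1/B2 inputs above: hard-code a fixed "neutral" value for each dependent parameter instead of reading the optimizer's "start" value, so OnInit needs no access to ParameterGetRange at all (the neutral value 1 is an assumption):

```mql5
input bool A1 = true;
input int  A2 = 1;
input bool B1 = true;
input int  B2 = 1;

// Hard-coded "neutral" values the dependent parameters must hold when
// their switch is off; this replaces reading start/step/stop in OnInit.
#define A2_NEUTRAL 1
#define B2_NEUTRAL 1

int OnInit()
  {
   // When a switch is off, accept only the neutral value of its
   // dependent parameter; every other combination is redundant.
   if(!A1 && A2!=A2_NEUTRAL) return(INIT_PARAMETERS_INCORRECT);
   if(!B1 && B2!=B2_NEUTRAL) return(INIT_PARAMETERS_INCORRECT);
   return(INIT_SUCCEEDED);
  }
```

This requires the neutral value to lie inside the tested range of each parameter, but it works unchanged in cloud/remote optimization.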

 
Henrique Vilela:

It's a bit more complex than that.

Let's say I have four params: A1, A2, B1 and B2. A1 and B1 are booleans; A2 and B2 are integers. A2 is used only when A1 is true, and B1 and B2 have the same relationship.

So, let's say I want to optimize all four variables: A1 and B1 can be true/false and A2 and B2 can be 1/2. That gives me 16 combinations, but 7 are redundant (marked "same as" above, because the differing value is ignored by the EA), leaving only 9 real results.

Since I have many variables like that, most of the combinations are redundant, so my real problem is even worse than this example.

So, what I want to do is return INIT_PARAMETERS_INCORRECT for all redundant combinations, as you suggest. But here's the problem: which ones should be rejected?

My idea would be to reject the pass if A1 is false and A2 is different from the "start" value (and the same for B1 and B2), which would reject exactly the redundant lines above. That would be perfect, except for one problem: ParameterGetRange is inaccessible from OnInit, so I cannot know the range being tested (start, step and stop) from that function.

I could save this information to a file from OnTesterInit and load it in OnInit, but that kills the cloud/remote optimization feature (which is a big deal here).

Any ideas?

If the problem is 'only' that a combination might appear twice or more, don't worry! The tester has a cache file in ..\tester\files and it will not test the same set more than once.

This cache file is even used if you optimize the same EA again later!

 
calli:

If the problem is 'only' that a combination might appear twice or more, don't worry! The tester has a cache file in ..\tester\files and it will not test the same set more than once.

This cache file is even used if you optimize the same EA again later!

Internally the EA will do exactly the same thing (because one or more variables are ignored), but to the strategy tester it is an entirely new test (as long as one of the variables has a different value).

These two, for example:

3  true  1   false 1
4  true  1   false 2 (same as 3) 

For the EA these are the same thing (B2 will not be used), but for the strategy tester they are different tests (because B2 differs).

 
Henrique Vilela:

Internally the EA will do exactly the same thing (because one or more variables are ignored), but to the strategy tester it is an entirely new test (as long as one of the variables has a different value).

These two, for example:

For the EA these are the same thing (B2 will not be used), but for the strategy tester they are different tests (because B2 differs).

If so, I don't see any other way than an internal list of valid combinations and otherwise a

return(INIT_PARAMETERS_INCORRECT);
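A minimal sketch of that "internal list" idea, using the hypothetical A1/A2/B1/B2 inputs from the example above; the whitelist holds the 9 non-redundant rows of the table, encoded as "A1,A2,B1,B2" with booleans printed as 1/0:

```mql5
input bool A1 = true;
input int  A2 = 1;
input bool B1 = true;
input int  B2 = 1;

// The 9 non-redundant combinations from the table above.
string ValidCombos[9] =
  {
   "1,1,1,1","1,1,1,2","1,1,0,1",
   "1,2,1,1","1,2,1,2","1,2,0,1",
   "0,1,1,1","0,1,1,2","0,1,0,1"
  };

int OnInit()
  {
   string combo=StringFormat("%d,%d,%d,%d",A1,A2,B1,B2);
   for(int i=0;i<ArraySize(ValidCombos);i++)
      if(ValidCombos[i]==combo)
         return(INIT_SUCCEEDED);
   // Not on the whitelist: skip this optimization pass.
   return(INIT_PARAMETERS_INCORRECT);
  }
```

The list grows combinatorially with the number of switches, so for many parameters a rule-based check (switch off implies dependent parameter at its neutral value) scales better than an explicit list.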