Machine learning in trading: theory, models, practice and algo-trading - page 1074

 
Maxim Dmitrievsky:

The first line of selection is totally correct now. In the 2nd line we must combine all the best variables with each other in a loop and learn RDF with each combination of features. Here we can also apply different polynomial transformations

this is how I see it

It took me a lot of time to understand your first code to implement this :))))... so now please don't give me another code :)))

Whenever you want to transform a feature like the close price... please make sure to collect it in a variable, either from new price data or from the old trained data in the matrix... that is what you should do :)))

Then just call the function "CalculateNeuron(ker,degree)" with that variable, and when you get the new feature, update it in the matrix or do whatever you want with it :))

I mean the whole task of feature transformation using GMDH will be done inside the function, which I understand properly... How you feed the values in and retrieve them back is your choice... )))))
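
As a rough illustration of the idea above, here is a minimal MQL5-style sketch of what such a transform call could look like. The real "CalculateNeuron(ker,degree)" is not shown in this thread, so the name CalculateNeuronSketch, the assumption that "ker" is the collected value array and "degree" the polynomial degree, and the placeholder weighting are all hypothetical.

// Hypothetical sketch, NOT the actual CalculateNeuron() from the GMDH library:
// it only shows the shape of "collect values -> transform -> get one new feature".
double CalculateNeuronSketch(const double &ker[], int degree)
  {
   double out = 0.0;
   int    n   = ArraySize(ker);
   if(n == 0 || degree <= 0)
      return 0.0;
   // sum polynomial terms of the collected values; equal placeholder weights,
   // because the real coefficient-fitting step is not reproduced here
   for(int i = 0; i < n; i++)
      for(int d = 1; d <= degree; d++)
         out += MathPow(ker[i], d) / (n * degree);
   return out;
  }

// usage: collect close prices, transform them, then write the result back into the matrix
// double closes[];
// CopyClose(_Symbol, _Period, 0, 100, closes);
// double newFeature = CalculateNeuronSketch(closes, 2);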

 
Maxim Dmitrievsky:

Well, just give me more time. I don't understand anything yet, but it will all be good :)

too much combinatorics here

but the output must be clear - just an n-dimensional array with the selected features and formulas

Well, if you have already understood your previous code (which I have understood only a little so far :)), then it should be just 2 minutes of work for you :)))... because I am not doing anything here... just copying your previous code and replacing it with GMDH :))

But if you want to do it another way... you can take your own time :)))

Of course, I can't guarantee anything about my implementation of GMDH :))... and we can't know anything until we run the final EA in LIVE trading mode :)))... even backtesting results don't seem to be reliable...

So you can try your own way, or just let me know in case you need the code for "CalculateNeuron(ker,degree)" to implement it

If it works, then I can even expand the base components to 20 or 30... It may be slow during training, and maybe in trading too, because of the multiple for loops... but since it only checks one value of degree at a time, we can expect average speed...
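
To make the combination step above concrete, here is a sketch of the loop Maxim describes: score every feature separately with RDF, keep the n best, then try their pair combinations. TrainAndScoreRDF() and GreedyCombineSketch() are hypothetical placeholders for the existing RDF training/validation code, not real library calls.

// Sketch of "train RDF with every feature separately, then combine the best ones".
double TrainAndScoreRDF(const int &featureIdx[]) { return 0.0; }   // placeholder for the real RDF code

void GreedyCombineSketch(int totalFeatures, int nBest, int &outA, int &outB)
  {
   // step 1: score each feature on its own
   double scores[];
   ArrayResize(scores, totalFeatures);
   for(int f = 0; f < totalFeatures; f++)
     {
      int one[1]; one[0] = f;
      scores[f] = TrainAndScoreRDF(one);
     }
   // step 2: pick the indices of the n best single features
   int best[];
   ArrayResize(best, nBest);
   for(int k = 0; k < nBest; k++)
     {
      int argmax = -1; double maxSc = -DBL_MAX;
      for(int f = 0; f < totalFeatures; f++)
         if(scores[f] > maxSc) { maxSc = scores[f]; argmax = f; }
      best[k] = argmax;
      scores[argmax] = -DBL_MAX;            // exclude from the next pass
     }
   // step 3: try every pair of the n best and remember the best combination
   double bestScore = -DBL_MAX;
   for(int i = 0; i < nBest; i++)
      for(int j = i + 1; j < nBest; j++)
        {
         int pair[2]; pair[0] = best[i]; pair[1] = best[j];
         double sc = TrainAndScoreRDF(pair);
         if(sc > bestScore) { bestScore = sc; outA = best[i]; outB = best[j]; }
        }
  }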

 
Maxim Dmitrievsky:

For you, maybe you will understand it better

Ok, I will try... but I will ask 100 different questions on this code as well, and you should be ready to answer :))...

Because you should understand that it is completely your approach, and you are also trying to code in a different way that has no link to your previous code. So I have no way to understand it until I understand exactly what you are trying to do in this version... It will take some more time to understand, and then I will try to create a bridge to GMDH...

 
Maxim Dmitrievsky:

we use genetic selection for GMDH

for this, we must first learn RDF with every feature separately, do you understand this?

What do you mean by genetic selection for GMDH?

To my knowledge of GMDH, for every set of features or inputs it gives one output as a summation of all the inputs, broken into pieces of base features. So if you give one feature you get one output, and even if you give 100 features you still get ONLY one output as the summation of the broken pieces of all the previous features - and that is all GMDH is.

1. If you give ONLY one feature as input, it will give one output = feature1*weight1

or

2. If you give one feature and all the previous features (the new feature plus the trained features from RDF), then it will just transform the current feature into a new output = feature1*w1 + feature2*w2 + feature3*w3 + ... over the m components of the base functions

So here, if you want to transform a new feature, create an array to store the trained features from RDF and then pass it to the "CalculateNeuron(ker,degree)" function. But you need to pass one more array element to this function.
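
Taken literally, the two cases above reduce to a weighted sum of whatever features are passed in. A minimal sketch, assuming the weights are simply supplied from outside (in a real GMDH layer they would be fitted), with GmdhSumSketch being a hypothetical name:

// output = feature1*w1 + feature2*w2 + ... over the m components of base functions
double GmdhSumSketch(const double &features[], const double &weights[])
  {
   int n = ArraySize(features);
   if(ArraySize(weights) < n)
      n = ArraySize(weights);
   double out = 0.0;
   for(int i = 0; i < n; i++)
      out += features[i] * weights[i];
   return out;                // one output, no matter how many features go in
  }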

 
Maxim Dmitrievsky:

can you just provide the scheme of your GMDH view, step by step?

Give me some time... I will provide you the complete source code of both the GMDH library and the EA implemented using your previous code... I am just searching for it now :)))

 

I've been poking around all the levels with the help of ML (the ML models look for the anticipated bounce levels), and sometimes the signals are not bad


And sometimes the system just goes crazy in a trend

===================

Does anyone know the reason? How can it be filtered? Has anybody ever done anything like this, or am I the only one in this field?

 
Maxim Dmitrievsky:

we don't need to sum the predictors with '+', because we don't use a linear solver. Instead, we just add new inputs step by step, increasing the number of features and their combinations

Genetic means that we work only with the best predictors at every step of the transformation, not with every predictor. So we select only the n best at every step

No problem then, you just create a dynamic array and at every step pass those array elements as inputs to the "Neuron function()", but you need to add one more input to this function.

I am using the inputs from "Calsignal()", where you copy the close prices... So instead of that, you just add the array elements at every step after RDF training completes, and then free the array. Did you get it?
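
A minimal sketch of this dynamic-array idea, assuming the names trainedFeatures, OnTrainingStepDone() and TransformAndReset() are placeholders (the real EA uses Calsignal() and the RDF training loop, which are not reproduced here); CalculateNeuronSketch() refers to the earlier sketch:

double trainedFeatures[];                          // grows while RDF training runs

// append one trained feature value after each RDF training step completes
void OnTrainingStepDone(double newFeatureValue)
  {
   int n = ArraySize(trainedFeatures);
   ArrayResize(trainedFeatures, n + 1);
   trainedFeatures[n] = newFeatureValue;           // collected instead of close prices
  }

// pass the collected elements to the neuron function, store the result, then free the array
void TransformAndReset(int degree)
  {
   double transformed = CalculateNeuronSketch(trainedFeatures, degree);
   // ... write "transformed" back into the feature matrix here ...
   ArrayFree(trainedFeatures);
  }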

I will paste the code in my next post. Please copy it, and then I will delete the post.

 
GMDH
 

GMDH EA:

 
Maxim Dmitrievsky:

we don't need to sum the predictors with '+', because we don't use a linear solver. Instead, we just add new inputs step by step, increasing the number of features and their combinations

Genetic means that we work only with the best predictors at every step of the transformation, not with every predictor. So we select only the n best at every step

Please copy the code and let me know... I will then delete it.
