Machine learning in trading: theory, models, practice and algo-trading - page 909

 
Mihail Marchukajtes:

I don't know... Doc also gave me all the code for MQL so that the model can be used directly in MT. The only thing I didn't like about elmnn is that no matter how many times I trained it, it always gave me the same result on OOS. So however many times I retrain it, it always produces the same result, which means I'll get nothing out of it :-) But the work has only just begun, and I need more tests before I can give a confident verdict...

Oh, let that be the motto of this thread :))

 
Mihail Marchukajtes:

No. But I wrote a script that dumps everything into Excel for me, and then I work with it there. I can't give you the script, because it's my brainchild.... I've done a rather original thing there. I don't know exactly how it estimates the predictors, but the result is a very readable and convenient table for further analysis... That's about it...

I understand about not being able to share the script.

What I don't understand is why it can only take 0 and 1 in the predictors? And which models does it support (tree/forest/NN)?

 
Maxim Dmitrievsky:

Oh, let that be the motto of this thread :))

This motto wasn't invented by me. I first heard it from Leonid Velichkovsky, quite a well-known person in our circles. His interview was published here, and we were together in one closed laboratory. There were about 20 people there; it was the closed forum of the NeuroBord Club, hosted on some free hosting service. I think it still works, it's just a pity I deleted the bookmark with the link to it. I was thinking about it recently, thought about visiting. And yes, Leonid was the lead singer of the band Technology, but you, Maximka, probably haven't heard of it; you were just a kid at the time..... People teased him all the time with "Press the button, you'll get the result and your dream will come true", in good faith, of course....

 
Mihail Marchukajtes:

This motto wasn't invented by me. I first heard it from Leonid Velichkovsky, quite a well-known person in our circles. His interview was published here, and we were together in one closed laboratory. There were about 20 people there; it was the closed forum of the NeuroBord Club, hosted on some free hosting service. I think it still works, it's just a pity I deleted the bookmark with the link to it. I was thinking about it recently, thought about visiting. And yes, Leonid was the lead singer of the band Technology, but you, Maximka, probably haven't heard of it; you were just a kid at the time..... People teased him all the time with "Press the button, you will get the result and your dream will come true", in good faith, of course....

Of course I've heard of it, I know the band. Wow, so that's where the roots go; the expression really does sound like one of their songs, yeah :) (just kidding)

 
Aleksey Vyazmikin:

I understand about not being able to share the script.

What I don't understand is why it can only take 0 and 1 in the predictors? And which models does it support (tree/forest/NN)?

What predictors? I wrote that this is a requirement for the target. You make a table where the columns hold the predictors and the last column holds a 0/1 target. When it processes the table, it tells you which predictors carry predictive power for that target. I improved the quality of my models a great deal after this processing. That was at the beginning of March, so many thanks and kudos to Doc for it :-)

 
Mihail Marchukajtes:

I don't know... Doc also gave me all the code for MQL so that the model can be used directly in MT. The only thing I didn't like about elmnn is that no matter how many times I trained it, it always gave me the same result on OOS. So however many times I retrain it, it always produces the same result, which means I'll get nothing out of it :-) But the work has only just begun, and we need more tests before a confident verdict...

That cannot happen by definition. Each run of the ELM neural network generates a network with randomly initialized weights, and ELM does not use backprop. Read the description of this particular neural network model.

If your neural network does not change, you must have screwed up somewhere.
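The point is easy to check with a toy implementation. Below is a minimal ELM sketch in Python/NumPy (not the R elmnn package; all names here are illustrative): the hidden-layer weights are drawn at random and never trained, only the output weights are solved in closed form, so two runs on the same data should give different out-of-sample predictions.

```python
import numpy as np

def train_elm(X, y, n_hidden=20, rng=None):
    """Train a minimal Extreme Learning Machine.
    Hidden weights are random and never updated (no backprop);
    only the output weights are solved by least squares."""
    rng = np.random.default_rng() if rng is None else rng
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(size=n_hidden)                # random biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                 # closed-form output weights
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Two runs on the same data: the random hidden layers differ,
# so the fitted models and their OOS predictions differ too.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

m1 = train_elm(X, y, rng=np.random.default_rng(1))
m2 = train_elm(X, y, rng=np.random.default_rng(2))
X_oos = rng.normal(size=(50, 5))
p1, p2 = predict_elm(m1, X_oos), predict_elm(m2, X_oos)
print(np.allclose(p1, p2))  # identical OOS results would be the suspicious outcome
```

If two saved weight sets really do produce identical signals, the problem is in the export/inference path, not in ELM itself.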

 
Vladimir Perervenko:

This cannot happen by definition. Each run of the ELM neural network generates a network with randomly initialized weights, and ELM does not use backprop. Read the description of this particular neural network model.

If your neural network does not change, then you must have screwed up somewhere.

The thing is, the model is transferred from R by saving the weights, and each time they are ALWAYS different. But when I plug four different sets of weights into the model, the result is the same for all of them. I mean the signals. Doc says it's because of the data used. I don't think he gave me wrong code or that I did something wrong with it, but it's a fact....

 
Mihail Marchukajtes:

What predictors? I wrote that this is a requirement for the target. You make a table where the columns hold the predictors and the last column holds a 0/1 target. When it processes the table, it tells you which predictors carry predictive power for that target. I improved the quality of my models a great deal after this processing. That's actually how my ascent in trading started. It was at the beginning of March; thanks a lot to Doc, and kudos to him :-)

Yes, I got it wrong: the question was about the target, and that suits me fine.

But I don't quite understand the answer; I'm staring at the output completely baffled. Is this the log of the script, and not of the package itself?

Forum on trading, automated trading systems and testing of trading strategies

Theories and Practices of Machine Learning in Trading (Trading and Not Only)

Mihail Marchukajtes, 2018.05.14 11:49

library(vtreat)  # designTreatmentsC comes from the vtreat package

# forexFeatures1, i, n_rw and n_enter are defined earlier in the script.
# Note: in R, 1:n_enter+1 parses as (1:n_enter)+1, i.e. columns 2..(n_enter+1).
forexFeatures <- forexFeatures1[i:n_rw, 1:n_enter+1]
set.seed(1234)
# designTreatmentsC is only suitable for classification with two classes
treatmentsC <- designTreatmentsC(dframe = forexFeatures,
                                 varlist = colnames(forexFeatures)[-ncol(forexFeatures)], # predictor columns (here: all but the last)
                                 outcomename = colnames(forexFeatures)[ncol(forexFeatures)], # target column (here: the last one)
                                 outcometarget = "1") # text or number of one of the classes
# process and sort the result
treatmentsC_scores <- treatmentsC$scoreFrame[order(treatmentsC$scoreFrame$sig),]
treatmentsC_scores <- treatmentsC_scores[!duplicated(treatmentsC_scores$origName),]
treatmentsC_scores <- treatmentsC_scores[,c("origName","sig")]
treatmentsC_scores$is_good <- treatmentsC_scores$sig <= 1/nrow(forexFeatures)
treatmentsC_scores

Generally speaking, that's how it goes. But this is an estimate for a classification target that takes only 0 and 1; for regression it's different...
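For readers without R, the screening idea in the snippet above can be sketched in Python. This is a loose analog, not vtreat's actual calculation: it ranks each predictor by a univariate Mann-Whitney p-value against the 0/1 target and applies the same 1/nrow threshold used above. All names and data here are illustrative.

```python
import numpy as np
from scipy import stats

def screen_predictors(X, y, alpha=None):
    """Rank predictors of a 0/1 target by a univariate significance test.
    Mimics the idea of the screening step (a per-predictor p-value checked
    against a 1/n threshold), not vtreat's exact calculation."""
    n = X.shape[0]
    alpha = 1.0 / n if alpha is None else alpha   # same rule as sig <= 1/nrow
    results = []
    for j in range(X.shape[1]):
        a, b = X[y == 0, j], X[y == 1, j]
        # Mann-Whitney U test: does the predictor's distribution differ by class?
        p = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue
        results.append((j, float(p), bool(p <= alpha)))
    return sorted(results, key=lambda t: t[1])    # most significant first

rng = np.random.default_rng(42)
y = rng.integers(0, 2, size=500)
X = rng.normal(size=(500, 4))
X[:, 0] += 1.5 * y                                # column 0 is genuinely predictive
ranked = screen_predictors(X, y)
print(ranked[0])  # column 0 ranks first and passes the threshold
```

The output is a sorted list of (column index, p-value, passes-threshold) tuples, playing the role of the origName/sig/is_good table in the R version.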


 
Mihail Marchukajtes:

The thing is that transferring a model from R is done by saving the weights, and each time they are ALWAYS different. But when I plug in four different sets of model weights, the result is the same for all of them. I mean the signals. Doc says it's because of the data used. I don't think he gave me faulty code or that I've done something wrong with it, but it's a fact....

Once again, this cannot happen in principle. Just repeat the experiment with your data in R on 100+ ELM models and you will not find two identical results. Look for the error.

Good luck

 
Vladimir Perervenko:

Once again, this cannot happen in principle. Just repeat the experiment with your data in R on 100+ ELM models and you will not find two identical results. Look for the error.

Good luck

Yeah, I know it looks weird myself, but let's see how it goes. Man, I want to show you one photo, but I can't find it. I did find a lot of junk from that time though, and most importantly the networks: add-ons and neural stuff everywhere, I almost cried. I'll show you the picture when I find it...
