Machine learning in trading: theory, models, practice and algo-trading - page 3146

 
СанСаныч Фоменко #:

Going round in circles for the second time because you play obtuse, supposedly not understanding what I write.

There's no code. You figure it out.

Too bad))))))

 

There used to be a Yuri Asaulenko here, I don't remember the surname exactly. He also kept writing something all the time for some reason, without any specifics.

When people asked him to explain why he was writing it, he would tell them to figure it out for themselves.

 

distinguish between blabbermouths and practitioners ...

It's so simple....

2-3 features, that's all...

you can build a classifier on a single decision tree and it will work better than anything you gullible ones have...
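For concreteness, here is a minimal sketch of what "a single decision tree on 2-3 features" can look like. The synthetic price series, the three features and the next-bar label are purely illustrative assumptions, not anyone's actual setup from this thread:

```python
# Minimal sketch: a single decision tree on three hand-picked features.
# Synthetic prices, features and the next-bar label are illustrative only.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
close = pd.Series(np.cumsum(rng.normal(size=5000)) + 1000.0)  # fake H1 closes

X = pd.DataFrame({
    "ret_1": close.diff(1),                                   # 1-bar change
    "ret_5": close.diff(5),                                   # 5-bar change
    "range_10": close.rolling(10).max() - close.rolling(10).min(),
})
target = close.shift(-1) - close                              # next-bar move
y = (target > 0).astype(int)

mask = X.notna().all(axis=1) & target.notna()
X, y = X[mask], y[mask]

# keep chronological order: train on the earlier part, test on the later part
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)    # deliberately tiny
tree.fit(X_tr, y_tr)
print("out-of-sample accuracy:", tree.score(X_te, y_te))
```

On the random walk used here the accuracy will of course hover around 50%; the sketch only shows how small such a pipeline can be, not that these particular features work.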

 
СанСаныч Фоменко #:

The window is the number of predictor values that are fed to the model input. Mine is 1500 bars on H1.

Sanych, a window of 1500 bars on H1 always gives the same thing: a tangent at the middle of the window.

As was said above: going round in circles.

That's right!
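One way to read the "tangent at the middle of the window" remark (my interpretation, on synthetic data, not the posters' code): a least-squares line fitted over a long window always passes through the point (window centre, window mean), so over 1500 bars it behaves like a tangent at the middle of the window:

```python
# Sketch of one reading of the remark above: the OLS line over a 1500-bar
# window passes through (window centre, window mean).
import numpy as np

rng = np.random.default_rng(1)
close = np.cumsum(rng.normal(size=5000))   # synthetic H1 close prices

window = 1500
y = close[-window:]                        # the last 1500 bars
x = np.arange(window)

slope, intercept = np.polyfit(x, y, 1)     # straight line fitted by OLS
print(slope * x.mean() + intercept)        # value of the line at the centre...
print(y.mean())                            # ...equals the window mean
```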

 
mytarmailS #:

distinguish between blabbermouths and practitioners ...

It's so simple....

2-3 features, that's all...

you can build a classifier on a single decision tree and it will work better than anything you gullible ones have...

The gullible ones are those who write articles and advertise themselves here on the forum, not traders ).

 
Maxim Dmitrievsky #:

There used to be a Yuri Asaulenko here, I don't remember the surname exactly. He also kept writing something all the time for some reason, without any specifics.

When people asked him to explain why he was writing it, he would tell them to figure it out for themselves.

Oh yes! And what epic walls of text he and Renat Akhtyamov used to write. Epic in volume and in meaninglessness )

 
СанСаныч Фоменко #:

Going round in circles for the second time because you play obtuse, supposedly not understanding what I write.

There's no code. You figure it out.

No code. And judging by the fact that even the set of features changes from step to step, this is just ordinary overfitting to some "predictive criterion" you have invented. There can be no question of any stability if the pattern jumps around that much from step to step.
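A hypothetical illustration of that stability point: pick the "top" features on each walk-forward window and check how much the selected set overlaps between consecutive windows. The noise data, the correlation-based score (a stand-in for whatever "predictive criterion" is used) and the window sizes are all assumptions of mine:

```python
# Feature-set stability across walk-forward steps, measured by Jaccard overlap.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n, p = 6000, 20
X = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"f{i}" for i in range(p)])
y = pd.Series(rng.integers(0, 2, size=n))          # pure-noise labels

window, step, top_k = 1500, 500, 3
selected = []
for start in range(0, n - window, step):
    Xw = X.iloc[start:start + window]
    yw = y.iloc[start:start + window]
    # score each feature by |correlation| with the label on this window
    scores = Xw.apply(lambda col: abs(np.corrcoef(col, yw)[0, 1]))
    selected.append(set(scores.nlargest(top_k).index))

overlaps = [len(a & b) / len(a | b) for a, b in zip(selected, selected[1:])]
print("mean Jaccard overlap between consecutive steps:",
      round(float(np.mean(overlaps)), 3))
# On noise the overlap stays small: the "best" features jump from step to
# step, which is exactly the instability criticised in the post above.
```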

 
Aleksey Nikolayev #:

Oh yes! And what epic walls of text he and Renat Akhtyamov used to write. Epic in volume and in meaninglessness )

It's just a matter of putting predictors into a neural network without thinking....

 

If a neural network trained once doesn't work all the time, it will never be profitable!

Oh! Yes, fitting, i.e. ML, can be done even on every tick. ))

 
I don't much like discussing someone's home-brewed methods that aren't fully described. The usual problem with them is weak argumentation and a meagre experimental base. That's why you need either proofs and references to theory, or reproducible code.
And it is better to discuss something widely used.