Machine learning in trading: theory, models, practice and algo-trading - page 1776

 
Aleksey Vyazmikin:

These, as you said, "odds" can be summed, which is why they keep them that way.

Indeed... They add up these log-odds from the different trees, then compute the final probability.
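For illustration, a minimal sketch of that mechanism (the function and numbers are my own, not from any particular library): each tree contributes a raw log-odds value, the contributions are summed, and a sigmoid turns the sum into the final probability.

```python
import math

def predict_proba(tree_logodds, base_score=0.0):
    """tree_logodds: list of raw (log-odds) outputs, one per tree."""
    raw = base_score + sum(tree_logodds)   # sum of log-odds across trees
    return 1.0 / (1.0 + math.exp(-raw))    # sigmoid -> probability

# Example: three trees voting mildly "up"
print(predict_proba([0.4, -0.1, 0.3]))     # ~0.65
```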
 
Aleksey Vyazmikin:

Yes, on the new data, but now I realize the target is wrong: I took the actual ZZ vector with an offset, which is incorrect.

I'll have to draft a script to get the target out.

So what's up? What's the result?

 
Maxim Dmitrievsky:

I saw it somewhere in the tutorials... I think it's more convenient to do it during pre-training, or something along those lines.

Maxim, you seem to be doing clustering now.
This article shows that a random forest is similar to clustering.

https://habr.com/ru/company/ods/blog/324402/

Section "Similarity of random forest to k-nearest neighbor algorithm".

Open Machine Learning Course. Topic 5. Compositions: bagging, random forest
  • habr.com
The fifth article of the course is devoted to simple composition methods: bagging and random forest. You will learn how to obtain the distribution of the mean of a population when you only have information about a small part of it; see how composing algorithms reduces variance and thus improves model accuracy; we'll also look at...
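The "forest as k-NN" idea from that section can be sketched roughly as follows; scikit-learn's RandomForestClassifier and the synthetic data are my assumptions for the illustration, not what anyone in the thread necessarily used. Samples that land in the same leaves across many trees behave like nearest neighbours.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

leaves = rf.apply(X)        # shape (n_samples, n_trees): leaf index per tree
query = rf.apply(X[:1])     # leaves reached by one query point

# similarity = fraction of trees where the query shares a leaf with each sample
similarity = (leaves == query).mean(axis=1)
neighbours = np.argsort(similarity)[::-1][:5]   # the query itself comes out first
print(neighbours, similarity[neighbours])
```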
 
elibrarius:


Any Question?

 
elibrarius:

Maxim, you seem to be doing clustering now.
This article shows that a random forest is similar to clustering.

https://habr.com/ru/company/ods/blog/324402/

Section "Similarity of random forest to k-nearest neighbor algorithm".

How am I doing... I started and then I gave up.) Forest can cluster too, yeah.

As for clustering itself, it separates the increments into 3 groups quite well, including on new data. It makes sense to use them as categorical features; that's what I wanted to do.
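Roughly what that could look like, as a hedged sketch only (KMeans, the column names and the synthetic series are my assumptions, not elibrarius's actual code): cluster the increments into 3 groups, attach the cluster label as a categorical feature, and assign new data to the same groups with predict().

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

prices = pd.Series(np.cumsum(np.random.randn(1000)))   # stand-in price series
increments = prices.diff().dropna().to_frame("ret")

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(increments)
increments["ret_cluster"] = km.labels_                  # categorical feature

# new data gets assigned to the same 3 groups with km.predict(...)
print(increments["ret_cluster"].value_counts())
```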
 
Maxim Dmitrievsky:


Any Question?

 
VICTORY, brothers!!!!! They won......

THIS IS THE WIN!!!!! Brothers!!!! HORRAAAAAAAAAAAAA!!!!! Happy Holidays everyone.

Because as soon as we forget this war another one will start immediately. Let's remember it always!!!!!!!! VICTORYAAAAAAAAAAAAAAAAA!!!!!!! Pew, pew (that's me firing my imaginary TT gun up and running down the street in my officer's uniform)

 
You see, we are on the same side of the barricades! Happy Holidays, everyone!
 
mytarmailS:

So what's up? What accuracy did you get?

10 CatBoost models with tree depth 6, early stopping when 100 new trees bring no improvement, seed stepped in increments of 100.

Accuracy=70.72461682377491
Accuracy=70.86133697920415
Accuracy=70.77066992876159
Accuracy=70.64690220910988
Accuracy=70.78506152406995
Accuracy=70.88004605310499
Accuracy=70.69871195221991
Accuracy=70.59509246599985
Accuracy=70.58501834928403
Accuracy=70.71454270705908

Training sample: 80% of 2018 and 2019, with a 20% sample to control the learning stop. Independent sample: January-May 2020.

If you torture the sample with different partitioning methods and build more models, I think you can get 72.
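A minimal sketch of that setup, assuming the CatBoost Python API (the function name, data splits and exact seed values are illustrative, not Aleksey's actual script): depth-6 models, early stopping after 100 non-improving trees, and the seed stepped by 100 for each of the 10 models.

```python
from catboost import CatBoostClassifier
from sklearn.metrics import accuracy_score

def fit_ensemble(X_train, y_train, X_eval, y_eval, X_test, y_test, n_models=10):
    for i in range(n_models):
        model = CatBoostClassifier(
            depth=6,
            od_type="Iter",
            od_wait=100,                 # stop after 100 trees without improvement
            random_seed=(i + 1) * 100,   # seeds 100, 200, ..., 1000 (illustrative)
            verbose=False,
        )
        model.fit(X_train, y_train, eval_set=(X_eval, y_eval))
        acc = accuracy_score(y_test, model.predict(X_test))
        print(f"Accuracy={acc * 100}")
```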

Balance of classification


 
Aleksey Vyazmikin:

10 CatBoost models with tree depth 6, early stopping when 100 new trees bring no improvement, seed stepped in increments of 100.

Training sample: 80% of 2018 and 2019, with a 20% sample to control the learning stop. Independent sample: January-May 2020.

If you torture the sample with different partitioning methods and build more models, I think you can get 72.

Classification balance.


Well... nice and plausible. I'd like to see the trading balance and a chart with the entry points.

What is the difference between these 10 models?
