Machine learning in trading: theory, models, practice and algo-trading - page 2315

 
Maxim Dmitrievsky:
He really explains it in such a way that it would be easier to ask the Napoleons in an insane asylum

In quantum physics there are no predictions; there are calculations of uncertainties, or rather probabilities.)

 
Valeriy Yastremskiy:

In quantum physics there are no predictions; there are calculations of uncertainties, or rather probabilities.)

The field of application seemed to be touched upon, but was not disclosed at all. Instead, some kind of mush of old wives' tales began.

 
Maxim Dmitrievsky:

The field of application seemed to be touched upon, but was not disclosed at all. Instead, some kind of mush of old wives' tales began.

Of today's gibberish on and around AI, less than a hundredth of a percent will make it into science...

Do you agree?

 
Valeriy Yastremskiy:

Of today's gibberish on and around AI, less than a hundredth of a percent will make it into science...

Do you agree?

Depends on what science )

There are interesting tricks for dealing with overfitting, but they didn't disclose them, and I haven't yet found anywhere else to read about it.
 
Maxim Dmitrievsky:

Depends on what science )

There are interesting tricks for dealing with overfitting, but they didn't disclose them, and I haven't yet found anywhere else to read about it.

Science is knowledge that gets used, no matter what the science is called ))))

 
Maxim Dmitrievsky:

uncertainty prediction is an interesting thing


They have a link to an article there; at first glance it's just the usual Bayesian approach plus normal distributions.

Uncertainty in Gradient Boosting via Ensembles
  • arxiv.org
For many practical, high-risk applications, it is essential to quantify uncertainty in a model's predictions to avoid costly mistakes. While predictive uncertainty is widely studied for neural networks, the topic seems to be under-explored for models based on gradient boosting. However, gradient boosting often achieves state-of-the-art results on tabular data. This work examines a probabilistic ensemble-based framework for deriving uncertainty estimates in the predictions of gradient boosting classification and regression models. We conducted experiments on a range of synthetic and real datasets and investigated the applicability of ensemble approaches to gradient boosting models that are themselves ensembles of decision trees. Our analysis shows that ensembles of gradient boosting models successfully detect anomalous inputs while having limited ability to improve the predicted total uncertainty. Importantly, we also propose a concept of a virtual ensemble to get the...
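
For reference, here is a minimal sketch of the paper's "virtual ensemble" idea as it is exposed in CatBoost. The `RMSEWithUncertainty` loss, the `posterior_sampling` flag and the `virtual_ensembles_predict` method follow CatBoost's uncertainty tutorial; treat the exact signatures and the output-column layout as assumptions to check against the current docs. The data is synthetic.

```python
import numpy as np
from catboost import CatBoostRegressor

# Toy regression data: linear signal plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X[:, 0] + 0.1 * rng.normal(size=1000)

# RMSEWithUncertainty makes the model predict a mean and a data-noise variance;
# posterior_sampling (SGLB) is what justifies slicing one model into sub-ensembles.
model = CatBoostRegressor(
    loss_function="RMSEWithUncertainty",
    posterior_sampling=True,
    iterations=500,
    verbose=False,
)
model.fit(X, y)

# Slice the single boosted model into 10 "virtual" sub-models and aggregate.
# For this loss the columns are reportedly: mean prediction, knowledge (model)
# uncertainty, and data (noise) uncertainty.
preds = model.virtual_ensembles_predict(
    X, prediction_type="TotalUncertainty", virtual_ensembles_count=10
)
mean, knowledge_unc, data_unc = preds[:, 0], preds[:, 1], preds[:, 2]
print(mean[:3], knowledge_unc[:3], data_unc[:3])
```

High knowledge uncertainty on a sample flags it as out-of-distribution, which is exactly the "detect anomalous inputs" use case the abstract describes.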
 
Aleksey Nikolayev:

They have a link to an article there; at first glance it's just the usual Bayesian approach plus normal distributions.

I just didn't understand which button to press to get better results for classification,

for regression there is just an example.

I understood that maximum-gradient sampling is used by default (a new feature of sorts),

or it's just built in by default and I don't need to do anything.

By the way, CatBoost is very good in terms of overfitting... it's very hard to make it overfit. If the dataset is crap... it will learn it poorly and won't memorize all the variants.
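
On the classification question above: the recipe from the paper that needs no special loss is an explicit ensemble of boosted models trained with different seeds, using their disagreement as knowledge uncertainty. A hedged sketch on toy data follows; "maximum gradient sampling" is taken here to mean CatBoost's MVS (minimal variance sampling) bootstrap, and the overfitting-detector settings are illustrative, not the paper's.

```python
import numpy as np
from catboost import CatBoostClassifier

# Toy binary classification data.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=1000) > 0).astype(int)

probs = []
for seed in range(5):
    clf = CatBoostClassifier(
        iterations=300,
        random_seed=seed,
        bootstrap_type="MVS",  # weights large-gradient examples; said to be
                               # the CPU default in newer CatBoost versions
        od_type="Iter",        # built-in overfitting detector:
        od_wait=50,            # stop after 50 rounds with no eval-set improvement
        verbose=False,
    )
    clf.fit(X[:800], y[:800], eval_set=(X[800:], y[800:]))
    probs.append(clf.predict_proba(X[800:])[:, 1])

probs = np.array(probs)
mean_p = probs.mean(axis=0)        # ensemble prediction
disagreement = probs.std(axis=0)   # high std = the models disagree = uncertainty
print(mean_p[:5], disagreement[:5])
```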
 

Watching another video for now


 
Valeriy Yastremskiy:

Science is knowledge that gets used, no matter what the science is called ))))

Well, there's nothing wrong with knowledge; you just need to look at the areas of application.

The boosting algorithm itself is still very cool. If only the speakers from the studio were a bit more normal.
 
Maxim Dmitrievsky:

Well, there's nothing wrong with knowledge; you just need to look at the areas of application.

The boosting algorithm itself is still very cool. If only the speakers from the studio were a bit more normal.

(These are all sophisms))) Without intermediate knowledge there is no knowledge)) Anything can happen at the intermediate stage, and the times of change usually resemble a random walk (SB)))
