Machine learning in trading: theory, models, practice and algo-trading - page 3721

 

Perplexity on probabilistic ML and non-stationarity:

Probabilistic machine learning (ML) for non-stationary time series takes into account that the properties of the series (mean, variance, autocovariance) change over time, so standard models that assume stationarity will not work.

Characteristics of non-stationarity

  • Non-stationarity shows up as trends, seasonal changes, changes in variance, and structural breaks in time-series data. These changes make forecasting difficult and call for more flexible models that can adapt to the changing characteristics of the series.

Probabilistic approach to modelling

  • Probabilistic models for non-stationary series are often built with time-varying parameters and/or include hidden states that improve the modelling of dynamics by allowing changes in the properties of the series to be monitored efficiently.

  • For example, extended variations of the Hidden Markov Model (HMM), dynamic Bayesian networks and state-space models are able to model complex non-stationarity due to their probabilistic description of transitions and noise.

  • Bayesian methods can efficiently estimate uncertainty and adapt to changing data distributions, which is critical in non-stationarity.
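As one concrete illustration of these ideas, here is a minimal sketch (my own, not from the quoted answer) of a local-level state-space model filtered with a Kalman filter: the hidden level is allowed to drift, so the filter adapts to a structural shift. The noise variances q and r below are illustrative assumptions, not values from the text:

```python
import numpy as np

def local_level_filter(y, q=0.1, r=1.0):
    """Kalman filter for a local-level state-space model:
    x_t = x_{t-1} + w_t,  w_t ~ N(0, q)   (hidden level drifts over time)
    y_t = x_t + v_t,      v_t ~ N(0, r)   (noisy observation)
    Returns filtered level estimates and their variances."""
    n = len(y)
    x = np.zeros(n)           # filtered state mean
    p = np.zeros(n)           # filtered state variance
    x_prev, p_prev = y[0], r  # crude initialisation from the first point
    for t in range(n):
        # predict: the level may have moved, so uncertainty grows by q
        p_pred = p_prev + q
        # update: blend the prediction with the new observation
        k = p_pred / (p_pred + r)           # Kalman gain
        x[t] = x_prev + k * (y[t] - x_prev)
        p[t] = (1.0 - k) * p_pred
        x_prev, p_prev = x[t], p[t]
    return x, p

# toy non-stationary series: a level shift halfway through
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
level, var = local_level_filter(y, q=0.1, r=1.0)
print(level[:3], level[-3:])  # estimates track the shift from ~0 to ~5
```

Because the filter carries a variance alongside the point estimate, it reports its own uncertainty — exactly the property the bullet above attributes to Bayesian/state-space methods.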

Bottom line

Probabilistic machine learning is suitable for non-stationary time series through:

  • Using models with time-varying parameters or hidden states,

  • the use of dynamic and adaptive probabilistic structures,

  • accounting for uncertainty and distributional variation in temporal dynamics,

which, as a result, makes it possible to build more accurate and stable forecasts on data whose properties change over time.

 

Also on combining probabilistic and causal ML approaches for non-stationary time series:

Combining probabilistic and causal machine learning for non-stationary time series builds on integrating the advantages of both approaches for more robust and interpretable analyses of dynamic data.

Fundamentals of integration

  • Probabilistic models provide a flexible and adaptive formalisation of non-stationarity, taking into account uncertainty and dynamically varying series parameters. They allow estimation of distributions and confidence intervals of predictions, taking into account noise and bias.

  • The causal approach, in turn, reveals directional cause-and-effect relationships, which lets us interpret how and why a series changes rather than only predicting its behaviour. This significantly improves the understanding of the internal dynamics of a system, especially in the presence of hidden or changing factors.
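To make the first point concrete, here is a minimal sketch (not from the quoted text) of adaptive predictive intervals: an exponentially-weighted Gaussian model whose mean and variance forget old data, so the intervals adjust when the distribution drifts. The forgetting factor lam=0.9 is an arbitrary illustrative choice:

```python
import math

def forgetful_gaussian(y, lam=0.9, z=1.96):
    """Exponentially-weighted Gaussian model: older observations are
    down-weighted by lam, so the estimated mean and variance adapt when
    the distribution of the series drifts. Yields one-step-ahead
    (mean, lower, upper) predictive intervals."""
    mean, var, out = y[0], 1.0, []
    for obs in y[1:]:
        sd = math.sqrt(var)
        out.append((mean, mean - z * sd, mean + z * sd))  # predict before seeing obs
        # update with forgetting: recent data dominates the estimates
        err = obs - mean
        mean += (1 - lam) * err
        var = lam * (var + (1 - lam) * err * err)
    return out

series = [0.1, -0.2, 0.0, 0.3, 5.0, 5.2, 4.9, 5.1]  # abrupt regime shift
for m, lo, hi in forgetful_gaussian(series)[-3:]:
    print(f"mean={m:.2f} interval=({lo:.2f}, {hi:.2f})")
```

After the shift, the mean drifts toward the new regime and the interval widens — a toy version of "estimating distributions and confidence intervals while adapting to changing data".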

Practical methods

  • Recent research at MSU and FIC IS RAS has proposed probabilistically informed machine learning methods that use mixed connectivity components to generate informative features from time series in a sliding window. This helps to account for non-linear and stochastic relationships in the data and integrates with deep neural networks including LSTMs and transformers.

  • This approach allows simultaneous modelling of the probabilistic nature of the data and extraction of causal patterns, which improves forecast quality and interpretability of results.

  • Importantly, it takes into account changes in series properties (e.g., noise structure, model parameters) over time, which is critical for correctly identifying and interpreting causal relationships within non-stationary data.
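The cited "mixed connectivity components" method is not spelled out in the post, so the sketch below shows only the generic sliding-window feature-generation pattern such methods presumably build on; the window size and the chosen statistics are my own assumptions:

```python
import numpy as np

def sliding_window_features(y, window=32, step=1):
    """Generic sliding-window feature extraction: for each window compute
    a few statistics that summarise the local (possibly changing) dynamics.
    This is NOT the 'mixed connectivity components' method from the cited
    research, whose details are not given in the post — just the common
    sliding-window pattern it builds on."""
    feats = []
    for start in range(0, len(y) - window + 1, step):
        w = y[start:start + window]
        lag1 = np.corrcoef(w[:-1], w[1:])[0, 1]  # local lag-1 autocorrelation
        feats.append([w.mean(), w.std(), lag1])
    return np.asarray(feats)  # (n_windows, 3), ready to feed an LSTM/transformer

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 100), rng.normal(0, 3, 100)])  # variance shift
X = sliding_window_features(y, window=32)
print(X.shape)  # (169, 3); the std column jumps after the regime change
```

The point of the windowing is exactly the one made above: each feature row reflects only the local noise structure, so changes in the series' properties over time show up directly in the feature matrix.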

Bottom line

A joint probabilistic and causal machine learning approach for non-stationary time series:

  • Adapts probabilistic models to account for dynamic and changing characteristics,

  • extracts causal relationships, providing explainability and insight into the internal mechanics,

  • uses algorithms to generate informative features and account for the stochastic nature of the data,

  • improves prediction accuracy and model robustness to noise and structural shifts,

which makes it possible to apply such methods to complex real-world problems with dynamic and noisy time series.

 
In general, there are more possible article topics than anyone could ever write up )
 

On probabilistic ML in a tutorial from Yandex.

Although it is mostly only a brief treatment within the classification task, it is still worth a look.

[Deleted]  
Aleksey Nikolayev #:
In general, there are more possible article topics than anyone could ever write up )
I always want to use bots as an example, because theory is theory and practice is practice :)
In general, I don't really get abstract articles built on idealized examples. And in practice, the hardest part turns out to be exactly the work that is usually ignored.
 
Maxim Dmitrievsky #:
I always want to use bots as an example, because theory is theory and practice is practice :)
In general, I don't really get abstract articles built on idealized examples. And in practice, the hardest part turns out to be exactly the work that is usually ignored.
Also true. But sometimes articles without practice but with interesting ideas (especially when it is clear how they could be applied in practice) are still worth reading.
[Deleted]  
Aleksey Nikolayev #:
Also true. But sometimes articles without practice but with interesting ideas (especially when it is clear how they could be applied in practice) are still worth reading.
There haven't been articles like that for a very long time either; it makes my eyes bleed. Especially the article series that drag on for their authors' entire lives.
 
Maxim Dmitrievsky #:
There haven't been articles like that for a very long time either; it makes my eyes bleed. Especially the article series that drag on for their authors' entire lives.

))

I meant articles in general, not only on this resource. Sometimes it happens that there seems to be practice, but only formally to tick the box. And sometimes (very rarely) it seems that nothing about practice is written, but behind every line of theory you can feel the experience of practice. As they say, there is nothing more practical than good theory )

[Deleted]  
Aleksey Nikolayev #:

))

I meant articles in general, not only on this resource. Sometimes it happens that there seems to be practice, but only formally to tick the box. And sometimes (very rarely) it seems that nothing about practice is written, but behind every line of theory you can feel the experience of practice. As they say, there is nothing more practical than good theory )

On other resources the passions have also died down :) almost all ML models have already been bolted onto charts in various ways, without much effect.
 
Maxim Dmitrievsky #:
On other resources the passions have also died down :) almost all ML models have already been bolted onto charts in various ways, without much effect.

Imho, it's the usual situation where advertising hype gives way to routine everyday work. If anyone really is fishing via ML, they are unlikely to share their knowledge of the fishing spots.

Incidentally, something similar is happening with AI right now. The howling about AI chatbots soon seizing power has almost completely died down, and imminent AGI/ASI comes up much less often. At most, they periodically roll out the next models that have gained another trillion percent on the next murky benchmarks. And the Bezoses of the world quietly whine about an AI bubble, since this party is mostly at their expense.