Discussing the article: "Gaussian Processes in Machine Learning: Regression Model in MQL5"

 

Check out the new article: Gaussian Processes in Machine Learning: Regression Model in MQL5.

We will review the basics of Gaussian processes (GP) as a probabilistic machine learning model and demonstrate its application to regression problems using synthetic data.

Gaussian processes (GPs) are a Bayesian non-parametric modeling framework widely used in machine learning for regression and classification problems. Unlike many traditional models that provide only point forecasts, GPs generate a full probability distribution over the predicted values. This allows the model to provide not only point predictions but also uncertainty estimates, typically expressed as confidence intervals. This is a distinctive feature of the Bayesian approach, which combines prior knowledge with observed data to obtain a predictive distribution.
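To make the "full predictive distribution" idea concrete, here is a minimal NumPy sketch of GP regression (the article itself works in MQL5; the function names and hyperparameter values below are illustrative assumptions, not the article's code). The posterior mean gives the point forecast and the posterior variance gives the uncertainty band:

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, sigma_f=1.0):
    """Squared-exponential (RBF) covariance between two 1-D point sets."""
    d = x1[:, None] - x2[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2, length_scale=1.0):
    """GP posterior mean and variance at the test points."""
    K = rbf_kernel(x_train, x_train, length_scale) + noise**2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test, length_scale)
    K_ss = rbf_kernel(x_test, x_test, length_scale)
    L = np.linalg.cholesky(K)                       # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                            # posterior mean (point forecast)
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)      # posterior (predictive) variance
    return mean, var

# Synthetic data: a noisy sine wave, similar in spirit to the article's setup
rng = np.random.default_rng(0)
x_train = np.linspace(0, 2 * np.pi, 20)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(20)

x_test = np.array([np.pi / 2])
mean, var = gp_predict(x_train, y_train, x_test, noise=0.1)
# 95% confidence interval around the point prediction
lo, hi = mean - 1.96 * np.sqrt(var), mean + 1.96 * np.sqrt(var)
```

Note that the same computation yields both the forecast (`mean`) and its confidence interval (`lo`, `hi`), which is exactly the advantage over point-forecast models described above.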

GPs belong to the class of kernel methods, which use covariance functions (kernels) to model dependencies in the data. The ability to combine different kernels (e.g. by addition or multiplication) gives considerable flexibility in describing the candidate predictive functions. Each kernel has its own hyperparameters, which need to be optimized to achieve maximum model accuracy.
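The kernel-combination rule mentioned above can be sketched in a few lines of NumPy (an illustrative sketch, not the article's MQL5 code; kernel names and hyperparameters are assumptions). Sums and products of valid kernels are again valid kernels, i.e. the combined matrix stays a symmetric positive semi-definite covariance:

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    """RBF kernel: models smooth local variation."""
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ell) ** 2)

def periodic(x1, x2, period=1.0, ell=1.0):
    """Periodic kernel: models repeating structure."""
    d = np.pi * np.abs(x1[:, None] - x2[None, :]) / period
    return np.exp(-2.0 * (np.sin(d) / ell) ** 2)

def combined(x1, x2):
    # Sum encodes "smooth trend OR seasonality";
    # product encodes "locally periodic" behavior.
    return rbf(x1, x2, ell=2.0) + rbf(x1, x2, ell=2.0) * periodic(x1, x2, period=1.0)

x = np.linspace(0, 3, 30)
K = combined(x, x)
# A valid covariance matrix must be symmetric and positive semi-definite;
# a tiny jitter guards against round-off when checking eigenvalues.
eigvals = np.linalg.eigvalsh(K + 1e-10 * np.eye(len(x)))
```

Each of `ell` and `period` here is a hyperparameter of the kind the article optimizes, typically by maximizing the marginal likelihood.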

In this article, we will examine the forecasting process of a Gaussian process regression model in detail, clearly demonstrating how GPs make it possible not only to generate accurate forecasts but also to comprehensively assess their uncertainty.


Author: Evgeniy Chernish

 

Interesting.

Most of the pictures won't load for some reason.

 
Check again, please
 
Rashid Umarov #:
Check again, please.
Thank you, all the pictures are now loaded.
 
Looking forward to the sequel.
 
There is a strong relation between Gaussian processes for regression and the Wiener-Khinchin theorem: https://danmackinlay.name/notebook/wiener_khintchine.html and https://www.numberanalytics.com/blog/wiener-khinchin-theorem-guide. It would be great if you could continue in this direction to enlighten us.
[Deleted]  
A mathematically beautiful tool, but it has turned out to be niche, like, for example, support vector machines. In reality, you don't hear of it being used anywhere at all :) All models based on Gaussian mixtures are slow and perform poorly on big data.
 
nevar #:
There is a strong relation between Gaussian processes for regression and the Wiener-Khinchin theorem: https://danmackinlay.name/notebook/wiener_khintchine.html and https://www.numberanalytics.com/blog/wiener-khinchin-theorem-guide. It would be great if you could continue in this direction to enlighten us.
Fourier analysis is more about stationarity and linear relationships. It is easier to work in the time domain with ARIMA models, which are in some sense equivalent to Fourier analysis.

But GPs are about searching for non-linear relationships; in this sense they are not far from neural networks such as MLPs, but with the added ability to extrapolate and to build confidence intervals around forecasts.

Therefore, I do not plan to cover Fourier analysis, but will continue with GPs.

 
Maxim Dmitrievsky #:
... support vector machines. In reality, you don't hear of it being used anywhere at all :) All models based on Gaussian mixtures are slow and perform poorly on big data.
It's not a very popular tool, of course, but I see it as a promising one. What attracts me is that once you understand the kernel approach, you get a single coherent point of view on data analysis: regression, classification, kernel density estimation, selection of significant features, statistical tests for independence, and so on.


[Deleted]  
Evgeniy Chernish #:
It's not a very popular tool, of course, but I see it as a promising one. What attracts me is that once you understand the kernel approach, you get a single coherent point of view on data analysis: regression, classification, kernel density estimation, selection of significant features, statistical tests for independence, and so on.


Anyway interesting :)