Discussing the article: "Econometric tools for forecasting volatility: GARCH model" - page 2

 
The tests show no significant differences; the residuals are recognised as stationary for any reasonable filtering parameters :)
 
Maxim Dmitrievsky #

Since splines don't work on new data, you can redo it with an HP filter or any other filter, if you want to build some specific model.
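For reference, a rough sketch of what the HP-filter variant could look like using statsmodels; the toy series and the lamb value are illustrative placeholders, not values from the original code:

import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

# toy price series; substitute your own data
prices = pd.Series(np.cumsum(np.random.normal(0.0, 1.0, 500)))

# hpfilter returns (cycle, trend); lamb controls how smooth the trend is
cycle, trend = hpfilter(prices, lamb=1600)

residuals = prices - trend   # equal to `cycle`, kept explicit for clarity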

mytarmailS #

What's stopping you from training a regular linear regression???
Why do you even bother with these splines? There are a million better methods out there.
 
Maxim Dmitrievsky #:

I made it with the DeepSeek LLM, with God's help. You can substitute your own data.

Explanation:

To make the residuals as close to a normal distribution as possible during optimisation, a goodness-of-fit test (e.g. the Shapiro-Wilk or Kolmogorov-Smirnov test) can be used to assess the normality of the residuals. The parameters k and s can then be optimised to minimise the deviation of the residuals from a normal distribution.

  1. Error function that accounts for the normality of the residuals: a new function, spline_error_with_normality, is introduced; it calculates the residuals and uses the Shapiro-Wilk test to assess their normality. The negative p-value is minimised, which maximises the normality of the residuals.

  2. Optimisation: minimize is used to optimise the parameters k and s based on the new error function.

This approach allows the spline parameters to be adjusted so that the distribution of the residuals is as close to normal as possible, which can improve the quality of the model and the interpretability of the results.
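A rough sketch of how this could look, assuming k and s are the degree and smoothing factor of SciPy's UnivariateSpline and Nelder-Mead is used for the optimisation (details may differ from the original notebook):

import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.optimize import minimize
from scipy.stats import shapiro

def spline_error_with_normality(params, x, y):
    # Fit a smoothing spline and return the negative Shapiro-Wilk p-value of its residuals
    k = int(round(min(max(params[0], 1), 5)))   # spline degree must stay in [1, 5]
    s = max(params[1], 0.0)                     # smoothing factor must be non-negative
    spl = UnivariateSpline(x, y, k=k, s=s)
    residuals = y - spl(x)
    _, p_value = shapiro(residuals)
    return -p_value                             # minimising this maximises normality of the residuals

# toy data; substitute your own series
x = np.linspace(0, 10, 300)
y = np.sin(x) + 0.1 * x**2 + np.random.normal(0, 0.3, x.size)

res = minimize(spline_error_with_normality, x0=[3, 10.0], args=(x, y), method="Nelder-Mead")
print("best k, s:", res.x, "p-value:", -res.fun)

In practice it may be more stable to fix the degree k and optimise only the smoothing factor s.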

Since splines don't work on new data, you can redo it with an HP filter or any other filter, if you want to build some specific model.

When I tried to run it, I got an error on line 49: name 'norm' is not defined. The problem is probably my inexperience with Colab. But the general idea is quite clear from the code.

The main problem is that splines (as with any other attempt to build a deterministic function) do not work on new data. Therefore, in serious firms working with options, imho, serious mathematicians usually build serious stochastic models for volatility, similar in spirit to the one in the article under discussion. At the same time, when you look at the reasoning of small options traders, you get the feeling that behind it lie ideas about the determinism of volatility fluctuations, similar in spirit to the ideas from Stepanov's article.
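For comparison with the article's approach, a minimal sketch of fitting a stochastic volatility model of the GARCH family with the third-party arch package on synthetic returns, purely as an illustration:

import numpy as np
import pandas as pd
from arch import arch_model

# toy returns; substitute real log-returns
returns = pd.Series(np.random.normal(0.0, 1.0, 1000))

# GARCH(1,1) with a constant mean; the conditional variance is modelled stochastically
model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
fitted = model.fit(disp="off")

# out-of-sample variance forecast for the next 5 steps
forecast = fitted.forecast(horizon=5)
print(forecast.variance.iloc[-1])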

 
Aleksey Nikolayev #:

When I tried to run it, I got an error on line 49: name 'norm' is not defined. The problem is probably my inexperience with Colab. But the general idea is quite clear from the code.

The main problem is that splines (as with any other attempt to build a deterministic function) do not work on new data. Therefore, in serious firms working with options, imho, serious mathematicians usually build serious stochastic models for volatility, similar in spirit to the one in the article under discussion. At the same time, when you look at the reasoning of small options traders, you get the feeling that behind it lie ideas about the determinism of volatility fluctuations, similar in spirit to the ideas from Stepanov's article.

Yes, fixed: the library was not imported.

needed:
from scipy.stats import shapiro, norm
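Presumably shapiro is used for the Shapiro-Wilk test and norm as the reference distribution, e.g. for the Kolmogorov-Smirnov check mentioned above; a small stand-alone sketch with synthetic residuals:

import numpy as np
from scipy.stats import shapiro, norm, kstest

residuals = np.random.normal(0.0, 1.0, 200)   # stand-in for the spline residuals

_, sw_p = shapiro(residuals)                   # Shapiro-Wilk criterion

# Kolmogorov-Smirnov against a normal fitted to the residuals, the alternative criterion
mu, sigma = residuals.mean(), residuals.std(ddof=1)
_, ks_p = kstest(residuals, norm(loc=mu, scale=sigma).cdf)

print(sw_p, ks_p)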

Well, I use it for other purposes (labelling trades on history), so I run it through various curves and see what comes out :)

You can compare it with a ZigZag, where the labelling is done by vertices. Here the labelling can be done by deviations from the spline.
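A rough sketch of such deviation-based labelling, assuming a fitted curve trend over prices; the threshold and the +1/-1 encoding are arbitrary illustrative choices:

import pandas as pd

def label_by_deviation(prices: pd.Series, trend: pd.Series, threshold: float) -> pd.Series:
    # +1 where price is well below the curve (candidate buy),
    # -1 where price is well above it (candidate sell), 0 otherwise
    deviation = prices - trend
    labels = pd.Series(0, index=prices.index)
    labels[deviation < -threshold] = 1
    labels[deviation > threshold] = -1
    return labels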

Well, this is just idle musing; it doesn't relate to the subject of the article.

 
mytarmailS #:
What's stopping you from training a regular linear regression???
Why do you even bother with these splines? There are a million better methods out there.

I already wrote in the machine learning thread that linear regression turned out worse for my problem. Besides, a spline is also built from regressions (piecewise).

That is, I don't predict anything with this spline. I use the curves to label deals on history.