The Sultonov Regression Model (SRM), claimed to be a mathematical model of the market. - page 27

 
gpwr:

In a random walk it is the price increments that follow a normal distribution, not the price itself.

You have now characterised one particular class of random walk. There are at least three.
 
TheXpert:
Where can you get one?

It doesn't exist. I gave this example to show that one can trade knowing only the statistics of price behaviour, forgetting about the complicated market models on which, for example, (18), trigonometric and polynomial regressions, and neural networks are based.
 
anonymous:

You have now characterised one particular class of random walk. There are at least three.


I characterised the most commonly used class of random walk. Here is one from the English Wikipedia (the Russian one is temporarily offline):

A random walk having a step size that varies according to a normal distribution is used as a model for real-world time series data such as financial markets. The Black-Scholes formula for modeling option prices, for example, uses a gaussian random walk as an underlying assumption.

Actually I was trying to explain that just because a random variable's increments follow some distribution (normal, uniform, etc.), it doesn't mean that the random variable itself follows the same distribution; in general it's not even the same kind of distribution :)
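This point is easy to check numerically. A minimal Python sketch (standard library only; the step distribution, step count and sample size are invented for the illustration): the increments are drawn from a uniform distribution, yet the walk itself comes out approximately normal, certainly not uniform.

```python
import random
import statistics

random.seed(1)

N_PATHS = 20_000
N_STEPS = 200

# terminal value of each walk: the sum of N_STEPS uniform increments on [-1, 1]
finals = [sum(random.uniform(-1.0, 1.0) for _ in range(N_STEPS))
          for _ in range(N_PATHS)]

mean = statistics.fmean(finals)
var = statistics.pvariance(finals)

# For a sum of N uniform[-1, 1] increments: mean 0, variance N/3.
print(f"sample mean     = {mean:+.3f}  (theory: 0)")
print(f"sample variance = {var:.1f}  (theory: {N_STEPS / 3:.1f})")

# A normal variable puts ~68.3% of its mass within one standard deviation;
# a uniform one puts only ~57.7% there. The walk behaves like the normal.
sd = var ** 0.5
within_1sd = sum(abs(x - mean) < sd for x in finals) / N_PATHS
print(f"fraction within 1 sd = {within_1sd:.3f}  (normal ~0.683, uniform ~0.577)")
```

So uniform increments produce a walk whose terminal distribution is close to Gaussian (central limit theorem), with variance that grows with the number of steps.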

 
gpwr:


I characterised the most commonly used class of random walk. Here is one from the English Wikipedia (the Russian one is temporarily offline):

A random walk having a step size that varies according to a normal distribution is used as a model for real-world time series data such as financial markets. The Black-Scholes formula for modeling option prices, for example, uses a gaussian random walk as an underlying assumption.

Actually I was trying to explain that just because a random variable's increments follow some distribution (normal, uniform, etc.), it doesn't mean that the random variable itself follows the same distribution; in general it's not even the same kind of distribution :)

Just for the record, I note that (18) operates on the price increment per unit of the calculation period and arrives at the price itself by adding a conditionally constant component, which it recalculates each time.
 
gpwr:

It doesn't exist. I gave this example to show that one can trade knowing only the statistics of price behaviour, forgetting about the complicated market models on which, for example, (18), trigonometric and polynomial regressions, and neural networks are based.
Well, why not: cointegration, for example, is quite a common statistical characteristic and is widely used in building trading systems.
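As an illustration of cointegration as a usable statistical property, here is a minimal Python sketch (standard library only; the series, coefficients and noise levels are invented for the example): two non-stationary series share one random-walk component, and the right linear combination of them, the spread, is stationary.

```python
import random
import statistics

random.seed(7)

N = 5000
w = [0.0]
for _ in range(N - 1):
    w.append(w[-1] + random.gauss(0, 1))   # shared random-walk (stochastic trend)

# two "prices" driven by the same stochastic trend plus their own stationary noise
x = [wi + random.gauss(0, 0.5) for wi in w]
y = [2.0 * wi + random.gauss(0, 0.5) for wi in w]

# the cointegrating combination cancels the shared trend: y - 2x is pure noise
spread = [yi - 2.0 * xi for xi, yi in zip(x, y)]

var_y_first = statistics.pvariance(y[: N // 2])
var_y_full = statistics.pvariance(y)
var_spread = statistics.pvariance(spread)

print(f"var(y), first half : {var_y_first:10.1f}")
print(f"var(y), full sample: {var_y_full:10.1f}  # grows with the window: non-stationary")
print(f"var(spread)        : {var_spread:10.2f}  # stays small: stationary")
```

Each series alone wanders without bound, but the spread has a fixed, small variance, which is exactly the property mean-reversion trading systems try to exploit.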
 
gpwr:


I characterised the most commonly used class of random walk. Here is one from the English Wikipedia (the Russian one is temporarily offline):

A random walk having a step size that varies according to a normal distribution is used as a model for real-world time series data such as financial markets. The Black-Scholes formula for modeling option prices, for example, uses a gaussian random walk as an underlying assumption.

Actually I was trying to explain that just because a random variable's increments follow some distribution (normal, uniform, etc.), it doesn't mean that the random variable itself follows the same distribution; in general it's not even the same kind of distribution :)

A classic coin (i.e. a random walk with uniformly distributed discrete increments) will, over an (infinite) number of realisations, give you a practically perfect discretised normal distribution by step 120 already. Remember Galton's board... )

And with normally distributed continuous increments the process can be called a Wiener process. And from there the Brownian bridge is just around the corner.

;)
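The coin claim is easy to check numerically. A minimal Python sketch (standard library only; 50,000 realisations stand in for an infinite number): the terminal position of a 120-step coin walk already matches the Gaussian 68/95 rule closely.

```python
import random
import statistics

random.seed(42)

STEPS = 120        # the "step 120" from the post
N_WALKS = 50_000

# terminal position of each walk: 120 fair coin steps of +1 or -1
finals = [sum(random.choice((-1, 1)) for _ in range(STEPS))
          for _ in range(N_WALKS)]

mean = statistics.fmean(finals)
sd = statistics.pstdev(finals)
print(f"mean = {mean:+.2f} (theory 0), sd = {sd:.2f} (theory {STEPS ** 0.5:.2f})")

# Normality check via the 68/95 rule of a Gaussian:
frac1 = sum(abs(v - mean) <= sd for v in finals) / N_WALKS
frac2 = sum(abs(v - mean) <= 2 * sd for v in finals) / N_WALKS
print(f"within 1 sd: {frac1:.3f} (normal ~0.683)")
print(f"within 2 sd: {frac2:.3f} (normal ~0.954)")
```

The terminal distribution is binomial, i.e. exactly the Galton board, and at 120 steps it is already hard to distinguish from a discretised normal curve.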

 
yosuf:
For the record, I note that (18) operates on the price increment per unit of the calculation period and arrives at the price itself by adding a conditionally constant component, which it recalculates each time.

Could you briefly describe how it differs from linear regression...
 
Roman.:

Could you briefly describe how it differs from linear regression...
Linear regression is applied when you assume a linear dependence of price on time, which is clearly not the case in general. Over a limited time interval an apparently linear dependence can sometimes appear, but trying to extrapolate that assumption will lead to significant deviations in the future. We are therefore forced to apply non-linear regression, to which the SRM belongs; as shown earlier, it also unambiguously covers the case of linear regression.
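The extrapolation problem described here can be shown with a toy example. In the Python sketch below (standard library only; the "price" is a deliberately invented saturating curve, not market data) a line is fitted by ordinary least squares on an early window where the curve looks straight, and the fit is then extrapolated forward.

```python
import math

# invented non-linear "price" path: a saturating curve
def price(t):
    return 100.0 * (1.0 - math.exp(-t / 50.0))

# fit an OLS line on the first window t = 0..19, where the curve is almost straight
ts = list(range(20))
ys = [price(t) for t in ts]
tbar = sum(ts) / len(ts)
ybar = sum(ys) / len(ys)
slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
         / sum((t - tbar) ** 2 for t in ts))

def line(t):
    return ybar + slope * (t - tbar)

for t in (10, 50, 150):
    print(f"t={t:3d}  true={price(t):7.2f}  line={line(t):7.2f}"
          f"  error={line(t) - price(t):+8.2f}")
```

Inside the fitting window the line is accurate to a fraction of a unit; far outside it the error blows up, which is the "significant deviations in the future" from the post.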
 

In this regard http://www.machinelearning.ru/wiki/index.php?title=%D0%A0%D0%B5%D0%B3%D1%80%D0%B5%D1%81%D1%81%D0%B8%D0%BE%D0%BD%D0%BD%D0%B0%D1%8F_%D0%BC%D0%BE%D0%B4%D0%B5%D0%BB%D1%8C, maybe the thread should be renamed?:

A distinction is made between a mathematical model and a regression model. In a mathematical model the analyst constructs a function that describes some known regularity. A mathematical model is interpretable: it can be explained in terms of the phenomenon under study. When building a mathematical model, a parametric family of functions is created first, and then the model is identified: its parameters are found from the measured data. The known functional relationship between the explanatory variable and the response variable is the main difference between mathematical modelling and regression analysis.

The disadvantage of mathematical modelling is that measured data are used for verification, but not for model building, which can lead to an inadequate model. It is also difficult to obtain a model of a complex phenomenon in which a large number of different factors are interrelated.

A regression model combines a broad class of universal functions that describe some regularity. The model is built mainly from measured data rather than from knowledge of the properties of the phenomenon under study. Such a model is often uninterpretable but more accurate, either because of the large number of candidate models used in constructing the optimal one or because of the high complexity of the model. Finding the parameters of a regression model is called model training.

Disadvantages of regression analysis: models with too little complexity may be inaccurate, and models with excessive complexity may be overfitted.

Examples of regression models: linear functions, algebraic polynomials, Chebyshev series, feed-forward neural networks such as Rosenblatt's single-layer perceptron, radial basis functions, etc.

Both the regression model and the mathematical model usually specify a continuous mapping. The continuity requirement comes from the class of problems being solved: most often these are descriptions of physical, chemical and other phenomena, where continuity arises naturally. Sometimes monotonicity, smoothness, measurability and other restrictions are imposed on the mapping. In theory nothing forbids working with functions of arbitrary kind and allowing models not only to have discontinuities but also to take values on a finite, unordered set of a free variable, i.e. turning regression problems into classification problems.

When solving regression-analysis problems, the following questions arise.
How should the type and structure of the model be chosen, and to which family should it belong?
What is the data-generation hypothesis, and what is the distribution of the random variable?
What target function should be used to assess the quality of the approximation?
How should the model parameters be found, and what should the parameter-optimisation algorithm be?
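The complexity trade-off mentioned in the excerpt can be illustrated with a toy Python example (standard library only; the data and the three models are invented for the illustration): a constant underfits, an ordinary least-squares line matches the true law, and a 1-nearest-neighbour model memorises the training noise and overfits.

```python
import random

random.seed(3)

def make_data(n):
    xs = [random.uniform(0, 10) for _ in range(n)]
    ys = [2.0 * x + random.gauss(0, 1.0) for x in xs]   # true law: y = 2x + noise
    return xs, ys

xtr, ytr = make_data(40)    # training sample
xte, yte = make_data(200)   # independent test sample

def mse(pred, xs, ys):
    return sum((pred(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# too simple: a constant (the training mean) ignores the trend entirely
ybar = sum(ytr) / len(ytr)
def const_model(x):
    return ybar

# right complexity: ordinary least-squares line, closed form
xbar = sum(xtr) / len(xtr)
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xtr, ytr))
         / sum((x - xbar) ** 2 for x in xtr))
def line_model(x):
    return ybar + slope * (x - xbar)

# too complex: 1-nearest-neighbour memorises every noisy training point
def nn_model(x):
    return min(zip(xtr, ytr), key=lambda p: abs(p[0] - x))[1]

for name, m in [("constant", const_model), ("line", line_model), ("1-NN", nn_model)]:
    print(f"{name:8s} train MSE {mse(m, xtr, ytr):7.2f}   test MSE {mse(m, xte, yte):7.2f}")
```

The nearest-neighbour model has zero training error yet loses to the line on fresh data, which is exactly what "excessive complexity may be overfitted" means.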

 
yosuf:
Linear regression is applied when you assume a linear dependence of price on time, which is clearly not the case in general. Over a limited time interval an apparently linear dependence can sometimes appear, but trying to extrapolate that assumption will lead to significant deviations in the future. We are therefore forced to apply non-linear regression, to which the SRM belongs; as shown earlier, it also unambiguously covers the case of linear regression.


Exactly non-linear? Is it regression with a gamma function? Or is it still linear regression, only not with a straight line but with a gamma function?

In any case, Yusuf, you have not discovered anything. Mathematics has long had regression: linear, non-linear, with a fifth-degree curve, with any other function.
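On the "still linear?" question: a regression can be linear in its parameters even when the fitted curve is a gamma-shaped function of time. A minimal Python sketch (standard library only; the basis function g and the coefficients are invented for the example, not taken from (18)) fits y = a*g(t) + b by closed-form ordinary least squares on the transformed feature g(t).

```python
import math
import random

random.seed(11)

# a gamma-density-shaped basis function (hypothetical stand-in for the model's curve)
def g(t, k=3.0, theta=10.0):
    return t ** (k - 1) * math.exp(-t / theta) / (math.gamma(k) * theta ** k)

ts = [random.uniform(0, 60) for _ in range(300)]
ys = [3000.0 * g(t) + 5.0 + random.gauss(0, 0.2) for t in ts]   # true a=3000, b=5

# OLS on the transformed feature x = g(t): the problem is linear in a and b
xs = [g(t) for t in ts]
xbar = sum(xs) / len(xs)
ybar = sum(ys) / len(ys)
a = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
b = ybar - a * xbar
print(f"recovered a = {a:.1f} (true 3000), b = {b:.2f} (true 5)")
```

The curve y(t) is highly non-linear in t, but because the unknowns a and b enter linearly, this is ordinary linear regression in the statistical sense, which is precisely the distinction the post is asking about.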

Reason: