The Sultonov Regression Model (SRM), claiming to be a mathematical model of the market - page 40

 
HideYourRichess:
The sacred question: has a lot of money been made with this regression? Isn't it time to think about that?
First you have to develop a belief in some kind of regression, and only then comes the money-making stage.
 


Roman.:

Would you briefly describe the differences from linear regression...

yosuf 12.07.2012 09:21
Linear regression (LR) applies when you assume a linear dependence of price on time, which is clearly not the case in general. In a limited time interval a linear dependence can sometimes appear, but relying on that assumption will lead to significant deviations in the future. We are therefore forced to apply non-linear regression, to which SRM belongs, and, as shown earlier, SRM unambiguously covers the case of linear regression as well.

Addition to the above:

Here is an example of LR and SRM processing the results of a discrete-series simulation using the iterative algorithm https://forum.mql4.com/ru/50108/page5, from which we can see that LR takes the researcher beyond the possible domain of the results (chart not reproduced here).
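To make the contrast concrete, here is a minimal sketch in Python comparing a straight-line fit with a Gamma-CDF-shaped nonlinear fit. The functional form A * P(n, t/tau), with P the regularized lower incomplete gamma function, is an assumption standing in for SRM's equation (18), whose exact form is in the cited article; the data are toy values, not the thread's series.

```python
import numpy as np
from scipy.special import gammainc        # regularized lower incomplete gamma P(a, x)
from scipy.optimize import curve_fit

def gamma_model(t, A, n, tau):
    """Assumed SRM-like curve (illustrative): a scaled Gamma CDF in time."""
    return A * gammainc(n, t / tau)

# Toy series: a saturating process plus noise, observed over 90 steps.
rng = np.random.default_rng(0)
t = np.arange(90, dtype=float)
y = 0.9 * gammainc(3.0, t / 4.0) + rng.normal(0.0, 0.05, size=90)

# Linear regression: a straight line, unbounded in both directions.
slope, intercept = np.polyfit(t, y, 1)

# Gamma-shaped regression: bounded between 0 and A.
popt, _ = curve_fit(gamma_model, t, y, p0=(1.0, 2.0, 5.0),
                    bounds=([0.0, 0.01, 0.01], [2.0, 50.0, 50.0]))

# Extrapolate past the sample: the line keeps climbing out of the range
# the process can actually attain, while the Gamma curve saturates.
print("linear at t=200:", intercept + slope * 200.0)
print("gamma  at t=200:", gamma_model(200.0, *popt))
```

The point the missing chart was meant to make survives in the printout: extended past the sample, the straight line leaves the attainable range, while the saturating curve does not.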

 
yosuf:


[quotes the exchange with Roman above]

Thank you, Yusuf. I will read more in the sources myself.

 
avatara:

The merits of the Sultonov model can and should include its optimality, in a broad sense, with respect to the number of degrees of freedom: the number of model parameters is fixed without loss of accuracy.

Who argues? Do polynomials have that property?

;)
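avatara's point lends itself to a quick illustration: each extra polynomial degree adds a parameter, and the added flexibility is paid for outside the sample. A minimal sketch on toy data (everything here is illustrative, nothing is from the thread):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.arange(30, dtype=float)
# Toy data: a saturating curve plus noise.
y = 1.0 - np.exp(-x / 8.0) + rng.normal(0.0, 0.05, size=30)

for deg in (1, 3, 7):
    coeffs = np.polyfit(x, y, deg)     # a degree-d polynomial has d + 1 parameters
    at60 = np.polyval(coeffs, 60.0)    # extrapolate well past the data range
    print(f"degree {deg} ({deg + 1} params): value at x=60 -> {at60:+.2f}")
# In-sample error only falls as the degree grows, but the extrapolated values
# can swing far from the data; a fixed-form model keeps its few parameters.
```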

In SRM, in the course of deriving (18), one of the problems of applied statistics, the determination of the parameters of the Gamma distribution, is also solved in the form of relations (12-14), namely: http://www.aup.ru/books/m163/2_2_1.htm

"In most cases there are no analytical solutions, it is necessary to apply numerical methods to find the GMD. This is the case, for example, with samples from a Gamma distribution or a Weibull-Gnedenko distribution. In many works, the system of maximum likelihood equations is solved by some iterative method ([8], etc.) or the likelihood function of type (8) is directly maximized (see [9], etc.).

However, the application of numerical methods generates numerous problems. The convergence of iterative methods requires justification. In a number of examples, the likelihood function has many local maxima, and therefore the natural iterative procedures do not converge [10]. For VNII railway steel fatigue test data the maximum likelihood equation has 11 roots [11]. Which of the eleven to use as a parameter estimate?

As a consequence of the above difficulties, works on proving the convergence of algorithms for finding maximum likelihood estimates for specific probability models and specific algorithms began to appear. An example is the paper [12].

However, the theoretical proof of convergence of an iterative algorithm is not everything. The question arises about a reasonable choice of the moment of stopping the computation due to achieving the required accuracy. It is unsolved in most cases.

But that's not all. Calculation accuracy must be correlated with the amount of sampling - the larger it is, the more accurate parameter estimates must be found - otherwise we cannot speak about the validity of an evaluation method. Moreover, as the sample size increases, it is necessary to increase the number of digits used in a computer and switch from single- to double-precision calculations and so on, again for the sake of achieving consistency of estimates.

Thus, in the absence of explicit formulas for maximum likelihood estimates, there are a number of computational problems associated with estimating the OLS. Specialists in mathematical statistics allow themselves to ignore all of these problems when discussing PMO in theoretical terms. Applied statistics, however, cannot ignore them. The problems noted call into question the feasibility of the practical use of WMD.

There is no need to absolutise the WMD. Apart from them, there are other types of estimates that have good statistical properties. Examples are single-step estimators (SSE estimators).

Many types of estimates have been developed in applied statistics. Let us mention quantile estimators. They are based on an idea similar to the method of moments, only instead of sample and theoretical moments the sample and theoretical quantiles are equated. Another group of estimators is based on the idea of minimising the distance (difference index) between the empirical data and the element of the parametric family. In the simplest case, the Euclidean distance between empirical and theoretical histograms is minimized, or more precisely, the vectors composed of the heights of histogram bars."

Now these problems for the parameters of the Gamma distribution are solved analytically, in the form of relations (12-14) https://www.mql5.com/ru/articles/250, and there is no need to look for methods of numerical estimation. It should be proposed to introduce them into GOST, as was done for the binomial distribution (from the same source): "For this reason, GOST 11.010-81 uses unbiased estimates, and not MLEs, for estimating the parameters of the negative binomial distribution [7]. It follows from the above that MLEs can, if at all, be preferred to other types of estimates only at the stage of studying the asymptotic behaviour of the estimates."
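For readers who want to see the contrast the quoted passage describes, here is a minimal sketch for the Gamma distribution: the method of moments gives explicit formulas (k = mean^2/variance, theta = variance/mean), while the maximum likelihood estimates are found iteratively. The moment estimators below are the standard textbook ones, shown only as a stand-in; they are not relations (12-14) of the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.gamma(shape=2.5, scale=1.5, size=2000)   # synthetic Gamma data

# Method of moments: explicit formulas, no iteration, no convergence issues.
m, v = sample.mean(), sample.var()
k_mom, theta_mom = m * m / v, v / m

# Maximum likelihood: SciPy solves the likelihood equations numerically
# (location fixed at 0 so only shape and scale are estimated).
k_mle, loc, theta_mle = stats.gamma.fit(sample, floc=0)

print(f"moments: k = {k_mom:.3f}, theta = {theta_mom:.3f}")
print(f"MLE    : k = {k_mle:.3f}, theta = {theta_mle:.3f}")
```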

 
yosuf:
Now tell me yourself, hand on heart: has the forecast I made and posted on 10.07.12 at 19:14 https://forum.mql4.com/ru/50108/page20, in a completely non-obvious situation, proved fully correct or not?

At this point in time, part of the forecast has been confirmed (if I understand the meaning of the indicator correctly). However, this is only one prediction, and that is not enough to draw any conclusions.
Also, it's not clear how to set SL and TP, which is extremely important.
 
yosuf:


....

Here is an example of LR and SRM processing the results of a discrete-series simulation using the iterative algorithm https://forum.mql4.com/ru/50108/page5, from which we can see that LR takes the researcher beyond the possible domain of the results (chart not reproduced here).

Where is this discrete series? The yellow dots? If the yellow dots, how did the linear regression come out so slanted?
 
Integer:
Where is this discrete series? The yellow dots? If the yellow dots, how did the linear regression come out so slanted?

Here is the data from https://forum.mql4.com/ru/50108/page4, obtained as described at https://forum.mql4.com/ru/50108/page5; do the math and see for yourself:

anonymous 10.07.2012 11:58 am.


Yusuf, try using your model to continue the following series at least ten steps ahead:

101101100011101100011101100010010011100010011100010011101101100010010011100010011101101100

p.s. This series is not random. I will reveal the algorithm and further values of the series after I receive your prediction.

xi Yi Yn L
0,00000001 1,0000 0,00000001 -0,411673682
1,00000001 0,0000 0,071581228 -0,392656547
2,00000001 1,0000 0,075244112 -0,373639413
3,00000001 1,0000 0,09192784 -0,354622278
4,00000001 0,0000 0,130452259 -0,335605143
5,00000001 1,0000 0,192774 -0,316588009
6,00000001 1,0000 0,273940135 -0,297570874
7,00000001 0,0000 0,365335416 -0,27855374
8,00000001 0,0000 0,458061228 -0,259536605
9,00000001 0,0000 0,545051494 -0,240519471
10,00000001 1,0000 0,621835168 -0,221502336
11,00000001 1,0000 0,68638294 -0,202485201
12,00000001 1,0000 0,738521184 -0,183468067
13,00000001 0,0000 0,77925761 -0,164450932
14,00000001 1,0000 0,810202137 -0,145433798
15,00000001 1,0000 0,833148102 -0,126416663
16,00000001 0,0000 0,849810912 -0,107399529
17,00000001 0,0000 0,861691707 -0,088382394
18,00000001 0,0000 0,870027242 -0,06936526
19,00000001 1,0000 0,875792141 -0,050348125
20,00000001 1,0000 0,879728335 -0,03133099
21,00000001 1,0000 0,882385057 -0,012313856
22,00000001 0,0000 0,884159565 0,006703279
23,00000001 1,0000 0,885333612 0,025720413
24,00000001 1,0000 0,886103678 0,044737548
25,00000001 0,0000 0,886604772 0,063754682
26,00000001 0,0000 0,886928466 0,082771817
27,00000001 0,0000 0,887136159 0,101788951
28,00000001 1,0000 0,887268591 0,120806086
29,00000001 0,0000 0,887352546 0,139823221
30,00000001 0,0000 0,887405482 0,158840355
31,00000001 1,0000 0,887438693 0,17785749
32,00000001 0,0000 0,88745943 0,196874624
33,00000001 0,0000 0,887472321 0,215891759
34,00000001 1,0000 0,887480302 0,234908893
35,00000001 1,0000 0,887485223 0,253926028
36,00000001 1,0000 0,887488247 0,272943162
37,00000001 0,0000 0,887490099 0,291960297
38,00000001 0,0000 0,887491228 0,310977432
39,00000001 0,0000 0,887491916 0,329994566
40,00000001 1,0000 0,887492333 0,349011701
41,00000001 0,0000 0,887492585 0,368028835
42,00000001 0,0000 0,887492737 0,38704597
43,00000001 1,0000 0,887492829 0,406063104
44,00000001 1,0000 0,887492884 0,425080239
45,00000001 1,0000 0,887492916 0,444097373
46,00000001 0,0000 0,887492936 0,463114508
47,00000001 0,0000 0,887492948 0,482131643
48,00000001 0,0000 0,887492955 0,501148777
49,00000001 1,0000 0,887492959 0,520165912
50,00000001 0,0000 0,887492961 0,539183046
51,00000001 0,0000 0,887492963 0,558200181
52,00000001 1,0000 0,887492964 0,577217315
53,00000001 1,0000 0,887492964 0,59623445
54,00000001 1,0000 0,887492965 0,615251585
55,00000001 0,0000 0,887492965 0,634268719
56,00000001 1,0000 0,887492965 0,653285854
57,00000001 1,0000 0,887492965 0,672302988
58,00000001 0,0000 0,887492965 0,691320123
59,00000001 1,0000 0,887492965 0,710337257
60,00000001 1,0000 0,887492965 0,729354392
61,00000001 0,0000 0,887492965 0,748371526
62,00000001 0,0000 0,887492965 0,767388661
63,00000001 0,0000 0,887492965 0,786405796
64,00000001 1,0000 0,887492965 0,80542293
65,00000001 0,0000 0,887492965 0,824440065
66,00000001 0,0000 0,887492965 0,843457199
67,00000001 1,0000 0,887492965 0,862474334
68,00000001 0,0000 0,887492965 0,881491468
69,00000001 0,0000 0,887492965 0,900508603
70,00000001 1,0000 0,887492965 0,919525737
71,00000001 1,0000 0,887492965 0,938542872
72,00000001 1,0000 0,887492965 0,957560007
73,00000001 0,0000 0,887492965 0,976577141
74,00000001 0,0000 0,887492965 0,995594276
75,00000001 0,0000 0,887492965 1,01461141
76,00000001 1,0000 0,887492965 1,033628545
77,00000001 0,0000 0,887492965 1,052645679
78,00000001 0,0000 0,887492965 1,071662814
79,00000001 1,0000 0,887492965 1,090679948
80,00000001 1,0000 0,887492965 1,109697083
81,00000001 1,0000 0,887492965 1,128714218
82,00000001 0,0000 0,887492965 1,147731352
83,00000001 1,0000 0,887492965 1,166748487
84,00000001 1,0000 0,887492965 1,185765621
85,00000001 0,0000 0,887492965 1,204782756
86,00000001 1,0000 0,887492965 1,22379989
87,00000001 1,0000 0,887492965 1,242817025
88,00000001 0,0000 0,887492965 1,261834159
89,00000001 0,0000 0,887492965 1,280851294
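The anonymous poster's claim that the series is not random can be probed without any regression. Here is a minimal sketch that counts repeated substrings of the 90-symbol string quoted above; in an i.i.d.-random 0/1 string of this length, long substrings would rarely repeat:

```python
from collections import Counter

# The series from the anonymous post above, 90 symbols.
s = ("101101100011101100011101100010010011100010011100"
     "010011101101100010010011100010011101101100")

for k in (6, 9, 12):
    grams = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    repeats = sum(c for c in grams.values() if c > 1)
    print(f"k={k}: {len(grams)} distinct substrings, "
          f"{repeats} occurrences belong to repeated ones")
```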

 
yosuf:

[quotes the post with the data table above]


Excuse me, but it seems you are unable to answer the most elementary question. Read my question again and answer it.
 
Is the second column Yi? Is that the series?
 
Integer:
Where is this discrete series? The yellow dots? If the yellow dots, how did the linear regression come out so slanted?
Yes, yellow dots.
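As a closing aid to Integer's question, a minimal sketch in the spirit of yosuf's "do the math and see for yourself": an ordinary least-squares line fitted to the 0/1 series (the Yi column) against the index xi. Its endpoints can be compared directly with the table's L column, which runs from -0.4117 at x=0 to 1.2809 at x=89.

```python
import numpy as np

# The discrete series from the thread (90 values, 0/1).
y = np.array([int(c) for c in
              "101101100011101100011101100010010011100010011100"
              "010011101101100010010011100010011101101100"], dtype=float)
x = np.arange(len(y), dtype=float)

slope, intercept = np.polyfit(x, y, 1)
print(f"OLS: y = {intercept:.4f} + {slope:.6f} * x")
print(f"fitted value at x=0 : {intercept:.4f}")
print(f"fitted value at x=89: {intercept + slope * 89:.4f}")
# An OLS line always passes through (mean(x), mean(y)); for a roughly
# balanced 0/1 series that mean is about 0.5, so the endpoints printed
# here can be set against the table's L column directly.
```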