Author's dialogue. Alexander Smirnov.

 
This formula, RMS^2 = M[X^2] - (M[X])^2, is actually for the variance, i.e. for an ideal characteristic. Most likely, in its derivation the "cross" sums were explicitly or implicitly zeroed out. For large samples that is accurate enough, but for small samples the real RMS may differ. I haven't got around to checking it yet, though.
 
Mathemat:

Yura, I want to compute the RMS faster than the standard function. What if it works? For a single call the standard function will be faster than any code written in the language, but on a mass calculation (the whole chart) there are costs to be saved.


If we are talking about the RMS for linear regression, it is calculated analytically in one step by the formula

SKO^2 = D[Y] - D[X]*A^2,

where D[Y] = M[Y^2] - (M[Y])^2, D[X] = M[X^2] - (M[X])^2, and A is the coefficient of the linear regression Y = A*X + B.

So recurrence is not needed here.

PS And the cross sums are zeroed out explicitly and purely analytically. Sample size has nothing to do with it.
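Indeed, the cross sum cancels identically for any sample size: with m = M[X] = Sum(X[k])/N,

Sum( (X[k] - m)^2 ) = Sum( X[k]^2 ) - 2*m*Sum( X[k] ) + N*m^2 = Sum( X[k]^2 ) - N*m^2,

so D[X] = M[X^2] - (M[X])^2 holds exactly, not only for large samples. And a minimal one-pass sketch of the SKO formula above (the function name and the use of Close[] are illustrative, not code from this thread):

double LinRegRMS(int pos, int N)
{
   // One pass accumulates every sum needed for D[X], D[Y] and A.
   double sx = 0.0, sy = 0.0, sxx = 0.0, syy = 0.0, sxy = 0.0;
   for(int k = 0; k < N; k++)
   {
      double x = k, y = Close[pos+k];
      sx += x; sy += y; sxx += x*x; syy += y*y; sxy += x*y;
   }
   double mx = sx/N, my = sy/N;
   double dx = sxx/N - mx*mx;             // D[X]
   double dy = syy/N - my*my;             // D[Y]
   double a  = (sxy/N - mx*my)/dx;        // A = Cov(X,Y)/D[X]
   double d2 = dy - dx*a*a;               // SKO^2 = D[Y] - D[X]*A^2
   return(MathSqrt(MathMax(d2, 0.0)));    // clip tiny negative round-off
}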

 
Prival:

Still an error 2008.02.15 17:07:22 2007.01.11 12:15 OTF_1 EURUSD,M1: negative argument for MathSqrt function

Judging by the error's content, the MathSqrt(lambda*lambda*lambda*lambda+16.0*lambda*lambda) must be somewhere around here. But lambda = MathAbs(Value1[i]/Value2[i]); cannot be negative.
So the only thing I can suggest, just in case, is to initialize lambda=0.0;
and/or to write MathSqrt(MathAbs(lambda*lambda*lambda*lambda+16.0*lambda*lambda)) to get rid of this error forever.
//---- main loop
double alpha, lambda = 0.0;  // explicit initialization, no default rubbish
//********************************************************************************************
for(i = limit; i >= 0; i--)
{
   Price[i] = (High[i] + Low[i])/2.0;                             // median price
}
for(i = limit; i >= 0; i--)
{
   Value1[i] = SC*(Price[i] - Price[i+1]) + 4.0*SC*Value1[i+1];   // smoothed price increment
   Value2[i] = SC*(High[i] - Low[i]) + 4.0*SC*Value2[i+1];        // smoothed bar range
}
for(i = limit; i >= 0; i--)
{
   if(Value2[i] < Point) Value2[i] = Point;
   else                  lambda = MathAbs(Value1[i]/Value2[i]);

   alpha = (-lambda*lambda + MathSqrt(lambda*lambda*lambda*lambda + 16.0*lambda*lambda))/8.0;

   Value3[i] = alpha*Price[i] + (1.0 - alpha)*Value3[i+1];        // adaptive EMA
}
//********************************************************************************************

P.S. Generally it's nonsense. This error is probably popping up in your tester.
P.P.S. Most likely the variable double lambda; was default-initialized with some very small negative rubbish. In that case the double lambda=0.0; expression should help.
Slava teaches us: never rely on defaults. And still we don't learn!
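Both of Prival's safeguards can sit in one place. A minimal sketch, assuming the sqrt argument really has the lambda^4 + 16*lambda^2 form above (the function name is illustrative):

double SafeAlpha(double lambda)
{
   double l2 = lambda*lambda;
   // l2*l2 + 16.0*l2 is analytically non-negative, so MathAbs here is only
   // a belt-and-braces guard against rubbish in an uninitialized lambda
   return((-l2 + MathSqrt(MathAbs(l2*l2 + 16.0*l2)))/8.0);
}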
 
VBAG:
Prival:

Still an error 2008.02.15 17:07:22 2007.01.11 12:15 OTF_1 EURUSD,M1: negative argument for MathSqrt function


P.S. Generally it's nonsense. This error is probably popping up in your tester.
P.P.S. Most likely the variable double lambda; was default-initialized with some very small negative rubbish. In that case the double lambda=0.0; expression should help.
Slava teaches us: never rely on defaults. And still we don't learn!

Local geeks are always trying to reinvent the wheel.

Don't rack your brains over nothing. Among the custom indicators there is Bands.mq4 - it contains an algorithm for the RMS calculation.
 
Yurixx:

SKO^2 = D[Y] - D[X]*A^2,

where D[Y] = M[Y^2] - (M[Y])^2, D[X] = M[X^2] - (M[X])^2, and A is the coefficient of the linear regression Y = A*X + B.

So recurrence is not needed here.

There is a lot of redundancy in these formulas.
And what do you mean, recurrence is not needed - how are the sums supposed to be calculated? Or do you have an idea how to do without replacing the expectation by the sample mean?

P.S. By the way, the cross sums do not vanish by themselves. At least for me they did not. Try working not with the variance but with the "real" expression:
 
RMS for any fitted function fx:

double sq = 0.0;
for(int n = 0; n < period; n++)
{
   double dc = Close[n] - fx[n];   // deviation of price from the fit
   sq += dc*dc;
}
sq = MathSqrt(sq/period);          // RMS of the deviations
Hence, for linear regression it is:

double sq = 0.0;
for(int n = 0; n < p; n++)
{
   double lr = b + a*n;            // linear regression value at bar n
   double dc = Close[n] - lr;
   sq += dc*dc;
}
sq = MathSqrt(sq/p);
 
ANG3110:
Hence, for linear regression it is:

double sq = 0.0;
for(int n = 0; n < p; n++)
{
   double lr = b + a*n;            // linear regression value at bar n
   double dc = Close[n] - lr;
   sq += dc*dc;
}
sq = MathSqrt(sq/p);

This is by definition. If the LR is not too short, you can calculate the RMS more precisely, without any extra loop. There is code for the RMS calculation in the source of MovingLR.mq4, but it is commented out, and the RMS there is called rmsY.
 
lna01:

SKO^2 = D[Y] - D[X]*A^2,

where D[Y] = M[Y^2] - (M[Y])^2, D[X] = M[X^2] - (M[X])^2, and A is the coefficient of the linear regression Y = A*X + B.

So recurrence is not needed here.

There is a lot of redundancy in these formulas.
And what do you mean, recurrence is not needed - how are the sums supposed to be calculated? Or do you have an idea how to do without replacing the expectation by the sample mean?

P.S. By the way, the cross sums do not vanish by themselves. At least for me they did not. Try working not with the variance but with the "real" expression.


I'm very eager to know what could be superfluous in these formulas? :-)

The expectation is, of course, replaced by the mean, and the sums do have to be calculated. However, recurrence, or even a loop, is not needed. The following formula suffices:

S(X)[i+1] = S(X)[i] - X[i-N+1] + X[i+1], where S(X)[i] = Sum(X[k]; k = i-N+1, i-N+2, ..., i-1, i)

Well, unless it is this very expression you mean when you speak of recurrence? Then, of course, you are right.

As for "real expression", where do you think all these formulas come from ? Well, if you substitute into this "real expression" the finite formulas derived from the MNC for A and B, then you get just the above expression for RMS. I can give you the corresponding analytical calculations.

 
Yurixx:
lna01:


I can give you the corresponding analytical calculations.


If it's not too much trouble, please elaborate from this point. With new data the coefficients A and B change, I think, although I may be wrong :-). For LR it seems to be solved, but what about parabolic regression?
 
lna01:
This is by definition. If the LR is not too short, you can calculate the RMS more precisely, without any extra loop. There is code for the RMS calculation in the source of MovingLR.mq4, but it is commented out, and the RMS there is called rmsY.
Yes, you can calculate it once, and then just subtract the last element and add the new first one. Then it works without a loop. I did such things in MQL2, back in MT3. Even now I do it where necessary, and not only for linear regression.