Author's dialogue. Alexander Smirnov. - page 38

 
Prival:
Yurixx:
lna01:


I can provide the corresponding analytical calculations.


Could you elaborate on this part, if it's not too difficult? With the arrival of new data the coefficients A and B may change, I think, although I may be wrong :-). For LR this seems to be solved, but how do you do it for parabolic regression?

double sum = 0.0;
for (int i = 0; i < p; i++)
{
    // value of the fitted parabola at bar i
    double fx = A*i*i + B*i + C;
    // deviation of the close price from the parabola
    double dc = Close[i] - fx;
    sum += dc * dc;
}
// RMS deviation over the window of p bars
double sq = MathSqrt(sum / p);
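Regarding the parabolic case: one way it could be updated recursively (a sketch of mine, not something given in the thread) is to maintain three moment sums and shift them when a new bar arrives. The names m0, m1, m2 and yOld are hypothetical; i = 0 is the newest bar, as usual in MQL4:

// Hypothetical moment sums over the p-bar window:
//   m0 = sum(Close[i]),  m1 = sum(i*Close[i]),  m2 = sum(i*i*Close[i])
// When a new bar arrives, every old value moves from index i to i+1 and the
// oldest one, yOld (previously at index p-1), drops out. Since
// (i+1)*(i+1) = i*i + 2*i + 1, the sums can be updated without a loop.
// Order matters: m2 needs the old m1 and m0, m1 needs the old m0.
m2 += 2.0*m1 + m0 - p*p*yOld;
m1 += m0 - p*yOld;
m0 += Close[0] - yOld;    // the new bar enters at index 0 and so
                          // contributes nothing to m1 and m2

With m0, m1, m2 and the sums of powers of i (constants for a fixed p), the normal equations give A, B and C on every bar without looping over the window; tracking sum(Close[i]*Close[i]) the same way would make the RMS loop-free as well.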
 
Yurixx:


I'd very much like to know what could be superfluous in these formulas? :-)

As for the "real expression" - where do you think all these formulas come from? If you substitute into this "real expression" the final formulas for A and B derived by the least-squares method, you get the above expression for the RMS. I can provide the corresponding analytical calculations.

OK, I agree, not in these particular ones :)
By definition, recursion is calculating the next value from the previous one? Then calculating cumulative sums is the most natural recursion.
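In that spirit, the simplest illustration (a trivial sketch, names mine):

// the cumulative sum is the textbook recursion S[k] = S[k-1] + y[k]:
// every next value is obtained from the previous one by a single addition
double S[];
ArrayResize(S, p);
S[0] = Close[0];
for (int k = 1; k < p; k++)
    S[k] = S[k-1] + Close[k];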
The point is that my calculation by the "real expression" gives some inconsistency with these formulas. Here are the results for N=5 and N=20. The lines were calculated as LR + 3*RMS; for the white line the RMS was taken as sqrt((RMS^2)*N/(N-2)). The red line is by my formula, the white line is by your formula. For N=20 the red line is almost invisible, so we can assume the results coincide to good accuracy. But for N=5 the differences are quite noticeable.
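A note on that correction (my reading of it): N/(N-2) is the usual unbiased-variance adjustment for a fit with two estimated parameters, so with sum and N as in the loop above the two forms coincide:

// RMS with division by N (biased) vs. by N-2 (unbiased for a line fit):
double rmsBiased   = MathSqrt(sum / N);
double rmsUnbiased = MathSqrt(sum / (N - 2));
// rmsUnbiased is the same as MathSqrt(rmsBiased*rmsBiased * N / (N - 2))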
 
ANG3110:
Yes, you can calculate the sum once at the beginning and then simply subtract the last element and add the new first element. Then it works without a loop.

The problem is that in LRMA a and b are recalculated at each bar. That is, simply modifying the sum of errors is not enough.
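For a plain window sum the trick is indeed one line (a sketch, names mine), which is exactly why it is not enough here: the error sum also depends on a and b, and they move on every bar:

// incremental window sum: the newest bar (index 0) enters, the bar that
// slid out of the p-bar window (now at index p) leaves
sumY += Close[0] - Close[p];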
 
ANG3110:
Prival:
Yurixx:
lna01:


I can provide the corresponding analytical calculations.


Could you elaborate on this part, if it's not too difficult? With the arrival of new data the coefficients A and B may change, I think, although I may be wrong :-). For LR this seems to be solved, but how do you do it for parabolic regression?

double sum = 0.0;
for (int i = 0; i < p; i++)
{
    // value of the fitted parabola at bar i
    double fx = A*i*i + B*i + C;
    // deviation of the close price from the parabola
    double dc = Close[i] - fx;
    sum += dc * dc;
}
// RMS deviation over the window of p bars
double sq = MathSqrt(sum / p);


Coefficient B is not calculated there. Although if you add its calculation, everything seems to come back to the original variant. There is no recursion, i.e. adding to the previous value a new one calculated at step 0. ANG3110, sorry, there is no recursion here.
 
lna01:
ANG3110:
Yes, you can calculate the sum once at the beginning and then simply subtract the last element and add the new first element. Then it works without a loop.

The problem is that in LRMA a and b are recalculated at each bar. That is, simply modifying the sum of errors is not enough.
And that variant of the LRMA expression is for reading the LR endpoint directly; it is not intended for calculating the RMS.
But calculating LRMA without using the line coefficients a and b gains nothing in computational resources and impoverishes the possibilities, because in the linear regression formula b is the endpoint position and a*i is the slope. More importantly, knowing a and b, you can easily calculate the RMS. Or we can do the opposite: hold the RMS constant and let the period vary - then we get a regression like a suit tailored exactly to the size of the trend.
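"Knowing a and b, you can easily calculate the RMS" can be made concrete. A sketch under assumed names (sumY, sumXY, sumY2 are the window sums of Close[i], i*Close[i] and Close[i]*Close[i]): for a least-squares line the residuals are orthogonal to the fitted values, so the squared-error sum collapses to three terms:

// sum of squared errors of the LS line y = a*x + b via the normal equations:
//   SSE = sum(y^2) - a*sum(x*y) - b*sum(y)
double sse = sumY2 - a*sumXY - b*sumY;
double rms = MathSqrt(sse / p);   // or divide by p-2 for the unbiased form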
 
ANG3110:
and let the period vary - then we get a regression like a suit tailored exactly to the size of the trend.

If you have an indicator with this property, could you share it? I understand this is not the kind of thing that gets posted in the public domain, but if you suddenly decide to: yellow trousers, two "Ku" at every meeting, plus I'll try to get you your favourite drink for this time of day :-)

I need a parabola, I'm not interested in LR.

 
Prival:
ANG3110:
Prival:
Yurixx:
lna01:


I can provide the corresponding analytical calculations.


Could you elaborate on this part, if it's not too difficult? With the arrival of new data the coefficients A and B may change, I think, although I may be wrong :-). For LR this seems to be solved, but how do you do it for parabolic regression?

double sum = 0.0;
for (int i = 0; i < p; i++)
{
    // value of the fitted parabola at bar i
    double fx = A*i*i + B*i + C;
    // deviation of the close price from the parabola
    double dc = Close[i] - fx;
    sum += dc * dc;
}
// RMS deviation over the window of p bars
double sq = MathSqrt(sum / p);


Coefficient B is not calculated there. Although if you add its calculation, everything seems to come back to the original variant. There is no recursion, i.e. adding to the previous value a new one calculated at step 0. ANG3110, sorry, there is no recursion here.
But why do we need recursion in this case? Well, I understand when the calculations use 10-20 regressions at once - then loop-free methods become relevant, and with arrays that is solved very easily. But for one or two lines it's as if there were nothing else to do but invent recursion. I personally am sitting at my daughter's birthday party and really have nothing else to do, so I'm waiting for them to finish.
 
ANG3110:
...
Why do we need this recursion in this case? Well, I understand when the calculations use 10-20 regressions at once - then loop-free methods become relevant, and with arrays that is solved very easily. But for one or two lines it's as if there were nothing else to do but invent recursion. I personally am sitting at my daughter's birthday party and really have nothing else to do, so I'm waiting for them to finish.

Multi-currency analysis with different cycle periods. If you compute cycles (sampling periods) of 1, 2, 8, 12, 24 and 120 hours for each of 12 currencies, then calculation speed is not the least of your concerns. Although (sorry, there's no smiley with a mug or a shot glass) my daughter turns 12 on February 14, so I'm writing between toasts while entertaining the guests (who all gathered on Saturday).
 
ANG3110:
But calculating LRMA without using the line coefficients a and b gains nothing in computational resources and impoverishes the possibilities,
...
More importantly, knowing a and b, you can easily calculate the RMS. Or we can do the opposite: hold the RMS constant and let the period vary - then we get a regression like a suit tailored exactly to the size of the trend.
Actually, the LRMA algorithms from this thread do save a lot of resources. Adding the calculation of a and of the RMS to the algorithm (b is already computed in my version) will of course take extra resources, but not many. By the way, the picture above with the "half-channels" was quickly put together from my version of LRMA (that is, from MovingLR). My real interest in this thread is in polishing the brute-force algorithm of a regression recalculated on every bar; I tried keeping the RMS constant before and was not satisfied with the results.
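The "constant RMS, floating period" variant could be sketched like this (hypothetical; LrRms(p) stands for any RMS calculation over p bars, e.g. the quoted loop):

// choose the longest window whose regression still fits the target band,
// i.e. a period "tailored" to the current trend
int AdaptivePeriod(int pMin, int pMax, double targetRms)
{
    int p = pMin;
    while (p < pMax && LrRms(p + 1) <= targetRms)
        p++;
    return (p);
}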
 
What do we do with a and b? There is a proven formula for LR - there are no line coefficients in it, just trivial MAs (moving averages). Prival, I'm talking specifically about LR, let's deal with it first.
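The "trivial MAs" presumably refer to the identity this thread revolves around - the LR endpoint written through two standard averages, LRMA = 3*LWMA - 2*SMA; in MQL4 terms:

// linear-regression endpoint through stock moving averages, no explicit a, b
double lwma = iMA(NULL, 0, p, 0, MODE_LWMA, PRICE_CLOSE, 0);
double sma  = iMA(NULL, 0, p, 0, MODE_SMA,  PRICE_CLOSE, 0);
double lrma = 3.0*lwma - 2.0*sma;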