Discussion of article "3 Methods of Indicators Acceleration by the Example of the Linear Regression"
Honestly, this is old news (a "bayan"); it has been known on the MT4 forum for about five years already.
But I think it will be interesting for beginners.
The article once again shows that the statistical methods implemented in the standard indicators rule; do not underestimate them.
By the way, there are also formulas for second-order regression via moving averages.
Honestly, this is old news (a "bayan"); it has been known on the MT4 forum for about five years already.
But I think newcomers will be interested.
Yes, the article is aimed more at beginners.
The article once again shows that the statistical methods implemented in the standard indicators rule; you should not underestimate them.
That's for sure. I think that the indicators built into the terminal (iMA, etc.) work fast not only because the terminal developers know their optimisation methods, but also because they are executed as part of the exe file. That is, unlike external indicators, they are fully compiled rather than run as some p-code. And perhaps they also have some direct high-speed access to timers that is unavailable to external indicators.
I have tried using non-built-in MAs for the convolution method; although they are well optimised, they are much slower than the built-in ones.
Author's dialogue. Alexander Smirnov - this is the second discussion of this topic; the first one was somewhere back in 2006.
By the way, there are also formulas for second-order regression via moving averages.
Yes, I have seen similar topics. But nowhere was it shown clearly, step by step, how the convolution was obtained. I had to derive all the formulas myself (I filled several sheets of drafts) in order to cover all the nuances in the article. There is a tricky point with bar numbering for LWMA, for example.
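For readers who want the end result in a concrete form, here is a small sketch of the convolution weights that the least-squares value on the newest bar reduces to (my own derivation for bars numbered 0..N-1 inside the window, with illustrative names - not necessarily the article's notation):

// Sketch, illustrative names: the least-squares value on the newest bar of an
// N-bar window is a fixed weighted sum (convolution) of the last N prices.
// Index k counts bars back from the newest one; the weights sum to 1.
void BuildLRKernel(const int N, double &w[])      // w[] must have at least N elements
  {
   for(int k = 0; k < N; k++)
      w[k] = (4.0*N - 2.0 - 6.0*k)/(N*(N + 1.0)); // follows from LR = 3*LWMA - 2*SMA
  }

double ApplyLRKernel(const double &price[],       // series, index 0 = oldest
                     const double &w[],
                     const int bar,                // newest bar of the window, bar >= N-1
                     const int N)
  {
   double value = 0.0;
   for(int k = 0; k < N; k++)
      value += w[k]*price[bar - k];                // dot product with the last N prices
   return(value);
  }

The weights depend only on N, so they can be built once; the bar-numbering subtlety mentioned above is exactly the question of whether k = 0 is taken as the newest or the oldest bar of the window.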
Here is the proof - that is, a bit earlier than the link Urain gave.
I don't claim priority :). It is, generally speaking, almost a trivial fact.
P.S. If I haven't forgotten everything, the speed comparison still came out in favour of the classical calculation - but a significantly optimised one (the difference is not large, but still; see Candid's posts). However, a similar technique applied to higher-order regressions (quadratic, cubic, etc.) may well show the advantage of the "convolution" method over the classical one.
Very good and useful article, but I have one question.
If we use the Moving Totals method, it is true that the indicator gains a lot of speed, but it also means adding 4 auxiliary buffers, which increases memory consumption.
So the question is: which method is better - the Standard method, which uses less memory, or the Moving Totals method, which uses more memory but is faster?
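For context, here is a minimal sketch (illustrative names, not the article's actual buffers) of the O(1) recurrences behind a moving-totals update for the two price-dependent sums of the regression; each extra double buffer costs 8 bytes per bar, which is the memory side of the trade-off being asked about:

// Sketch, illustrative names: rolling the two price-dependent regression sums
// forward by one bar in O(1) instead of recomputing them over N bars.
void UpdateMovingTotals(const double &price[], // series, index 0 = oldest
                        const int bar,         // new bar index, bar >= N
                        const int N,           // window length
                        double &sumY,          // on entry: sum of prices over the window ending at bar-1
                        double &sumXY)         // on entry: 1*oldest + 2*... + N*newest over that same window
  {
   sumXY += N*price[bar] - sumY;          // every old weight drops by one, the new price enters with weight N
   sumY  += price[bar] - price[bar - N];  // drop the price that left the window, add the new one
  }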
Something is wrong in the implementation:
When LRMethod == LR_M_Sum
it turns out that Sx and Sxx are constants:
Sxx = ExtBufSxx[prevbar];
ExtBufSxx[bar] = Sxx;
If this is the case, why the buffers?
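The observation is easy to check: inside a window of N bars the x values never change, so both sums have closed forms and could be computed once (for example in OnInit) instead of being carried in per-bar buffers. A minimal sketch with illustrative names, assuming the bars inside the window are numbered 0..N-1 (with 1..N numbering the closed forms become N(N+1)/2 and N(N+1)(2N+1)/6):

// Sketch, illustrative names: the x-sums of a fixed N-bar window are constants.
double Sx  = 0.0;  // 0 + 1 + ... + (N-1)
double Sxx = 0.0;  // 0^2 + 1^2 + ... + (N-1)^2

void ComputeXSums(const int N)
  {
   Sx  = N*(N - 1)/2.0;
   Sxx = N*(N - 1)*(2.0*N - 1)/6.0;
  }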
Perhaps it will be even faster if you compute SMA and LWMA using the moving-sum method and obtain the result as a convolution of the two.
Moreover, it would be good to know the slope of the regression, which can also be calculated via SMA and LWMA.
I implemented it in MQL4: https://www.mql5.com/en/code/10642
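The linked code is for MQL4; as a rough MQL5-style sketch of the same idea (my own formulas from the least-squares normal equations for bars numbered 0..N-1, not necessarily what the linked implementation does), both the regression value on the newest bar and its per-bar slope can be expressed through SMA and LWMA:

// Sketch, illustrative names: least-squares line over the last N prices,
// expressed through SMA and LWMA (LWMA weights the newest price with N).
void LinRegFromMAs(const double &price[], // series, index 0 = oldest
                   const int bar,         // newest bar of the window, bar >= N-1
                   const int N,           // window length, N >= 2
                   double &lrValue,       // regression value on bar
                   double &lrSlope)       // regression slope per bar (positive = rising)
  {
   double sum = 0.0, wsum = 0.0;
   for(int k = 0; k < N; k++)
     {
      sum  += price[bar - k];
      wsum += (N - k)*price[bar - k];     // newest price gets weight N, oldest gets 1
     }
   double sma  = sum/N;
   double lwma = wsum/(N*(N + 1)/2.0);
   lrValue = 3.0*lwma - 2.0*sma;          // endpoint of the least-squares line
   lrSlope = 6.0*(lwma - sma)/(N - 1.0);  // change of the fitted line per bar
  }

In practice the two averages could themselves be maintained with moving totals, so both the value and the slope come out in O(1) per bar.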

Hello,
Thanks for this article. But could you please tell me how to call your methods inside my MQL5 code? I do not understand how to do it, and I don't understand the list of parameters inside your methods. How can I get the value returned by the linear regression at a given moment in my code after integrating yours?
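For reference, a minimal sketch of the usual way to read any custom indicator from MQL5 code via iCustom and CopyBuffer; the indicator file name "LinearRegression", the single period input (20) and buffer index 0 here are assumptions and should be replaced with whatever the compiled indicator from the article actually exposes:

// Minimal EA sketch: obtain a handle to the compiled indicator once,
// then copy the value of its buffer 0 on the current bar whenever needed.
int lrHandle = INVALID_HANDLE;

int OnInit()
  {
   // "LinearRegression" and the period value 20 are assumed name/input
   lrHandle = iCustom(_Symbol, _Period, "LinearRegression", 20);
   return(lrHandle == INVALID_HANDLE ? INIT_FAILED : INIT_SUCCEEDED);
  }

void OnDeinit(const int reason)
  {
   if(lrHandle != INVALID_HANDLE)
      IndicatorRelease(lrHandle);
  }

void OnTick()
  {
   double value[1];
   // buffer 0, starting at shift 0 (current bar), one value
   if(CopyBuffer(lrHandle, 0, 0, 1, value) == 1)
      PrintFormat("Linear regression value on the current bar: %f", value[0]);
  }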

New article "3 Methods of Indicators Acceleration by the Example of the Linear Regression" is published:
Author: Andrew