
I think we are looking at 2 different things - I was referring to the OP's requirement to "find the ave value over a number of bars". That is easy to do in a deterministic way if one maintains the sum and count, and that was my point. Min and max are also easy to collect.
You have opted for an MA (which I am not sure was the original request, but maybe...), which is of course different, but it is not difficult to do the necessary operations on a fixed-size array.
Ultimately it's about effort vs benefit - none of this is new and it is all easily done.
This was not clear to me - I never found any issues putting objects into arrays, or arrays into objects - is it something else you are trying to do?
I was thinking about this - if you have a variable to sum up the values, and another to count the values, you have the simplest/fastest (and deterministic) way to compute the mean.
Only 3 operations: 1. add the new value to the sum 2. increment the count 3. mean = new sum divided by the new count.
Of course this does not retain a complete history of all values, but that is a simple matter of adding each new entry to an array, should one need it for deeper analysis.
Alternatively, if one chooses to keep an incremental history array, just use ArraySize() instead of a count variable; all of this would occur in microseconds anyway.
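As a minimal sketch of those three operations in MQL5 (the function and variable names are only illustrative):

double sum   = 0.0;   // running total of all values seen so far
int    count = 0;     // how many values have been seen
double mean  = 0.0;   // current mean

void AddValue(const double value)
  {
   sum  += value;         // 1. add the new value to the sum
   count++;               // 2. increment the count
   mean  = sum / count;   // 3. mean = new sum divided by the new count
  }

The cost per new value is fixed, regardless of how many bars have already been processed; if a history array is kept as well, ArraySize() can stand in for the count variable as described above.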
This was not clear to me - I never found any issues putting objects into arrays, or arrays into objects - is it something else you are trying to do?
Ahm... No.
Apples and Peaches... Haha
OK, let me try to explain the issue.
Let's say you have an array of close prices, given to you by a function such as OnCalculate().
Let's say you want an object that calculates the mean over the last 15 values. When instantiating this object, you cannot pass a reference to the array and have the object store that reference, because you cannot create reference variables except as function parameters, and those go out of scope on return.
This means you cannot keep a pointer to the external array inside the object, and you are forced to pass the array to the object every single time.
Therefore you cannot write code in which the object simply holds on to the caller's array.
That's why it is necessary to keep track of the data internally, within the object itself.
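For illustration only, a sketch of that internal-tracking approach - the class name, the ring-buffer layout and the window length are my own choices, not taken from any posted code:

class CRollingMean
  {
private:
   double            m_values[];   // internal ring buffer holding the window
   int               m_size;       // window length
   int               m_count;      // values received so far (caps at m_size)
   int               m_head;       // next write position in the ring buffer
   double            m_sum;        // running sum of the values in the window

public:
                     CRollingMean(const int size)
     {
      m_size  = (size > 0) ? size : 1;
      m_count = 0;
      m_head  = 0;
      m_sum   = 0.0;
      ArrayResize(m_values, m_size);
      ArrayInitialize(m_values, 0.0);
     }

   // The caller pushes each new value; the object keeps its own copy of the data.
   void              Update(const double value)
     {
      m_sum            -= m_values[m_head];   // drop the value leaving the window
      m_values[m_head]  = value;              // store the value entering the window
      m_sum            += value;
      m_head            = (m_head + 1) % m_size;
      if(m_count < m_size)
         m_count++;
     }

   // Mean over the last m_size values (or fewer while the window is still filling).
   double            Mean() const { return (m_count > 0) ? m_sum / m_count : 0.0; }
  };

Usage would then be something like CRollingMean mean15(15); followed by mean15.Update(close[i]); inside the calculation loop - the caller feeds values one by one instead of handing over the whole array.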
I think we are looking at 2 different things - I was referring to the OP's requirement to "find the ave value over a number of bars". That is easy to do in a deterministic way if one maintains the sum and count, and that was my point. Min and max are also easy to collect.
You have opted for an MA (which I am not sure was the original request, but maybe...), which is of course different, but it is not difficult to do the necessary operations on a fixed-size array.
Ultimately it's about effort vs benefit - none of this is new and it is all easily done.
To be honest, I think there is an error in your thinking on this.
"sum and count and that was my point" -> No, that's not a mean or average over a given number of values, especially when they keep changing - or, better said, when the window of relevance keeps shifting rather than simply growing.
"Min and max are also easy to collect" -> Please show me how to do this, because I cannot think of a way to do it in O(1), or with deterministic execution time. How would you tackle this issue?
This seems to be overthinking it.
Over the years I did a lot of performance optimization, usually with very large data sets/systems, when processing could take days (yes days.. even with extremely expensive hardware clusters) and involved many disparate systems & technologies.
I met many teams who would approach the issues from this perspective: caches, buffers, OS tuning etc., and frankly it was always wrong. At best you could shave a few milliseconds off the runtime when we needed to reduce it by hours. We were taught very clearly that this is the wrong methodology (not far from you actually - in Walldorf and Munich).
The bottlenecks, and solutions, were always higher up in the application stack and the gains to be had were immense once found. I won't go into detail as it's way beyond the scope of this discussion, but suffice to say a few nanoseconds here and there because of using ArraySize() rather than keeping a variable are largely irrelevant and not worth worrying about until it manifests itself as a real problem which needs a solution.
"Min and max are also easy to collect" -> Please show me how to do this, because I cannot think of a way to do it in O(1), or with deterministic execution time. How would you tackle this issue?
I outlined the 3 operations in maintaining a sum, count and mean. You could also have variables for min and max - just compare these with the incoming number and decide whether it is a min or max and keep it if so. A fixed number of operations... two ternary statements would do it
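Taking the "two ternary statements" literally, something along these lines (the initial values and names are assumptions on my part):

double minV = DBL_MAX;    // start high so the first value becomes the minimum
double maxV = -DBL_MAX;   // start low so the first value becomes the maximum

void UpdateMinMax(const double value)
  {
   minV = (value < minV) ? value : minV;   // keep whichever is smaller
   maxV = (value > maxV) ? value : maxV;   // keep whichever is larger
  }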
This seems to be overthinking it.
Over the years I did a lot of performance optimization, usually with very large data sets/systems, when processing could take days (yes days..) and involved many disparate systems & technologies.
I met many teams who would approach the issues from this perspective: caches, buffers, OS tuning etc., and frankly it was always wrong. At best you could shave a few milliseconds off the runtime when we needed to reduce it by hours. We were taught very clearly that this is the wrong methodology (not far from you actually - in Walldorf and Munich).
The bottlenecks, and solutions, were always higher up in the application stack and the gains to be had were immense once found. I won't go into detail as it's way beyond the scope of this discussion, but suffice to say a few nanoseconds here and there because of using ArraySize() rather than keeping a variable are largely irrelevant and not worth worrying about until it manifests itself as a real problem which needs a solution.
Repeat your statement after testing this piece of code please:
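(The snippet itself did not survive in this copy of the thread. Purely as an illustration of the kind of comparison being discussed - ArraySize() evaluated in the loop condition versus a size cached in a variable, timed with GetMicrosecondCount() - a sketch could look like the following; the array length and loop structure are assumptions, not the original code.)

void CompareArraySizeVsCachedCount()
  {
   double data[];
   ArrayResize(data, 100000);
   for(int i = 0; i < ArraySize(data); i++)
      data[i] = i * 0.5;

   // Variant 1: ArraySize() re-evaluated on every iteration of the loop.
   ulong  start = GetMicrosecondCount();
   double sum1  = 0.0;
   for(int i = 0; i < ArraySize(data); i++)
      sum1 += data[i];
   PrintFormat("Time in microseconds used: %I64u", GetMicrosecondCount() - start);

   // Variant 2: the array size cached once in a local variable.
   start = GetMicrosecondCount();
   double sum2  = 0.0;
   int    count = ArraySize(data);
   for(int i = 0; i < count; i++)
      sum2 += data[i];
   PrintFormat("Time in microseconds used: %I64u", GetMicrosecondCount() - start);
  }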
Non optimized compile results:
2022.08.27 11:05:28.394 2022.08.06 00:00:00 Time in microseconds used: 53
2022.08.27 11:05:28.394 2022.08.06 00:00:00 Time in microseconds used: 34
Optimized compile results:
2022.08.27 11:05:55.261 2022.08.06 00:00:00 Time in microseconds used: 65
2022.08.27 11:05:55.261 2022.08.06 00:00:00 Time in microseconds used: 33
EDIT:
It's half the execution time; that is a relevant factor, no matter how many nanoseconds you scalp off of the execution.
Repeat your statement after testing this piece of code please:
Non optimized compile results:
2022.08.27 11:05:28.394 2022.08.06 00:00:00 Time in microseconds used: 53
2022.08.27 11:05:28.394 2022.08.06 00:00:00 Time in microseconds used: 34
Optimized compile results:
2022.08.27 11:05:55.261 2022.08.06 00:00:00 Time in microseconds used: 65
2022.08.27 11:05:55.261 2022.08.06 00:00:00 Time in microseconds used: 33
You are worried about a few microseconds? Nothing more to be said.