
And the conclusion is more down-to-earth, and I think we are steadily approaching it ("we" as in "we have ploughed on" :-): there is no more information in ticks than in bars, or not much more. Please forgive my incorrigible scepticism.
I would like to know the formula by which the blue curve was generated, with your comments on each of its components: how it was obtained and what was done. Thanks. Or just post the file, and I can figure it out on my own; I know Mathcad.
1. Construct the series of first differences of the initial time series on the selected TF (let it be 5 min);
2. Find the volatility of this difference series; call it m;
3. Replace every increment in the difference series with m, keeping its sign; this gives the first differences of the equal-increment series;
4. Integrate these first differences; the output is the synthetic series, i.e. the equal-increment series.
THAT'S ALL.
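The four steps above can be sketched in a few lines. This is an illustrative reconstruction, not the author's Mathcad worksheet; the function name and the use of the standard deviation as the volatility measure m are my assumptions.

```python
import numpy as np

def equal_increment_series(prices):
    """Sketch of steps 1-4: first differences, volatility m,
    replace each increment by +/-m keeping its sign, integrate back."""
    diffs = np.diff(prices)            # step 1: first differences
    m = diffs.std()                    # step 2: volatility (assumed: std of differences)
    equal = np.sign(diffs) * m         # step 3: equal increments, signs preserved
    # step 4: integrate, starting from the initial price
    return prices[0] + np.concatenate(([0.0], np.cumsum(equal)))

prices = np.array([1.1000, 1.1005, 1.1003, 1.1003, 1.1010])
synth = equal_increment_series(prices)
```

Note that a zero increment in the original series stays zero (np.sign(0) == 0), so flat stretches remain flat in the synthetic series.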
But I disagree with the last assumption: an unsubstantiated hypothesis, nothing more. The movement of the crowd simply seems to create "agitation" (similar to the behaviour of quantum collectives) that perturbs your equal-increment series, so it is hardly grounds for confirming a conspiracy theory here :). And the conclusion is more down-to-earth, and I think we are steadily approaching it ("we" in the sense of "we have ploughed on!" :-): there is no more information in ticks than in bars, or not much more.
How about that!
The crowd is simply whoever is in the majority. If we plot the distribution of price increments, the bulk of them are small increments; they are the vast majority, so that is the crowd. Remove from the initial price series all increments larger than some threshold, leaving only the "small" ones, i.e. those "from the crowd", and we get a series close to the equal-increment series. In the limit, replacing all increments of the initial series by the same value, we obtain a kind of indicator of "the crowd's mood", which, as you can see, runs perpendicular to the rate...
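The thresholding idea described here, keeping only the "crowd's" small increments and discarding the large ones, can be sketched as follows. The function name and the convention of zeroing (rather than clipping) the large increments are my assumptions for illustration.

```python
import numpy as np

def crowd_series(prices, max_step):
    """'Crowd' series: keep only increments whose modulus does not
    exceed max_step; larger moves are zeroed out, then integrate."""
    diffs = np.diff(prices)
    small = np.where(np.abs(diffs) <= max_step, diffs, 0.0)
    return prices[0] + np.concatenate(([0.0], np.cumsum(small)))

# A jump of 4 is discarded; only the small moves survive.
series = crowd_series(np.array([0.0, 1.0, 5.0, 6.0]), max_step=2.0)
```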
The answer to the question of where there is more information is obvious: having the tick history, I can definitely reconstruct every imaginable TF. But having a particular TF, I cannot reconstruct any smaller TF, let alone the ticks. So there is more information in ticks than in bars! Prival is right: "...in one large bar covering 30 years there is hardly any of the information that is in the ticks. I think the fewer the bars, the less information they contain."
As for the statement "there is not much more information in ticks than in bars": I think that in Forex information is never superfluous! It is worth its weight in gold :-)
The K-factor is analogous to the Point value in MT4. It is determined automatically: for example, for EURUSD it is 10^4, and for EURJPY it is 10^2. It does not take part in the sorting.
The file format is standard: <DTYYYYMMDD>,<TIME>,<OPEN>,<HIGH>,<LOW>,<CLOSE>,<VOL>. The program uses the <OPEN> column for its constructions.
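Reading that history format and extracting the OPEN column is straightforward; a minimal sketch (the function name and the in-memory sample string are mine, not from the program described above):

```python
import csv
import io

def read_open_series(text):
    """Parse history in the standard
    <DTYYYYMMDD>,<TIME>,<OPEN>,<HIGH>,<LOW>,<CLOSE>,<VOL> format
    and return the list of OPEN prices."""
    opens = []
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        opens.append(float(row[2]))  # <OPEN> is the third field
    return opens

sample = ("20240102,00:00,1.1005,1.1010,1.1000,1.1008,125\n"
          "20240102,00:05,1.1008,1.1012,1.1004,1.1006,98\n")
opens = read_open_series(sample)
```

In practice one would pass a file object instead of the io.StringIO wrapper; csv.reader accepts either.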
rsi, I agree with you; moreover, I hold a similar opinion myself. For example, I am convinced that the Zig-Zag is the optimal approximating operator for a price chart. In this kind of decomposition, all the information within an H-point step of the price scale is treated as uninteresting. This lets us significantly compress the amount of incoming information without throwing the baby out with the bathwater.
As for the phenomenon under discussion, this is somewhat different. I am trying to decompose the price chart over some basis, in particular a basis defined in the space of possible values of the price increments. The thing is that there are many small movements of no individual interest, and few large, strong ones! In this situation I would not risk discarding the small ones; there are too many of them.
For example:
Here the 1-minute quote series, shown in red, is decomposed into 12 vectors consisting of equal increments of 1 to 12 pips, according to the principle:
In the original series we keep only those increments whose modulus equals n points (e.g. n = 5 points). At the points where this condition is not met, the series takes its previous (left) value, and so on. We obtain a set of n vector realizations. The figure shows the vectors for equal increments of 2, 5 and 8 points; the sum of all the vectors from 1 to 12 is the black line. It can be seen that the original series can be reconstructed to any accuracy by taking more terms of the decomposition.
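The decomposition described above can be sketched for an integer-pip series: vector n keeps only the increments whose modulus is exactly n pips and holds the previous value elsewhere. Summing all the vectors reproduces the original series. The function name is my own; the original was built in Mathcad.

```python
import numpy as np

def decompose_by_increment(prices, n_max):
    """Split an integer-pip price series into vectors: vector n keeps
    only the first-difference increments whose modulus equals n pips
    (elsewhere the series holds its previous value), then integrates."""
    diffs = np.diff(prices)
    vectors = {}
    for n in range(1, n_max + 1):
        kept = np.where(np.abs(diffs) == n, diffs, 0)
        vectors[n] = np.concatenate(([0], np.cumsum(kept)))
    return vectors

prices = np.array([0, 1, 3, 2, 2, 7])     # increments: 1, 2, -1, 0, 5 pips
vectors = decompose_by_increment(prices, n_max=5)
total = prices[0] + sum(vectors.values())  # the "black line": sum of all vectors
```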
The vectors themselves are of interest, first of all for analysing the dynamics of the quote. Perhaps these realizations are easier to predict than the original series. Or their dynamics may warn us in advance of an expected change of trend in the market... Let me remind you: if we eliminate the null elements from the series of first differences (informativeness does not decrease), then we are dealing with a stationary series, with all the positive consequences that follow from this. Mathemat, can you hear me?
I hear you, Neutron, of course. I see the words "decomposition" and "vector" and wonder where to fit orthonormality in here. Just kidding. Actually it is a curious experiment; I have not thought it through yet. As for stationarity: of course, it must be strictly justified. A call has already gone out to the mechmat people here.
P.S. Aren't there any tools for checking stationarity in Mathcad?
That's what I meant when I said the charts are interesting. That's where the statistics should be gathered!
P.S. I remember from my youth there was such a device: threshold statistics...
I have written a small indicator that draws the price chart in equal increments whose amplitude equals the volatility of the instrument on the selected timeframe.
Now it's time to think how to use it.
I don't think we can expect an "easy" solution.
Here is a pattern I have noticed. Sometimes you can see a stable combination of the price chart and the line of equal increments:
In the pictures, the price chart is shown in green and the equal-increment line in red. Usually the two run close together, as if competing, but sometimes the price chart makes a sharp forward move, leaving the equal-increment line far behind. It appears that such a market state is not random (efficient), and that with non-zero probability it tends to return to the unperturbed state (left figure). By contrast, the right figure shows a situation where the price and the equal-increment line move almost in step, without sharp perturbations. In such a state the market is efficient and further developments can follow an arbitrary path.
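A crude way to detect the "perturbed" state described here is to flag bars where the gap between price and the equal-increment line exceeds some multiple of the per-bar volatility. This is an illustrative sketch only: the function name and the threshold k are my assumptions, not something the post specifies.

```python
import numpy as np

def flag_perturbed(prices, k=3.0):
    """Flag bars where price has run far ahead of (or behind) the
    equal-increment line: gap > k * m, where m is the volatility of
    the first differences. Threshold k is an illustrative assumption."""
    diffs = np.diff(prices)
    m = diffs.std()
    # rebuild the equal-increment line from the recipe earlier in the thread
    ei = prices[0] + np.concatenate(([0.0], np.cumsum(np.sign(diffs) * m)))
    gap = np.abs(prices - ei)
    return gap > k * m

prices = np.array([1.1000, 1.1005, 1.1003, 1.1010, 1.1002, 1.1030])
flags = flag_perturbed(prices, k=3.0)
```

Whether such flagged states really revert with non-zero probability would of course need statistical confirmation, as discussed above.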