Random Flow Theory and FOREX - page 11

 

Neutron

Did I answer your question? If not, if I couldn't dispel the fog of these formulas, just ask away.

I'll go look for my grandfather's book tomorrow. It's a good book he wrote: Tikhonov V.I., Nonlinear Transformation of Random Processes. M.: Radio and Communications, 1986. If you use the book, beware: there are some typos, and I think I've found one more, since a derivation doesn't work out for me. I'll post the results if anything comes of it. It looks like, after subtracting the trend (y(x) = a + bx), what remains is a second-order inertial link.

Mathemat, for a first-order autoregression the variance tends to infinity (if I'm not confused). But a second-order inertial link makes oscillatory movements, as if tending towards an equilibrium point, which seems to me more plausible for the "character" of quote movement. But maybe it's all of that together ;-(

 
Prival:

Let me try this again with an example.

The important thing is to understand this formula.

...


All right, Prival, that's it!

What you described with that formula is a first-order autoregression for the first differences (a Markov process), where w is a random component (noise with certain characteristics) and F is a scalar (a special case of a matrix) equal to the correlation coefficient between successive first differences of the time series. Once again: this formula applies to, and predicts, the first differences, not the series itself. To restore and then predict the original series, you need a procedure for integrating the series of increments!
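A rough sketch of that scheme (the toy data, the coefficient value and all the names here are mine, not from the thread): fit the scalar F as the lag-1 correlation of the first differences, predict the next increment, then integrate it back into the price.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "price" series: a random walk whose increments follow AR(1).
n = 500
w = rng.normal(size=n)            # the random component w
d = np.empty(n)
d[0] = w[0]
F_true = 0.3                      # assumed true AR(1) coefficient of the increments
for k in range(1, n):
    d[k] = F_true * d[k - 1] + w[k]
price = 100 + np.cumsum(d)        # integrating the increments gives the series

# Estimate F as the correlation between successive first differences.
diffs = np.diff(price)
F_hat = np.corrcoef(diffs[:-1], diffs[1:])[0, 1]

# One-step forecast: predict the next increment, then integrate it back.
next_diff = F_hat * diffs[-1]
next_price = price[-1] + next_diff
print(F_hat, next_price)
```

The key step is the last line but one: the model forecasts an increment, and only adding it to the last known price turns that into a forecast of the series itself.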

Now the question is: What are you going to study? All information on this subject is well explained and presented in a very digestible form in many works.

Now for a nuance: the Markov process. According to this theory, the transition from L(k) to L(k+1) does not depend on the state L(k-1); i.e. it does not matter what the rate was yesterday, an hour ago or a minute ago. All that matters is the current rate L(k). What it will be at moment k+1 is determined by this damned (I can't think of another word ;-)) matrix F.

It's a special case of the Markov process (when F = 0), and it has a proper name: the "Wiener process", or one-dimensional Brownian motion. It is of no practical interest.
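To illustrate the F = 0 case (a sketch with made-up sizes): the increments degenerate to pure noise, and integrating them gives a discrete Wiener process whose variance grows linearly with time, which is exactly why it carries nothing to forecast.

```python
import numpy as np

rng = np.random.default_rng(1)

# With F = 0 the increment model collapses to L(k+1) = W(k): pure noise.
# Integrating such increments yields a discrete Wiener process (random walk).
paths = np.cumsum(rng.normal(size=(2000, 400)), axis=1)  # 2000 sample paths

# Hallmark of Brownian motion: variance across paths grows linearly in time.
var_by_time = paths.var(axis=0)
print(var_by_time[99], var_by_time[399])   # roughly 100 and 400
```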

The question is, what does all the above have to do with an aeroplane pilot?

 
I was wondering what L(k) is too. It looks like a vector after all. Then F is a matrix. But what kind of vector is it?
 
Mathemat:
I was wondering what L(k) is too. It looks like a vector after all. Then F is a matrix. But what kind of vector is it?

L(k) is the current sample of the first differences of the original time series, L is the vector of first differences, and L(k+1) is the predicted value of the next first difference.
 
Then what matrix F are we talking about, if it is a scalar? If L(k+1) is a predicted value, then the formula formally resembles AR(1), but only formally.
 

Good question! I don't know why Prival calls it a matrix.

In general, the point is this:

we have an N-th order autoregressive model, which can be written in the form

X(i+1) = a(1)*X(i) + a(2)*X(i-1) + ... + a(N)*X(i-N+1) + sigma,

where sigma is a random variable (its concrete form is a subject for a separate talk), X is the vector of available estimates of the first differences of the predicted series Y(i), and a(1)...a(N) are the autoregressive coefficients (their values are subject to constraints).

So, to calculate the autoregressive coefficients you have to solve a system of N linear equations whose coefficients are the ACF values of the first differences. That is the only matrix in the whole business. The system is known as the Yule-Walker equations [Yule (1927), Walker (1931)].

After finding the predicted difference X(i+1), it is not difficult to construct a prediction for the original series: Y(i+1) = Y(i) + X(i+1).

That's it, the problem is solved!
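Neutron's recipe (estimate the AR(N) coefficients from the ACF via the Yule-Walker system, predict the next difference, then integrate) could be sketched like this; the helper name and the toy random-walk data are my own assumptions:

```python
import numpy as np

def yule_walker(x, order):
    """Estimate AR(order) coefficients of x from the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Sample autocovariances r[0..order].
    r = np.array([x[:n - k] @ x[k:] / n for k in range(order + 1)])
    # The only matrix in the whole business: Toeplitz system R a = r[1:],
    # where R[i, j] = r[|i - j|] are the ACF values of the first differences.
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])

rng = np.random.default_rng(2)
y = np.cumsum(rng.normal(size=1000)) + 100   # toy series Y(i)
x = np.diff(y)                               # its first differences X(i)

a = yule_walker(x, order=3)
x_next = a @ x[-1:-4:-1]     # X(i+1) = a1*X(i) + a2*X(i-1) + a3*X(i-2)
y_next = y[-1] + x_next      # Y(i+1) = Y(i) + X(i+1)
print(a, y_next)
```

On this toy random walk the differences are white noise, so the estimated coefficients come out near zero and the forecast stays close to the last value, as it should.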

 

I see, Neutron, AR(N) is clear. Nevertheless, I am still puzzled by the more complicated formula

L(k+1) = F*L(k) + W(k),

for which Prival happened to mention that F is a transition matrix.

A curious thing turns out. If L(k) is a vector (e.g. the last M returns), then ordinary autoregression is out of the question. Formally it is still AR(1), but for a vector flow (process) L(k). W(k) is a vector too, but an uncorrelated one.

Do you understand me, Neutron? Maybe this is the model Prival is talking about, the one whose calculations are unbearable? And least squares would be just right here, if we run it through the history (to find the right matrix F).
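A minimal sketch of what fitting F through history by least squares might look like (the dimensions, names and white-noise data are my own assumptions, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(3)

# State vector L(k): the last M returns. Model: L(k+1) = F @ L(k) + W(k).
M, n = 3, 2000
returns = rng.normal(size=n)

# Stack state vectors through history: row k of L is (r[k], ..., r[k+M-1]).
L = np.column_stack([returns[i:n - M + i] for i in range(M)])  # shape (n-M, M)
X, Y = L[:-1], L[1:]          # pairs (L(k), L(k+1)) over the history

# Least-squares fit of the transition matrix: Y ~ X @ F.T.
F = np.linalg.lstsq(X, Y, rcond=None)[0].T
pred = F @ L[-1]              # predicted next state vector L(k+1)
print(F.shape, pred.shape)
```

Note what the fit recovers: the first M-1 rows of F come out as a pure shift, each simply copying a component already present in L(k), because consecutive state vectors overlap. Only the last row attempts a genuine forecast, which illustrates why overlapping components make the vector model look richer than it is.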

 
Is he referring to any sources or articles? And if so (I mean vectors instead of scalars), where is the justification for applying this machinery to our case? You could compute something like this for the rest of your life... but what for?
 

All right, let's wait for the author who made this mess. A strange model comes out: by taking the last returns as the components of the vector L(k), we thereby make some returns depend on their own future values. That doesn't seem right somehow.

 
Mathemat:

All right, let's wait for the author who made this mess. A strange model comes out: by taking the last returns as the components of the vector L(k), we thereby make some returns depend on their own future values. That doesn't seem right somehow.

I guess formally it can be said about any predictive function? The direction of the time arrow is up to us.

P. S. These contrails are just all over the place :)