Zero sample correlation does not necessarily mean there is no linear relationship - page 2

 
There are also non-linear dependencies that neither the Pearson nor the Spearman correlation coefficient (nor covariance) will reveal.
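A minimal illustration of this point (my own sketch in Python, assuming numpy and scipy are available; not code from the thread): one series can be fully determined by another and still show near-zero Pearson and Spearman correlations when the dependence is non-monotonic.

import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10_000)   # symmetric around zero
y = x ** 2                           # fully determined by x, but not monotonic

r_p, _ = pearsonr(x, y)
r_s, _ = spearmanr(x, y)
print("Pearson :", r_p)    # close to 0
print("Spearman:", r_s)    # also close to 0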
 
Prival:

In fact, the books say that if the CC = 0, it does not mean that the two quantities in question are unrelated.

The books only say that they are not linearly related.

The link that Rosh gave is exactly Spearman's rank correlation coefficient - that is how it is calculated. If you want to look at autocorrelation, it is calculated a little differently, like this: https://www.mql5.com/ru/code/8295

Your autocorrelation is not calculated correctly at all.
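For reference, here is a minimal sketch of how a Spearman rank correlation is typically computed (Python with numpy assumed; this is my own illustration, not the code behind the mql5 link above): it is simply the Pearson correlation of the ranks, which is why it picks up monotonic rather than only linear relationships.

import numpy as np

def ranks(a):
    # 1-based ranks of the values in a (ties ignored for brevity)
    order = np.argsort(a)
    r = np.empty(len(a), dtype=float)
    r[order] = np.arange(1, len(a) + 1)
    return r

def spearman(x, y):
    # Spearman rank CC = Pearson CC computed on the ranks
    return float(np.corrcoef(ranks(x), ranks(y))[0, 1])

x = np.random.default_rng(1).normal(size=500)
y = np.exp(x)                  # monotonic but non-linear function of x
print(spearman(x, y))          # close to 1.0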
 
Generally speaking, if you understand how forex prices are formed, the distribution cannot be normal. With the help of correlation we can try to find graphical patterns, to recognise figures and waves. But probability theory cannot be applied here: a person armed with knowledge of probability theory is just as blind as an unarmed one.
 
What does non-stationarity have to do with it? This is about interpreting the correlation on a sample, and about the measure of linear dependence on that same sample.
 

It has become clear why a linear relationship is associated with correlation.

Imagine two time series as vectors. The point is that, for some reason, it was decided that there is no linear relationship if the vectors are orthogonal.

Orthogonality of vectors means their scalar product is zero.

For Euclidean space the scalar product of vectors is computed as the sum of the products of their components, (x, y) = x1*y1 + x2*y2 + ... + xn*yn - which is almost a ready-made correlation.

So if the vectors are linearly independent (in the sense of the definition above), then their correlation is zero.

Another matter is that linear dependence defined via the angle between vectors is quite a poor definition.
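To make the link explicit, here is a minimal sketch (Python with numpy assumed; my own illustration, not hrenfx's code): after centering two series, the normalized scalar product - the cosine of the angle between them - coincides with the Pearson correlation coefficient, so "orthogonal" is exactly "zero correlation".

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1000)
y = 0.5 * x + rng.normal(size=1000)

xc, yc = x - x.mean(), y - y.mean()            # center both series
cosine = xc @ yc / (np.linalg.norm(xc) * np.linalg.norm(yc))
pearson = np.corrcoef(x, y)[0, 1]

print(cosine, pearson)                         # the two values coincide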

 

A little background information.

Correlation and dependence are often confused because in the case of Gaussian distributions they are equivalent (see any textbook on mathematical statistics for the proof), and many people believe that everything in the world is normally distributed :))

Another common misconception is to confuse the concepts of "correlation coefficient" (i.e. a characteristic of the stochastic dependence between random variables) and "sample correlation coefficient" (an estimate - one of many possible - of the true CC). These are actually quite different things, and substituting one for the other is fundamentally wrong.

As a follow-up, two more terms that are often confused: functional dependence and stochastic dependence (aka statistical, regression, etc.).
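A minimal sketch of the parameter-versus-estimate distinction (Python with numpy assumed; the numbers are illustrative, not from the thread): even when the true CC of two independent variables is exactly zero, the sample CC computed on a small sample scatters noticeably around it.

import numpy as np

rng = np.random.default_rng(3)
true_cc = 0.0                          # X and Y are generated independently
estimates = []
for _ in range(1000):
    x = rng.normal(size=30)            # small sample of 30 observations
    y = rng.normal(size=30)
    estimates.append(np.corrcoef(x, y)[0, 1])

estimates = np.array(estimates)
print("true CC:", true_cc)
print("sample CC: mean %.3f, std %.3f" % (estimates.mean(), estimates.std()))
# the sample CC is almost never exactly 0, even though the true CC is 0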


Reading this thread, I am for the hundredth time convinced that mathematical statistics cannot be understood simply by reading a dozen textbooks.

YOU HAVE TO PASS AN EXAM IN IT.

Preferably with "excellent":))))

 
alsu:

Another common misconception is to confuse "correlation coefficient" (i.e. a characteristic of the stochastic dependence between random variables) and "sample correlation coefficient" (an estimate - one of many possible - of the true CC). These are actually quite different things, and substituting one for the other is fundamentally wrong.

The word "sampling" is in the thread title. Linear correlation is also discussed in sampling, not as a theoretical characteristic of random variables.
 
alsu:

Just a short lesson.

Correlation and dependence are often confused because in the case of Gaussian distributions they are equivalent (see any textbook on mathematical statistics for the proof), and many people believe that everything in the world is normally distributed :))

Another common misconception is to confuse the concepts of "correlation coefficient" (i.e. a characteristic of the stochastic dependence between random variables) and "sample correlation coefficient" (an estimate - one of many possible - of the true CC). These are actually quite different things, and substituting one for the other is fundamentally wrong.

As a follow-up, two more terms that are often confused: functional dependence and stochastic dependence (aka statistical, regression, etc.).


Reading this thread, I am for the hundredth time convinced that mathematical statistics cannot be understood simply by reading a dozen textbooks.

YOU HAVE TO PASS AN EXAM IN IT.

Preferably with an "A":)))

And what if one wants to actually "use" what has been worked out?

Never mind FFT or whatever.

Multiple regressions and correlations.

;)

Sounds impressive!

What does the physical model of forex have to do with it?

Fine if they at least did it on the stock market; there the metric of the state space is a ball rather than a torus.

;) DDD

 
hrenfx:

It has become clear why a linear relationship is associated with correlation.

Imagine two time series as vectors. The point is that, for some reason, it was decided that there is no linear relationship if the vectors are orthogonal.

Orthogonality of vectors means their scalar product is zero.

For Euclidean space the scalar product of vectors is computed as the sum of the products of their components, (x, y) = x1*y1 + x2*y2 + ... + xn*yn - which is almost a ready-made correlation.

So if the vectors are linearly independent (in the sense of the definition above), then their correlation is zero.

Another matter is that linear dependence defined via the angle between vectors is quite a poor definition.

Don't they give you enough assignments at the institute?

 
hrenfx:

....

Your autocorrelation is not calculated correctly at all.

So it turns out I was a fool to double-check the code ten times before posting it. I went through textbooks. I checked it against samples in well-known maths packages; in particular, Mathcad has a built-in function. Everything matched. But now it turns out to be wrong ...

Maybe you can tell me how to do it right? In case I really am wrong.

Just in case: https://ru.wikipedia.org/wiki/Автокорреляционная_функция
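For comparison, a minimal sketch of the standard (biased) sample autocorrelation function in the spirit of the Wikipedia article linked above (Python with numpy assumed; my own illustration, not the indicator code being discussed).

import numpy as np

def sample_acf(x, max_lag):
    # biased sample ACF: lag-k autocovariance divided by the lag-0 variance
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    var = np.dot(xc, xc) / n
    return np.array([np.dot(xc[:n - k], xc[k:]) / (n * var)
                     for k in range(max_lag + 1)])

series = np.cumsum(np.random.default_rng(4).normal(size=1000))  # random walk
print(sample_acf(series, 5))   # lag 0 is always 1; a random walk decays slowly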
