Dependency statistics in quotes (information theory, correlation and other feature selection methods) - page 6

 
alexeymosc:

Slides, slides... ) That's straight out of a joke, too.

but replication of your results should be possible for other researchers.

Why the "humour"?

Why just declarations?

Put your Excel data out there for anyone to check.

So far it is just bluster with a hint of "genius".

And Alexei - I don't recognise you. Sound words require proof, not demagogy,

so let's wait for the entropy to decrease in this thread.

;)

 
avatara:

but replication of your results should be possible for other researchers.

Why the "humour"?

Why just declarations?

Put your Excel data out there for anyone to check.

So far it is just bluster with a hint of "genius".

And Alexei - I don't recognise you. Sound words require proof, not demagogy,

so let's wait for the entropy to decrease in this thread.

;)


There is not much to post at the moment. The mutual information calculation is based on data from Alpari, so anyone can check it, and if the results differ, we will discuss why. If there is no difference, there is nothing to talk about; if there is, we can move on.

I agree that so far everything is purely theoretical, although even at this stage constructive points have been made regarding volatility.
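For anyone who wants to run the check avatara proposes, here is a minimal sketch of a plug-in mutual information estimator between a return series and its lagged copy. The synthetic data, bin count and lag below are my own assumptions, not the topic starter's settings; substitute real quotes (e.g. from Alpari) to reproduce the comparison.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of mutual information (in bits) between two series."""
    # Joint histogram -> empirical joint probability
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    nz = pxy > 0                          # skip empty cells (0 * log 0 := 0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic i.i.d. "returns" stand in for real quote data
rng = np.random.default_rng(0)
returns = rng.standard_normal(10_000)
lag = 1
mi = mutual_information(returns[:-lag], returns[lag:])
```

Note that the plug-in estimate carries a small positive bias on finite samples, so even independent data gives a value slightly above zero; that bias is one obvious "reason for differing results" to discuss if two people's numbers do not match.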

 
sayfuji: So from theory to practice, at what point is the transition planned?

Here is all the information I use. Nothing more so far.

There is a strictly private idea of how to switch to practice. But it hasn't been thoroughly tested yet. There are bound to be pitfalls.

Briefly: from the average mutual information that the topic starter calculated at the "system to system" level, we move to the "event to event" level and brute-force through all possible return values of the predicted bar, computing the information gained or lost in each. Then, analysing the accumulated statistics, we check the main axiom of this intuitive theory of the information evolution of systems with memory ("a process evolves so that the resulting information flow is, in some sense, maximal"). If the axiom is confirmed, we move on to direct prediction.

Actually, there is a certain analogy between information theory and quantum mechanics. The actualisation of an ensemble into a single event is a transfer of information, directly corresponding to the collapse of the wave function in quantum mechanics as a result of observation (remember Schrödinger's cat?). I ask you not to laugh and not to throw tomatoes; honestly, I didn't mean it, it came out by itself!

The following question remains for Avals & anonymous: if volatility is the main cause of these dependencies, then where do the distant and practically reliable dependencies (confidence level - 0.9999... (many nines)) in data that are pure returns without any volatility come from?

To Svinozavr: I don't eat fly agaric at all: I have plenty of it in my head as it is.

 
Mathemat:

The following question remains for Avals & anonymous: if volatility is the main cause of these dependencies, then where do the distant and virtually reliable dependencies (confidence level - 0.9999... (many nines)) in data that are pure returns without any volatility come from?

Alexei, where are the calculations that yield these "distant and practically reliable dependencies"? And what do you mean by pure returns without volatility (how are such returns obtained, given that plain returns already contain volatility)? That was not in the topic starter's article :)
 
Mathemat:

Here is all the information I use. Nothing more so far.


Alexey, do you know whether it is realistic to turn all this pleasure into code in the direction we are interested in...

A.Sergeev did something similar when translating the Sultonov indicator into code, or am I mistaken?

It's just that when I see so many different limits and logarithms with their mutual sums etc., I get confused... :-))) although in principle I did very well in maths at university...

 
alexeymosc:


So you have questioned the very legitimacy of this approach in my article?

Exactly.

alexeymosc:


Reading your assumptions, it seems we are onto something after all. I can't say that I am past the early stages; however, I strive to approach the subject without imposing any subjective limitations, conventions or theories on it. The study started from a completely clean slate: no economic or other meanings were applied when interpreting the process. Therefore I believe that applying the information-theory formulas is, at the very least, not wrong for such a task.

All I see so far are attempts to pull an abstract mathematical apparatus from a completely different field of knowledge onto the market. In such cases I tirelessly quote a well-known piece of wisdom: this is disrespect to the market, the market will take revenge on you, and it will certainly pull you down in retaliation.

If "economic and other meanings" didn't apply - what did you study?

 
Mathemat:

What prevents you from doing the same with the return? It can be discretised, and it is a random variable: quite a decent object for applications of information theory. Why search for an exact identity? You are playing at war, my dear...

Are the market's elementary events identical to the elementary events of information theory?


Well, here I am reading a popular exposition of the Shannon formula. It starts with: "Suppose we have an alphabet of N symbols with frequencies P1, P2, ..., PN, where Pi is the probability of occurrence of the i-th symbol. All probabilities are non-negative and their sum equals 1."


Hence the question, what kind of "symbols" do we have in the market?

 
Obviously, in our interpretation of the process, these are discrete return values.
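As a sketch of the quoted exposition: Shannon's formula H = -Σ Pi·log2(Pi) applied to a small hypothetical alphabet of discretised returns. The four symbols and their frequencies below are purely illustrative, not anyone's measured market data.

```python
import numpy as np

def shannon_entropy(probs):
    """H = -sum(P_i * log2 P_i) over an alphabet with symbol frequencies P_i."""
    p = np.asarray(probs, dtype=float)
    assert np.all(p >= 0) and abs(p.sum() - 1.0) < 1e-9  # a valid distribution
    nz = p > 0                                           # 0 * log 0 := 0
    return float(-np.sum(p[nz] * np.log2(p[nz])))

# Four hypothetical return symbols: big down, small down, small up, big up
H = shannon_entropy([0.1, 0.4, 0.4, 0.1])
```

A uniform four-symbol alphabet gives exactly 2 bits; the skewed frequencies above give less, which is exactly the sense in which a non-uniform alphabet carries less information per symbol.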
 
HideYourRichess:


If "economic and other senses" did not apply - what did you study?


First, statistical analysis and Data Mining methods. Machine Learning methods: artificial neural networks, classification trees, regression analysis.

When applied to the market, I read Peters.

Why the question? Are you examining me?

 
alexeymosc:
Obviously, it would seem that in our interpretation of the process, they are discrete returns.

How can they be discrete if you are working with relative increments?

And the second question: what is the number of symbols? )
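One hedged answer to both questions: relative increments are continuous, so they are made discrete by binning, and the alphabet size is then simply the number of bins the researcher chooses. A sketch using quantile bins (the five-symbol alphabet and synthetic data are arbitrary choices for illustration):

```python
import numpy as np

def discretise_returns(returns, n_symbols=5):
    """Map continuous relative increments onto a finite alphabet of symbols."""
    # Interior bin edges at equal-probability quantiles, so every symbol
    # occurs with roughly the same frequency
    inner = np.linspace(0.0, 1.0, n_symbols + 1)[1:-1]
    edges = np.quantile(returns, inner)
    return np.digitize(returns, edges)  # integer symbols 0 .. n_symbols-1

rng = np.random.default_rng(1)
r = rng.standard_normal(1000)           # stand-in for real relative increments
symbols = discretise_returns(r, n_symbols=5)
```

Quantile edges make the symbol frequencies nearly uniform, which maximises the alphabet's entropy for a given number of symbols; equal-width bins are the other common choice and give a skewed symbol distribution on fat-tailed returns.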
