From theory to practice - page 787

 
Oleg avtomat:

It's not a trend here, it's wandering.



Random wandering )):


If you ask me, the author of this thread should write a big poster with the most important trader's wisdom - "The Trend is your Friend" - and read it a hundred times a day, like a prayer; then his trading may improve.

 
sibirqk:



Random wandering )):


If you ask me, the author of this thread should write a big poster with the most important trader's wisdom - "The Trend is your Friend" - and read it a hundred times a day, like a prayer; then his trading may improve.

He's not interested in other people's bicycles. He wants his own bicycle: even if it's a bad one, it's his own.
 
Oleg avtomat:

Randomness is not involved in the definition of information.

You are mistaken. A meaningful concept of information (not an idle-chatter notion, but one that can be expressed numerically) was first introduced by Claude Shannon, based on a probabilistic model of randomness.
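
(A minimal illustrative sketch, not from the thread, of what "expressed numerically" means in Shannon's sense: the entropy of a discrete distribution, H = -Σ p·log2 p, measured in bits. The helper function below is hypothetical, written only for this example.)

```python
# Illustrative only: Shannon's quantitative notion of information.
# Entropy H(X) = -sum(p * log2(p)) measures the average information,
# in bits, carried by one outcome of a discrete random source.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution given as probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
print(shannon_entropy([1.0]))        # certain outcome: 0.0 bits (no information)
```

A fair coin carries the maximum one bit per toss, while a fully predictable source carries none - this is the sense in which Shannon's definition makes information a computable quantity.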

 
Aleksey Nikolayev:

You are mistaken. A meaningful concept of information (not an idle-chatter notion, but one that can be expressed numerically) was first introduced by Claude Shannon, based on a probabilistic model of randomness.

You have only part of the information about information, and that part, although not insignificant, is still of auxiliary value.

The notion of information must not be reduced to the narrow Shannon definition, which concerns the capacity of a communication channel (C. Shannon's information theory). To ignore the meaning of a message, and the significance of that meaning for the sender and the receiver, is to misunderstand the problem as a whole.

Take a look at the big picture, for starters:

http://terme.ru/termin/informacija.html#item-5215

 
Aleksey Nikolayev:

You are mistaken. A meaningful concept of information (not an idle-chatter notion, but one that can be expressed numerically) was first introduced by Claude Shannon, based on a probabilistic model of randomness.

In general, I've noticed for a while now that almost every statement of mine provokes a furious spirit of contradiction in you.

Why should that be? ;)))

 
Alexander_K2:

This suggests that we are dealing with some kind of symbiosis of an Ornstein-Uhlenbeck-type random walk (SB) process and a process with memory.

If there were no outliers, my TS would be 100% profitable, and anyone who asserts that it is impossible to make money on a random walk is an imbecile. On a coin toss, I don't know, but on an OU process it's easy.

And it's impossible to eliminate these outliers in the tails - this has also been proven in this thread. You have to live with them somehow... How? No answer...

A random walk and an Ornstein-Uhlenbeck (OU) process are fundamentally different processes; don't confuse them, or you'll be laughed at. The OU process is stationary and has memory; a random walk is non-stationary and memoryless.
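
(An illustrative simulation sketch, not from the thread; the parameter values are arbitrary assumptions. It contrasts the two processes named above: the random-walk path has independent increments and its variance grows without bound, while the OU path is pulled back toward its mean - the "memory" acts through mean reversion - and stays in a stationary band.)

```python
# Illustrative sketch only: contrast a random walk with an
# Ornstein-Uhlenbeck (OU) process. All parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
noise = rng.normal(0.0, 1.0, n)   # the same shocks drive both paths

# Random walk: x[t] = x[t-1] + noise[t]  (non-stationary, variance grows with t)
random_walk = np.cumsum(noise)

# OU process (Euler discretization): mean-reverting toward mu with speed theta
theta, mu, sigma, dt = 0.1, 0.0, 1.0, 1.0
ou = np.zeros(n)
for t in range(1, n):
    ou[t] = ou[t-1] + theta * (mu - ou[t-1]) * dt + sigma * np.sqrt(dt) * noise[t]

# The walk wanders far from zero; the OU path stays in a band around mu.
print(f"random walk: std={random_walk.std():.1f}, min={random_walk.min():.1f}, max={random_walk.max():.1f}")
print(f"OU process : std={ou.std():.1f}, min={ou.min():.1f}, max={ou.max():.1f}")
```

The OU increment depends on the current level (the pull toward the mean), which is the "memory" referred to above; the random-walk increment is independent of everything that came before.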


Well, take range bars and you will not have any tails. There will be only two spikes in the distribution.

It's not about tails.
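
(An illustrative sketch, not from the thread, of the range-bar claim a few lines above. Under the idealization that price moves without gaps, every completed range bar closes exactly R above or below its open, so the distribution of bar-to-bar moves collapses to two spikes at ±R; real data with gaps reintroduces some tail. The function below is hypothetical.)

```python
# Illustrative sketch only: build range bars of fixed size R from a price
# series and look at the distribution of bar-to-bar moves.
import numpy as np

def range_bar_closes(prices, R):
    """Close a bar whenever price has moved R away from the bar's open."""
    closes = []
    bar_open = prices[0]
    for p in prices[1:]:
        if abs(p - bar_open) >= R:
            # snap the close to exactly +/- R from the open (no-gap idealization)
            bar_open = bar_open + R * np.sign(p - bar_open)
            closes.append(bar_open)
    return np.array(closes)

rng = np.random.default_rng(1)
prices = 100.0 + np.cumsum(rng.normal(0.0, 0.1, 50_000))  # synthetic random walk
closes = range_bar_closes(prices, R=0.5)
moves = np.diff(closes)

print("distinct bar-to-bar moves:", np.unique(np.round(moves, 6)))  # only -0.5 and +0.5
```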

 
Yuriy Asaulenko:
By the way, I recommended a book to you a few months ago, where the whole methodology is described in detail. Look it up in the thread.

Let me read it. )
What's the name of the book?

 
Aleksey Nikolayev:


Do you understand game theory? In its differential formulation? I gave you a link to Pontryagin's work. Have you read it? If not, I strongly recommend that you do.

 
Oleg avtomat:

You have only part of the information about information, and that part, although not insignificant, is still of auxiliary value.

The notion of information must not be reduced to the narrow Shannon definition, which concerns the capacity of a communication channel (C. Shannon's information theory). To ignore the meaning of a message, and the significance of that meaning for the sender and the receiver, is to misunderstand the problem as a whole.

Take a look at the big picture, for starters:

http://terme.ru/termin/informacija.html#item-5215

Well, yes - as I wrote, idle chatter. It is impossible to calculate anything without the Shannon approach.

As Wikipedia puts it: "The emergence of information theory is associated with the publication of Claude Shannon's A Mathematical Theory of Communication in 1948."

 
Aleksey Nikolayev:

Well, yes - as I wrote, idle chatter. It is impossible to calculate anything without the Shannon approach.

As Wikipedia puts it: "The emergence of information theory is associated with the publication of Claude Shannon's A Mathematical Theory of Communication in 1948."

There you go again, persisting in your misunderstanding.
