Machine learning in trading: theory, models, practice and algo-trading - page 208

 
Quantum:
Correct. Now let's calculate pgamma from 0+eps. What will it be equal to? Infinity because of dgamma(0,0.5,1)=inf. Right?

If you are looking for pgamma(0+eps, 0.5, 1) you should not compare with dgamma(0, 0.5, 1), but with dgamma(0+eps, 0.5, 1)

I've been answering just about that this morning, you missed it:

Dr.Trader:
Let's take a simpler example:
x=1*10^(-90)
The number is very small, not zero, and there are no uncertainties.
> dgamma(1*10^(-90), 0.5, 1)
[1] 5.641896e+44
> pgamma(1*10^(-90), 0.5, 1)
[1] 1.128379e-45

Wolfram gives the same result:
PDF[GammaDistribution[0.5,1], 1*10^(-90)]
5.6419×10^44
CDF[GammaDistribution[0.5,1], 1*10^(-90)]
1.12838×10^-45

Now, paraphrasing your question, without all the infinities in the formulas:
How can integrating dgamma, which returns big numbers like 5.641896e+44, result in a very small number like 1.128379e-45?

You must agree that as x -> 0, dgamma becomes very large, tending to infinity, while pgamma is very small, tending to zero. This can be seen even in Wolfram. How is it possible, then, that the integration gives a small result?
I took 1e-90 because Wolfram can't go any finer. In R you can look at the result at x = 1e-300: dgamma will be huge, and pgamma negligible.

And the only clue is that you are apparently trying to compute pgamma by summation, integrating in a loop with small steps, in which case Inf really would get in your way. R, however, computes it by a formula, not by directly using the output of dgamma().
You are integrating something incorrectly somewhere.
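If that is indeed what is happening, the difference is easy to see in R itself. Below is a minimal sketch (not anyone's actual code, just an illustration): a naive left-endpoint Riemann sum that includes x = 0 is destroyed by the single Inf value, while R's adaptive quadrature and pgamma itself, which do not rely on the density value exactly at zero, both give a finite number.

h <- 1e-6
x <- seq(0, 1, by = h)
# naive left-endpoint Riemann sum: the first term dgamma(0, 0.5, 1) is Inf,
# so the whole sum becomes Inf
sum(dgamma(x, shape = 0.5, rate = 1)) * h

# adaptive quadrature copes with the integrable singularity at 0
integrate(dgamma, lower = 0, upper = 1, shape = 0.5, rate = 1)   # about 0.8427

# and the closed-form CDF agrees
pgamma(1, shape = 0.5, rate = 1)                                 # 0.8427008

# the extreme case mentioned above
dgamma(1e-300, shape = 0.5, rate = 1)   # huge, but finite
pgamma(1e-300, shape = 0.5, rate = 1)   # tiny, but non-zero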

 

I searched for papers that discuss the gamma distribution's density at zero for different alpha & beta.

Here is one of them: http://journals.ametsoc.org/doi/pdf/10.1175/1520-0442(1990)003%3C1495%3AMLEFTG%3E2.0.CO%3B2

The researcher explicitly says that the density is maximized at the point zero. And nothing terrible happens; it lives on just fine...

When Mr. Quantum admits that the claim of an error is an exaggeration, or something else, in any case not correct, then my doubts about his professional competence will more or less dissipate. So far I see religious arguments on his part, and, on the part of MQ's management, patronage of him.

So far.

 
Quantum:

How do the developers of R explain their results:

dgamma(0,0.5,1)=inf

pgamma(0,0.5,1)=0

if they include the point 0 (as seen in the definition), giving an infinite density at the point x=0, and then, when integrating to obtain pgamma(x,0.5,1), that infinity is treated as zero, as if it did not exist.

Quantum:
Now let's calculate pgamma from 0+eps. What will it be equal to? Infinity because of dgamma(0,0.5,1)=inf. Right?

http://www.wolframalpha.com/input/?i=integrate[pdf[gammadistribution[0.5,1],x]+,{x,0,1*10^(-90)}]

The integral is the area of the figure shaded in blue. As you can see, the left side of the shaded figure tends to infinity. Even though Wolfram does not include the point x=0 in the pdf, there is still no finite "highest point": you can think of the left side of the figure as growing upward without bound. Logically, if the left side of the figure grows without bound, you might expect its area to tend to infinity too. But in fact that does not prevent you from getting a finite result when you determine the area of the figure. Math.
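To make that concrete, here is the back-of-the-envelope calculus (nothing specific to R or Wolfram, just the standard gamma pdf with shape 0.5 and rate 1):

$$ f(t) = \frac{t^{-1/2} e^{-t}}{\Gamma(1/2)}, \qquad \Gamma(1/2) = \sqrt{\pi}, $$

and near zero $e^{-t} \approx 1$, so

$$ \int_0^{\varepsilon} f(t)\,dt \approx \frac{1}{\sqrt{\pi}} \int_0^{\varepsilon} t^{-1/2}\,dt = \frac{2\sqrt{\varepsilon}}{\sqrt{\pi}}. $$

For $\varepsilon = 10^{-90}$ this gives $2 \cdot 10^{-45} / \sqrt{\pi} \approx 1.128379 \cdot 10^{-45}$, which is exactly the pgamma/CDF value quoted above. The integrand is unbounded at zero, but the $t^{-1/2}$ singularity is integrable, so the area stays finite.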

 
By the way, has anyone wondered if the gamma and related distributions can even be used in the market? It's just a question...

Gamma, exponential, Poisson. They all sit side by side, and they describe independent processes. If the magnitudes of the events in these processes are also i.i.d., then the sum of the events is normal...

In general, I don't see an application yet. Normality can still be stretched to cover, for example, the sum of the magnitudes of independent trades... And that is a useful property, by the way. I showed earlier the distribution of cumulative trades. With a large number of samples, the statistic is close to normal.
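If anyone wants to see that effect directly, here is a tiny illustrative R simulation (the exponential trade magnitudes, the random signs and the block size of 100 are made-up assumptions, purely for illustration):

set.seed(1)
n <- 1e5
# hypothetical trade magnitudes: i.i.d. exponential, with a random sign
trades <- rexp(n, rate = 1) * sample(c(-1, 1), n, replace = TRUE)

# sum the trades in blocks of 100; by the CLT the block sums
# should look approximately normal
block.sums <- colSums(matrix(trades, nrow = 100))

hist(block.sums, breaks = 50, freq = FALSE)
qqnorm(block.sums); qqline(block.sums)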
 
mytarmailS:
Five pages of arguments about an imaginary error in a function that nobody in this thread needs, in a thread about machine learning; something is clearly wrong with this world...

You just can't read between the lines; you don't understand the hidden purpose of such pseudoscientific demagoguery. Let me illustrate with a fictitious example.

Take, for example, oil production. Suppose that in narrow circles of successful oil producers there gradually accumulates research experience in finding oil deposits on the basis of indirect, external signs, such as the chemical composition of soil samples, vegetation patterns, etc. Naturally, all this is kept in the strictest secrecy, and novice drillers are fed, with the help of "authorities", all sorts of plausible-sounding information: something obvious with minor modifications but not actually working, or even disinformation, which is difficult to check except by trying it and going bankrupt. Time goes on, people are people, information gradually leaks, and the moment comes when the technology can no longer be hidden even in general terms; it has become apparent and true. What to do?

The first thing that comes to mind, as in any game when the opponent has found out about the "secret technique", is all sorts of diversions aimed at making it harder to comprehend this secret knowledge: drowning him in a swamp of details, in a giant, poorly structured stream of information that the brain is physically unable to digest even in 100 lifetimes, leading him away from the essence. You want to understand how the perceptron works, and you are advised to first master number theory, at least at graduate level, then mathematical analysis, then linear algebra, and all of this not superficially but in detail; then you have to read all the papers, articles, etc. You want to read about how to develop a web application, and you get tons of arguments about errors and programming patterns.

The second is all sorts of fakes and spoofing, when you are cleverly moved onto their field, where the game is not played by your rules. You need a perceptron? What "idiot" at the end of 2016 is going to write one himself? Hahaha)))) Shame on the wheel-reinventor!)))) There are tons of libraries out there! Buy yourself a Ferrari! Dig into other people's libraries and functions like a real "scientist"! You don't need to understand how and what is arranged inside; you just need to run through the options the developers have given you!

Well, so on and so on, I hope you understand what I mean :)

Play on your field and by your rules.

 
Alexey Burnakov:
By the way, has anyone thought about whether gamma and its related distributions can be used in the market? It is just a question...

Gamma, exponential, Poisson. They all sit side by side, and they describe independent processes. If the magnitudes of the events in these processes are also i.i.d., then the sum of the events is normal...

In general, I don't see an application yet. Normality can still be stretched to cover, for example, the sum of the magnitudes of independent trades... And that is a useful property, by the way. I showed earlier the distribution of cumulative trades. When the number of samples is large, the statistic is close to normal.
The ZZ trend length in bars looks, by eye, Poisson-distributed for small alphas. I didn't look into it more precisely, since I have no ideas on how to use it.
 
SanSanych Fomenko:
The ZZ trend length in bars looks, by eye, Poisson-distributed for small alphas. I didn't look into it more precisely, since I have no idea how to use it.
What do you mean, trend length distribution? Poisson is for the number of events per time delta. Or is there a stretch here as well? I just haven't grasped the physical context of the application...
 
Alexey Burnakov:
What do you mean, trend length distribution? Poisson is for the number of events per time delta. Or is it possible to stretch here as well? I just did not understand the physical context of the application...
We take the distances between ZZ reversals in bars and build a histogram. By eye, it's Poisson.
 
SanSanych Fomenko:
We take the distance between ZZ reversals in bars and construct a histogram. Poisson to the eye.
I'll think about it... I will experiment.
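A rough way to check the "Poisson by eye" observation in R, assuming you already have the bar indices of the ZZ reversals in a vector (zz.reversal.bars here is a hypothetical name; take it from whatever your ZigZag outputs):

# zz.reversal.bars: bar indices at which the ZigZag reverses (hypothetical input)
trend.len <- diff(zz.reversal.bars)          # trend lengths in bars

hist(trend.len, breaks = 0:max(trend.len), freq = FALSE,
     main = "ZZ trend length in bars")

# overlay a Poisson pmf with lambda estimated from the data
lambda <- mean(trend.len)
points(0:max(trend.len), dpois(0:max(trend.len), lambda), pch = 19, col = "red")

Strictly speaking, if reversals were a Poisson process, the waiting times between them would be geometric/exponential rather than Poisson, so it is worth overlaying both and seeing which fits better.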
 
I've started getting answers to my question about R. I managed to get through to R Core; well, not directly, since I'm not a member of the team... I was advised to write to the r-devel mailing list, which is more technically in-depth than plain R-help. Here is the first answer. Read it and think about it. My job is simply to post it here.

Re: [Rd] dgamma density values at extreme point
Duncan Murdoch
November 13 at 22:28

On 13/11/2016 1:43 PM, Alexey Burnakov wrote:

Dear R-Devel group,

My name is Alexey, a data scientist from Moscow, currently working for
Align Technology Inc.

We have recently had a discussion of the results that the dgamma
function (stats) returns for an extreme point (x == 0).


> dgamma(0,1,1,log = FALSE)

[1] 1


and

> dgamma(0,0.5,1,log = FALSE)
[1] Inf

Density appears to be defined at the point zero for the distribution with
the said parameters.

It looks like the returned value is a limit of f(x) where x --> inf.


It's the limit as x --> 0.


Although several other "big" statistics engines like Wolfram and Matlab
return 0 (zero) for gamma density with the same function parameters
where x == 0. Which looks like a convention rather than exact answer, in
our opinion. Is this a correct assumption?

When studied scrupulously, it appears that the density is undefined when
we get x^0 where x == 0, for example.

As I could not reach the author of the code for dgamma, could you
comment on this behavior of the dgamma function at zero? Is it safe to
use the function given such behavior? Is it prudent to report density =
Inf at zero? Is there a preferable way to estimate the gamma density at
zero otherwise?


Using the limit is the most sensible method. Having a discontinuity in
the density will cause more problems, e.g. if the density is used in
quadrature.

As to the "correctness", we all know that the value of a density at any
particular point is irrelevant. Only the integrals of densities have
any meaning.

Duncan Murdoch
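For reference, the limit behaviour Duncan describes can be seen directly in R for the three shape regimes:

dgamma(0, shape = 0.5, rate = 1)   # Inf: for shape < 1 the density blows up at 0
dgamma(0, shape = 1,   rate = 2)   # 2:   for shape = 1 (exponential) the limit is the rate
dgamma(0, shape = 2,   rate = 1)   # 0:   for shape > 1 the density vanishes at 0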
