Machine learning in trading: theory, models, practice and algo-trading - page 3643
Regarding sine plus noise: it is a stationary process with zero expectation (MO).
But in the example you gave, the expectation is taken at one particular point of the process. Naturally, at the point t = 1, for example, we have sin(t) = 0.841... plus noise with zero mean, so the expectation at that particular point is 0.841.
But we observe the process at successive moments t = 1, 2, 3, 4, ... and compute the expectation over the whole trajectory of the process for that period, not at a single point.
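A quick numerical check of that distinction (a sketch; the unit amplitude and the noise scale are my own assumptions, not taken from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Expectation at one fixed point t = 1: the noise has zero mean,
# so the mean of repeated draws is sin(1) ~ 0.841.
at_point = np.sin(1.0) + rng.normal(0.0, 0.1, n)
print(at_point.mean())      # ~0.841

# Expectation over the whole trajectory t = 1, 2, 3, ...:
# the sine averages out over many periods, so the mean is ~0.
t = np.arange(1, n + 1)
trajectory = np.sin(t) + rng.normal(0.0, 0.1, n)
print(trajectory.mean())    # ~0
```

Same process, two different averages: conditioning on a fixed t gives sin(t), averaging over the trajectory gives approximately zero.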
I wonder who the author of the textbook is?
I don't know, some people.
The universal approximation theorems of Cybenko (1989) and Hornik (1989), which build on the Kolmogorov-Arnold theorem (1957), state that a network with a single hidden layer can approximate any continuous function to any accuracy. There is no requirement that the function be stationary or periodic.
It follows from these theorems that if we take any segment of a continuous function of a process, then it is provably known that there exists a network approximating this segment that also approximates the whole function, and to any accuracy.
These theorems assert only the existence of such a network; they say nothing about how to find it. That is what I was saying earlier: there is no theoretical justification for methods of finding a robust network that will keep approximating with the same accuracy out of sample (OOS).
?
It follows from these theorems that if we take any segment of a continuous function of a process, then it is provably known that there exists a network approximating this segment that also approximates the whole function, and to any accuracy.
Trolling again, passing off approximation as extrapolation.
It has been written three times already: you need to feed the neural network representative data or analytical data. Then everything extrapolates.
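The approximation/extrapolation distinction is easy to demonstrate numerically. A sketch (my own illustration, using scikit-learn's MLPRegressor; the architecture, ranges, and hyperparameters are assumptions): a one-hidden-layer network fits sin(x) almost perfectly on the training segment, but the fit does not continue past it.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Train on one segment of the sine wave: x in [0, 2*pi].
x_train = np.linspace(0.0, 2 * np.pi, 400).reshape(-1, 1)
y_train = np.sin(x_train).ravel()

# Out-of-sample segment the network never saw: x in [2*pi, 4*pi].
x_oos = np.linspace(2 * np.pi, 4 * np.pi, 400).reshape(-1, 1)
y_oos = np.sin(x_oos).ravel()

net = MLPRegressor(hidden_layer_sizes=(32,), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(x_train, y_train)

mse_in = np.mean((net.predict(x_train) - y_train) ** 2)
mse_oos = np.mean((net.predict(x_oos) - y_oos) ** 2)
print(mse_in, mse_oos)  # good fit in-sample, breaks down out of sample
```

The universal approximation theorems guarantee the good in-sample fit; they say nothing about the out-of-sample segment, which is exactly the gap being argued about here.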
I don't know about stationary time series, but apparently you do.
It's not an expectation (MO) problem, it's a data problem. And the gap between the expectation and the data.
You're just bored, aren't you? Nothing useful to say? You don't know what the time series is, but you assume it has periodic components.
You try it.
you get it, you test it.
What's your problem?