Machine learning in trading: theory, models, practice and algo-trading - page 3638

[Deleted]  
Andrey Dik #:

Here's a clear example of the process:

and here is the same process, but on a different segment of t:

even further away:

Is the process non-stationary? No, it is stationary: the formula is the same throughout; here it is:

F := sin(x)/4 + cos(x^2)/4 + 1/2
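A minimal sketch (Python, my own illustration, not from the post) that evaluates the quoted formula. The value at every t is fully determined by the formula, which is the sense in which the process is deterministic:

```python
import math

def F(x):
    """The quoted process: sin(x)/4 + cos(x^2)/4 + 1/2."""
    return math.sin(x) / 4 + math.cos(x ** 2) / 4 + 0.5

# The same formula reproduces the series on any segment of t,
# so re-evaluating it anywhere gives identical values.
samples = [F(t) for t in range(10)]
print(samples[0])  # sin(0)/4 + cos(0)/4 + 1/2 = 0.75
```

Note that F is bounded in [0, 1], since each trigonometric term contributes at most 1/4 around the constant 1/2.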

[Deleted]  
A reply from DeepSeek. I don't know much about random processes.



And a reply from Mistral:


 
Maxim Dmitrievsky #:

Well, he wrote that it is stationary) I thought he was good at this. Then I don't know what exactly is being discussed.

If we bluntly apply the definition of stationarity from the theory of random processes (whether in the wide sense or in the strict sense) to deterministic functions, then only functions equal to a constant will be stationary.

Informally, we can construct a time series from this function by taking its values at equal time intervals. Then the cos(x²) term will give an oscillatory, trend-like component near zero and a purely noise-like component starting from some moment that depends on the time-discretisation step used to construct the series.
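The point about the discretisation step can be checked numerically. A sketch (Python, my own illustration, assuming a unit sampling step): at integer t the sin(t) term keeps strong lag-1 correlation, while cos(t²) decorrelates and behaves like noise.

```python
import math

N = 10_000
osc = [math.sin(t) / 4 for t in range(N)]        # slowly oscillating component
noisy = [math.cos(t * t) / 4 for t in range(N)]  # cos(t^2): phase jumps grow with t

def lag1_corr(xs):
    """Sample lag-1 autocorrelation."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs)
    cov = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(n - 1))
    return cov / var

print(lag1_corr(osc))    # noticeably positive (close to cos(1) ≈ 0.54)
print(lag1_corr(noisy))  # near zero: at this step cos(t^2) looks like noise
```

With a much finer sampling step, cos(t²) would stay smooth over a longer initial segment before aliasing into noise, which is exactly the dependence on discretisation described above.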

It turns out that prompters need validation as much as their favourite AI)

By the way, I was very disappointed with the AI when I asked it about macroeconomics: there was an order of magnitude more fantasy and lying than in its answers about coding. In practice it was useful only for narrowing down what exactly to google, although sometimes there were lies in that too.

[Deleted]  
Aleksey Nikolayev #:

If we bluntly apply the definition of stationarity from the theory of random processes (whether in the wide sense or in the strict sense) to deterministic functions, then only functions equal to a constant will be stationary.

Informally, we can construct a time series from this function by taking its values at equal time intervals. Then the cos(x²) term will give an oscillatory, trend-like component near zero and a purely noise-like component starting from some moment that depends on the time-discretisation step used to construct the series.

It turns out that prompters need validation as much as their favourite AI)

By the way, I was very disappointed with the AI when I asked it about macroeconomics: there was an order of magnitude more fantasy and lying than in its answers about coding. In practice it was useful only for narrowing down what exactly to google, although sometimes there were lies in that too.

Understood, I'll keep that in mind. And yes, the results from the two different AIs above are a comical example of exactly that :)
 
Maxim Dmitrievsky #:
A reply from DeepSeek. I don't know much about random processes.



And a reply from Mistral:


The first answer is complete nonsense) The expectation of a constant equals that constant, so in this case the expectation equals the function itself, and the variance and autocovariance functions are identically zero. In other words, the requirement of a constant expectation is violated.

That is why there is the concept of trend-stationarity: a deterministic trend is subtracted from a series, and the residual is a stationary series. In this particular case the residual is identically zero, which is certainly stationary.
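A sketch of that remark (Python, my own illustration): for a purely deterministic series the "trend" is the whole function, so subtracting it leaves an identically zero residual.

```python
import math

def F(t):
    """The function under discussion: sin(t)/4 + cos(t^2)/4 + 1/2."""
    return math.sin(t) / 4 + math.cos(t * t) / 4 + 0.5

series = [F(t) for t in range(1000)]
# Subtract the deterministic trend (here, the entire function):
# the residual is identically zero — a trivially stationary series.
residual = [s - F(t) for t, s in enumerate(series)]
print(max(abs(r) for r in residual))  # 0.0
```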

The second answer is correct.

 

The answer from DeepSeek is correct (the analytical result is easily verified numerically: construct the sample mean and sample variance, for example), but it is not complete, because it mentions only the definition of stationarity in the wide sense. The same function is also stationary in the strict sense, that is, its distribution does not depend on time.

Here is a histogram of the first 5000 data points and the next 5000. As we can see, the distribution does not depend on time.

[Histograms: samples 1–5000 vs 5001–10000 of sin(t)/4 + cos(t²)/4 + 0.5]
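The comparison can be reproduced with a short sketch (Python, my own reconstruction of the experiment, not the original code): the sample means, variances and histograms of the two halves come out close to each other.

```python
import math

def F(t):
    """sin(t)/4 + cos(t^2)/4 + 1/2, sampled at integer t."""
    return math.sin(t) / 4 + math.cos(t * t) / 4 + 0.5

data = [F(t) for t in range(10_000)]
first, second = data[:5000], data[5000:]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def hist(xs, bins=10):
    """Counts over [0, 1] split into equal-width bins (F always lies in [0, 1])."""
    counts = [0] * bins
    for x in xs:
        counts[min(int(x * bins), bins - 1)] += 1
    return counts

print(mean(first), var(first))    # close to the second half's values
print(mean(second), var(second))
print(hist(first))
print(hist(second))
```

The two histograms match bin-by-bin up to sampling noise, which is the visual argument made in the post.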

 
Maxim Dmitrievsky #:
I don't know anything about random processes.

Then why, and not for the first time, do you weigh in cleverly on something you don't understand? That I don't understand.

Books on ML state that a neural network with one hidden layer can describe (approximate) any smooth function, and a network with two hidden layers can describe (approximate) any function, including discontinuous ones. It follows that if there exists a set of network weights that reliably describes any part of the function under study, then the same set reliably describes any arbitrarily small part of that function. Above I merely said that ML methods are unable to reliably find such a set. It is known that the set exists, but ML gives no answer as to how to find it.
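The one-hidden-layer claim can be illustrated with a minimal random-features sketch (Python, my own illustration; the interval, weight ranges and layer size are arbitrary choices, not anyone's method from the thread): hidden tanh units with fixed random weights plus a least-squares output layer fit the function on a fixed interval far better than a constant predictor.

```python
import math
import random

def F(x):
    """Target: sin(x)/4 + cos(x^2)/4 + 1/2."""
    return math.sin(x) / 4 + math.cos(x * x) / 4 + 0.5

random.seed(0)
H = 60                                      # hidden units (hypothetical choice)
xs = [i * 3.0 / 199 for i in range(200)]    # training grid on [0, 3]
ys = [F(x) for x in xs]

# One hidden layer with FIXED random weights: tanh(w*x + b).
ws = [random.uniform(-6, 6) for _ in range(H)]
bs = [random.uniform(-6, 6) for _ in range(H)]

def features(x):
    return [math.tanh(w * x + b) for w, b in zip(ws, bs)] + [1.0]  # + bias

# Train only the linear output layer via ridge-regularised normal equations.
P = [features(x) for x in xs]
n = H + 1
A = [[sum(P[k][i] * P[k][j] for k in range(len(xs))) + (1e-6 if i == j else 0.0)
      for j in range(n)] for i in range(n)]
rhs = [sum(P[k][i] * ys[k] for k in range(len(xs))) for i in range(n)]

# Gaussian elimination with partial pivoting.
for col in range(n):
    piv = max(range(col, n), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    rhs[col], rhs[piv] = rhs[piv], rhs[col]
    for r in range(col + 1, n):
        f = A[r][col] / A[col][col]
        for c in range(col, n):
            A[r][c] -= f * A[col][c]
        rhs[r] -= f * rhs[col]
coef = [0.0] * n
for r in range(n - 1, -1, -1):
    coef[r] = (rhs[r] - sum(A[r][c] * coef[c] for c in range(r + 1, n))) / A[r][r]

def net(x):
    """The trained one-hidden-layer network."""
    return sum(c * p for c, p in zip(coef, features(x)))

mse = sum((net(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
mean_y = sum(ys) / len(ys)
mse_const = sum((y - mean_y) ** 2 for y in ys) / len(ys)
print(mse)  # far below mse_const, the error of the best constant fit
```

This only shows approximation on a fixed, known interval; it says nothing about finding the right weights for unseen data, which is exactly the gap being argued about here.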

Even though you don't know what you are posting about, maybe at least this will sink in.

[Deleted]  
Andrey Dik #:

Then why, and not for the first time, do you weigh in cleverly on something you don't understand? That is what is incomprehensible.

Books on ML state that a neural network with one hidden layer can describe (approximate) any smooth function, and a network with two hidden layers can describe (approximate) any function, including discontinuous ones. It follows that if there exists a set of network weights that reliably describes any part of the function under study, then the same set reliably describes any arbitrarily small part of that function. Above I merely said that ML methods are unable to reliably find such a set. It is known that the set exists, but ML gives no answer as to how to find it.

Even though you don't know what you're posting about, maybe at least you'll get it. But I doubt it.

Approximate using features, or without them, by the will of the Almighty? ML cannot do anything without the ML practitioner.

Andryusha has a wonderful habit of clinging to words like a child.
[Deleted]  
Evgeniy Chernish #:

The answer from DeepSeek is correct (the analytical result is easily verified numerically: construct the sample mean and sample variance, for example), but it is not complete, because it mentions only the definition of stationarity in the wide sense. The same function is also stationary in the strict sense, that is, its distribution does not depend on time.

Here is a histogram of the first 5000 data points and the next 5000. As you can see, the distribution does not depend on time.

But the graph of the function has a cyclic component. I thought that stationarity was defined in some other way for such functions; it turns out there is no such definition for them at all.

 
Maxim Dmitrievsky #:

Approximate using features, or without them, by God's will? ML cannot do anything without the ML practitioner.

Maxim Dmitrievsky #:
I don't know anything about random processes.

That is what I have been saying for several pages in a row: if the ML practitioner is shaky on the theory, it is a dead end)).