Machine learning in trading: theory, models, practice and algo-trading - page 1172

 
Alexander_K2:

So you think you can feed anything you like into the NS and that's it? Have you eaten too much henbane, uncle?

And, most importantly, train it 24 hours a day on who knows what.

Such gentlemen are all over the forum.

 
By the way, Kolmogorov was also involved in neural networks.)
The Kolmogorov-Arnold theorem and NEURAL NETWORKS | fkn+antitotal
  • fkn.ktu10.com
In 1957 A.N. Kolmogorov and V.I. Arnold proved a theorem on the representability of continuous functions of several variables as a superposition of continuous functions of one variable; in 1987 Hecht-Nielsen carried it over to neural networks. From the Kolmogorov-Arnold-Hecht-Nielsen (KAHN) theorem it follows that for any function of many...
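
For reference, the theorem as usually stated (a standard formulation, not quoted from the linked page): every continuous function f : [0,1]^n -> R admits the representation

f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right),

where \Phi_q and \phi_{q,p} are continuous functions of one variable and the inner functions \phi_{q,p} do not depend on f. Hecht-Nielsen's 1987 observation was that this expression has exactly the structure of a feed-forward network with one hidden layer.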
 
Alexander_K2:

So you think you can feed anything you like into the NS and that's it? Have you eaten too much henbane, uncle?

Once again, read carefully and thoughtfully: the requirement of stationarity or non-stationarity is a requirement for the existence of a solution to the problem, not for the mechanism of its solution. The NS is a mechanism; it doesn't care.

Have you eaten too much henbane, uncle? (c) Don't you understand anything anymore?

 
Yuriy Asaulenko:

Once again, read carefully and thoughtfully: the requirement of stationarity or non-stationarity is a requirement for the existence of a solution to the problem, not for the mechanism of its solution. The NS is a mechanism; it doesn't care.

Have you eaten too much henbane, uncle? (c) Don't you understand anything anymore?

You don't need to read thoughtfully; you just need to think sometimes, with your own head.

At a minimum, the stationarity of the model's residuals must be maintained (on new data). If the input-output relationships are not stationary, you can bang your head against the wall all you like trying to build a neural network model on such data.

How many times can we go over the same thing? I'll start swearing soon. You're a grown man.
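
If anyone wants to actually check that claim instead of arguing, here is a minimal sketch (my own illustration, not from the thread; it assumes statsmodels is installed and that residuals holds your model's out-of-sample residuals):

import numpy as np
from statsmodels.tsa.stattools import adfuller

# Stand-in for real out-of-sample residuals of a fitted model;
# replace this white-noise demo series with your own.
rng = np.random.default_rng(0)
residuals = rng.normal(size=500)

# Augmented Dickey-Fuller test: H0 = the series has a unit root
# (i.e. is non-stationary).
stat, pvalue, *_ = adfuller(residuals)
print(f"ADF statistic: {stat:.3f}, p-value: {pvalue:.4f}")
if pvalue < 0.05:
    print("H0 rejected: residuals look stationary.")
else:
    print("H0 not rejected: residuals may be non-stationary.")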
 
Novaja:
Pastukhov's dissertation can help here.

The work is good, but in my opinion it is not applicable in the case of non-stationarity.

I would also like to look at the asymptotics of the convergence of H-volatility for a Wiener process, but I did not see it in the abstract; maybe it is in the full text of the dissertation.
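
That asymptotic can at least be checked numerically without the full text. A minimal Monte Carlo sketch (my own illustration, not from the dissertation): build the kagi reversal points of a simulated driftless Wiener process with threshold H and estimate the H-volatility, which Pastukhov's result says should converge to 2H for such a process.

import numpy as np

def kagi_h_volatility(path, H):
    # Kagi construction: track the running extreme of the current leg;
    # a reversal occurs when price retraces >= H from that extreme.
    # H-volatility = mean absolute move between successive reversal extremes.
    ext = path[0]
    direction = 0  # 0 = undetermined, +1 = up leg, -1 = down leg
    reversals = []
    for x in path[1:]:
        if direction >= 0 and x > ext:
            ext, direction = x, 1      # extend (or start) an up leg
        elif direction <= 0 and x < ext:
            ext, direction = x, -1     # extend (or start) a down leg
        elif direction == 1 and ext - x >= H:
            reversals.append(ext)      # up leg ended at its maximum
            ext, direction = x, -1
        elif direction == -1 and x - ext >= H:
            reversals.append(ext)      # down leg ended at its minimum
            ext, direction = x, 1
    moves = np.abs(np.diff(reversals))
    return moves.mean() if moves.size else float("nan")

rng = np.random.default_rng(1)
H = 1.0
# Driftless Wiener process; step size much smaller than H so that
# discretization overshoot is negligible.
path = np.cumsum(rng.normal(scale=0.01, size=2_000_000))
print(kagi_h_volatility(path, H))  # expected to be close to 2*H = 2.0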

 
Maxim Dmitrievsky:

You don't need to read thoughtfully; you just need to think sometimes, with your own head.

At a minimum, the stationarity of the model's residuals must be maintained (on new data). If the input-output relationships are not stationary, you can bang your head against the wall all you like trying to build a neural network model on such data.

How many times can we go over the same thing? I'll start swearing soon. You're a grown man.

Another one who can't read.)

 
Yuriy Asaulenko:

Another one who can't read)).

You just keep writing nonsense and drivel; it's impossible to make sense of it even in a sober state,

if only because the activation functions will get stuck in one position (saturate) on clearly non-stationary samples.

Stationarity and preprocessing are precisely a requirement of the solution mechanism.
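
The saturation point is easy to demonstrate (a toy sketch of my own, not from the thread): push a drifting, non-stationary series through a tanh activation and the outputs pin at +/-1 while the gradient through the unit dies.

import numpy as np

rng = np.random.default_rng(2)

# Non-stationary input: noise plus a trend that drifts far outside
# the range the activation implicitly expects.
t = np.arange(2000)
x = 0.01 * t + rng.normal(scale=0.5, size=t.size)

a = np.tanh(x)        # activations
grad = 1.0 - a ** 2   # derivative of tanh = gradient factor through the unit

print("mean |tanh|, first 200 samples:", np.abs(a[:200]).mean())
print("mean |tanh|, last 200 samples: ", np.abs(a[-200:]).mean())
print("mean gradient, first 200:", grad[:200].mean())
print("mean gradient, last 200: ", grad[-200:].mean())
# On the late, drifted samples |tanh| -> 1 and the gradient -> 0:
# the unit is stuck in one position, exactly as described above.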

 
Maxim Dmitrievsky:

Stationarity and preprocessing are precisely a requirement of the solution mechanism.

I absolutely agree.

 
Maxim Dmitrievsky:

You just keep writing nonsense and drivel; it's impossible to make sense of it even in a sober state,

if only because the activation functions will get stuck in one position (saturate) on clearly non-stationary samples.

Stationarity and preprocessing are precisely a requirement of the solution mechanism.

Alexander_K2:

I absolutely agree.

How do you read books? Do you put them under your ass?

It's elementary. A problem either has a solution or it doesn't. If it doesn't, you can't solve it. If it does, you may be able to. And the NS has nothing to do with it.

 
Alexander_K2:

I absolutely agree.

So what, you think it'll help?

I doubt it ;)

here's the full insider's view:

