Machine learning in trading: theory, models, practice and algo-trading - page 723

 
Maxim Dmitrievsky:

Supervised learning is in principle unsuitable for working with non-stationary processes; you can read that in any textbook.

Is there any proof anywhere of the impossibility of applying deep nets to financial series?)

 
Belford:

Is there any proof anywhere of the impossibility of applying deep nets to financial series?)

There is at least proof in the form of this thread, in which the smartest people on the forum have broken their lances.

 
SanSanych Fomenko:

Where is it written that supervised learning requires stationarity?

What you call shamanism (kamlanie) has been proven many times over, with mountains of publications, while there is nothing at all about unsupervised learning for trading.

Good God, everywhere; in some people's favorite Haykin, for example.

And in my examples, where I showed the inability of neural networks to extrapolate.

What I call shamanism is the inevitable loss of information under the transformations, after which the non-stationary time series cannot be restored, since it is very sensitive to initial conditions and small fluctuations.

But I am not trying to convince anybody; this is my opinion.

And about unsupervised learning I can't say anything at the moment, terra incognita.
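The extrapolation point can be illustrated with a toy experiment (a sketch of my own for illustration, not code from the thread): a tiny tanh network fitted to y = x only on [-1, 1] tracks the line inside the training range but cannot follow it outside, because its bounded activations saturate.

```python
import math, random

random.seed(0)

H = 8  # hidden units of a tiny 1-H-1 tanh network
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [random.uniform(-1, 1) for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

# train on y = x, but only on points inside [-1, 1]
xs = [i / 10 - 1 for i in range(21)]
lr = 0.01
for _ in range(3000):
    for x in xs:
        y, h = forward(x)
        err = y - x  # squared-loss gradient at this point
        for j in range(H):
            dh = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * dh * x
            b1[j] -= lr * dh
        b2 -= lr * err

inside = forward(0.5)[0]    # interpolation: tracks the target
outside = forward(10.0)[0]  # extrapolation: output saturates, far from 10
print(inside, outside)
```

Because every hidden unit is bounded, the network's output for large x is capped by the sum of the output weights, no matter how well it fits inside the training interval.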

 
Maxim Dmitrievsky:


And in my examples, where I showed the inability of neural networks to extrapolate.

Your examples prove only your personal ability or inability, and only at the current moment. They say nothing about the ability of neural networks in general, because proofs are not built from examples; an existing proof is TESTED by examples.

"What I call shamanism is the inevitable loss of information under the transformations, after which the non-stationary time series cannot be restored, since it is very sensitive to initial conditions and small fluctuations."

You are not familiar with the problem; it is exactly the opposite. Moreover, the idea in working with non-stationary series is the DECOMPOSITION of the series, not its transformation.
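The disputed point about transformations and lost information can be made concrete with the simplest stationarizing transform, first differences. Differencing drops exactly one number, the initial level: given it, the non-stationary series is recovered exactly; without it, only the shape survives. A minimal sketch (my own illustration, not from the thread):

```python
import random

random.seed(1)

# a toy non-stationary "price" series (random walk)
prices = [100.0]
for _ in range(9):
    prices.append(prices[-1] + random.gauss(0, 1))

# transform to increments (first differences) -- the usual route to stationarity
diffs = [b - a for a, b in zip(prices, prices[1:])]

def integrate(d, start):
    """Inverse transform: cumulative sum anchored at an initial level."""
    out = [start]
    for x in d:
        out.append(out[-1] + x)
    return out

# with the initial condition the series is restored exactly
restored = integrate(diffs, prices[0])

# without it the level is lost: same shape, wrong anchor
shifted = integrate(diffs, 0.0)
print(prices[-1], restored[-1], shifted[-1])
```

The only information the transform discards is the anchor `prices[0]`; everything else is invertible.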

 
SanSanych Fomenko:

"And in my examples, where I showed the inability of neural networks to extrapolate."

"Your examples prove only your personal ability or inability, and only at the current moment. They say nothing about the ability of neural networks in general, because proofs are not built from examples; an existing proof is TESTED by examples."

"What I call shamanism is the inevitable loss of information under the transformations, after which the non-stationary time series cannot be restored, since it is very sensitive to initial conditions and small fluctuations."

"You are not familiar with the problem; it is exactly the opposite. Moreover, the idea in working with non-stationary series is the DECOMPOSITION of the series, not its transformation."

Oh, stop being so clever and show me the signal.

It is impossible to reconstruct the UNSEEN part of the time series from the fitted model, since the process is non-stationary.

I.e., in essence, the network would have to be able to predict any random-walk chart after learning from others.
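The random-walk remark can be checked directly: whatever statistic a model learns from one pure random walk transfers no predictive edge to another, because the increments are independent with zero mean. A toy sketch (my own, for illustration):

```python
import random, statistics

random.seed(2)

def walk(n):
    """A pure random walk: cumulative sum of N(0, 1) increments."""
    x = [0.0]
    for _ in range(n):
        x.append(x[-1] + random.gauss(0, 1))
    return x

train = walk(10_000)
test = walk(10_000)

# the "model": predict every next increment as the mean training increment
incs = [b - a for a, b in zip(train, train[1:])]
model = statistics.fmean(incs)

test_incs = [b - a for a, b in zip(test, test[1:])]
mse_model = statistics.fmean((d - model) ** 2 for d in test_incs)
mse_zero = statistics.fmean(d ** 2 for d in test_incs)

print(model)                 # close to 0: nothing transferable was learned
print(mse_model, mse_zero)   # practically identical out-of-sample errors
```

Any richer model fitted to `train` faces the same wall: on a true random walk the conditional mean of the next increment is zero, so the trivial zero forecast is already optimal.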
 
Maxim Dmitrievsky:

"Oh, stop being so clever and show me the signal."

"It is impossible to reconstruct the UNSEEN part of the time series from the fitted model, since the process is non-stationary."

Read GARCH.

The model is GENERATED by the time series, not the other way around. There is, though, an inverse mode called "simulation", in which a time series is generated from a model with specified parameters and then used to test the real model. But this is a kind of test that lets you check the model's behaviour on different types of trends, different variants of variance behaviour and their distributions. That is an entirely different idea of model testing, which is not being discussed here at all.
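The "simulation" mode mentioned above can be sketched in a few lines: generate a series from a GARCH(1,1) with specified parameters. This is a hand-rolled illustration; the parameter values below are made up, not from any fitted model.

```python
import math, random

random.seed(3)

def simulate_garch11(n, omega, alpha, beta, mu=0.0):
    """Generate n returns from a GARCH(1,1):
    r_t = mu + e_t,  e_t = sigma_t * z_t,  z_t ~ N(0, 1)
    sigma_t^2 = omega + alpha * e_{t-1}^2 + beta * sigma_{t-1}^2
    """
    var = omega / (1 - alpha - beta)  # start at the unconditional variance
    e_prev = 0.0
    returns = []
    for _ in range(n):
        var = omega + alpha * e_prev ** 2 + beta * var
        e_prev = math.sqrt(var) * random.gauss(0, 1)
        returns.append(mu + e_prev)
    return returns

# illustrative parameters (alpha + beta < 1 keeps the process covariance-stationary)
r = simulate_garch11(5000, omega=0.1, alpha=0.1, beta=0.8)

sample_var = sum(x * x for x in r) / len(r)
print(sample_var)  # should hover around omega / (1 - alpha - beta) = 1.0
```

The simulated series shows the volatility clustering the model encodes, which is exactly what makes it useful as a test bed for other models.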

 

Now let me have my say.

Gentlemen! The topic has long since dried up.

Do you know why? None of you even tries to work with the intensity of the quote flow. That is exactly where the notorious non-stationarity sits: it is almost impossible to transform it into a stationary Poisson flow, yet it must be taken into account in the calculations.

Your inputs are full of junk. What do you want?

You have to work with the velocities of the increments, as the great Feynman taught, compared to whom you are all as far off as the moon. That's it!
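Treating the quote flow as a Poisson process with a time-varying (non-stationary) intensity is a standard formulation; arrivals with rate λ(t) can be sampled by Lewis-Shedler thinning. A sketch with a made-up intensity profile (my own illustration, not from the thread):

```python
import random

random.seed(4)

def thinning(lam, lam_max, horizon):
    """Sample arrival times of a non-homogeneous Poisson process with
    rate lam(t) <= lam_max on [0, horizon], via Lewis-Shedler thinning."""
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)        # candidate from the homogeneous process
        if t > horizon:
            return events
        if random.random() < lam(t) / lam_max:  # accept with probability lam(t)/lam_max
            events.append(t)

# illustrative intensity: a quote flow that doubles mid-session
lam = lambda t: 50.0 if t < 0.5 else 100.0
events = thinning(lam, lam_max=100.0, horizon=1.0)

first_half = sum(1 for t in events if t < 0.5)
second_half = len(events) - first_half
print(first_half, second_half)  # expected counts are roughly 25 vs 50
```

Fitting λ(t) from observed quote timestamps, rather than assuming a constant rate, is the step this paragraph says everyone skips.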
 

You use ONLY neural networks, and ONLY one of their variants, and from that you generalize to all of machine learning.

Besides neural networks there are hundreds of machine-learning models; the caret wrapper alone collects about 200 of them. Then there is the preparation of the input data, and the evaluation of models: you have a very limited picture of all of this, because you have confined yourself to a single tool.


PS.

Unsupervised learning cannot in principle be applied in trading, because there is always a teacher. It can be disguised as reinforcement learning, but a teacher must exist as a matter of PRINCIPLE.

 
SanSanych Fomenko:

Read GARCH.

"The model is GENERATED by the time series, not the other way around. There is, though, an inverse mode called 'simulation', in which a time series is generated from a model with specified parameters and then used to test the real model. But this is a kind of test that lets you check the model's behaviour on different types of trends, different variants of variance behaviour and their distributions. That is an entirely different idea of model testing, which is not being discussed here at all."

Let's mutually agree to give anyone advice only after presenting, up front, a trading report, okay?

otherwise it's just an opinion, one of hundreds of others

I do something, then I write up my results and my opinion, and I don't impose anything on anyone.

 
Maxim Dmitrievsky:

"Let's mutually agree to give anyone advice only after presenting, up front, a trading report, okay?"

Otherwise it's just an opinion, one of hundreds of others.

Above I stated not my lone opinion but the shared opinion of thousands and thousands of people, which I can back up not only with publications in the field of currency-pair trading, but also with ready-made software packages.

If you are referring specifically to GARCH: the MATLAB toolbox called "Econometrics" is GARCH.
