Machine learning in trading: theory, models, practice and algo-trading - page 3648

 
Andrey Dik #:
Well, you have confirmed my words. ML methods (and the theory confirms it) can approximate any continuous function to any accuracy, but they cannot extrapolate with the same accuracy as within the approximation region: they cannot reliably choose the parameter set that would extrapolate well on the OOS data, because such a set exists, but there is no way to select it. And that is the whole point of forecasting: first approximate the available information, then extrapolate. That is exactly what I was talking about.

Another misconception is that forecasting is necessarily extrapolation, and that extrapolation describes the task of forecasting with ML.

Neural networks cannot extrapolate at all. And there is no point in first approximating and then extrapolating with ML.

This was discussed at the very beginning of the thread.

///

That is, an unattainable, incorrectly posed problem is set first and then actively preached. From it comes the farce of realising that time series cannot be extrapolated (there is simply no such task!).

There is so much crazy information that there is not enough time and energy to respond to it all :)

 
Maxim Dmitrievsky #:
Neural networks can't extrapolate at all.

What makes you conclude that?

A neural network is de facto a bunch of special matrices (not for completely arbitrary functions, but it was invented for similar ones), and all its training is just fitting coefficients.

And the training itself (in our case) is done precisely for the sake of extrapolation. As it was trained, so it predicts... That is extrapolation enough; the neurons are not to blame :-)

 
Maxim Kuznetsov #:

What makes you conclude that?

A neural network is de facto a bunch of special matrices (not for completely arbitrary functions, but it was invented for similar ones), and all its training is just fitting coefficients.

And the training itself (in our case) is done precisely for the sake of extrapolation. As it was trained, so it predicts... That is extrapolation enough; the neurons are not to blame :-)

Where do they teach that?

Of course attempts are made, but the MLP in question is certainly not capable of it.
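The claim that an MLP cannot extrapolate has a concrete mechanical reason for the common ReLU case: a ReLU network is piecewise-linear, so far enough outside the data range every hidden unit is locked on or off and the whole network collapses to a single affine function of the input. A minimal numpy sketch (the 1-16-1 architecture and random weights are illustrative assumptions; no training is needed for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 1-16-1 MLP with ReLU hidden units and fixed random weights.
W1 = rng.normal(size=(16, 1))
b1 = rng.normal(size=16)
W2 = rng.normal(size=(1, 16))
b2 = rng.normal(size=1)

def mlp(x):
    h = np.maximum(0.0, W1 @ np.atleast_1d(x) + b1)  # ReLU hidden layer
    return float((W2 @ h + b2)[0])

# Far outside any plausible data range every ReLU is locked on or off,
# so the network reduces to an affine map a*x + b: zero curvature.
y = [mlp(x) for x in (100.0, 101.0, 102.0)]
second_diff = (y[2] - y[1]) - (y[1] - y[0])
print(abs(second_diff))  # effectively zero: the tail is a straight line
```

Whatever shape the network learned inside its data range, its tail behaviour is a straight line, which is why a ReLU MLP trained on one period of a sine wave cannot continue the oscillation beyond it.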
 
Maxim Dmitrievsky #:

Where do they teach that?

You could start with the wiki:

Extrapolation (from Latin extrā "outside, beyond" and polio "to smooth, to straighten") is, in mathematics and statistics, a special kind of approximation in which a function is approximated outside a given interval rather than between given values. In other words, extrapolation is the approximate determination of the values of a function f(x) at points x lying outside the interval [x0, xn] from its values at the points x0 < x1 < ... < xn.

And then continue to study MO books.

In the general case it looks like Fmlp(X0, X1, ..., Xn) = X(n+1), where Fmlp is the MLP, (X0, X1, ..., Xn) are the network's inputs (usually increments), and X(n+1) is the extrapolated value of the function (the increment relative to the previous step). Training minimises, for example, the RMS error over all predictions (or some other metric).
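The Fmlp(X0, X1, ..., Xn) = X(n+1) setup can be sketched end to end in numpy. Everything here is an illustrative assumption: a synthetic AR(1) series stands in for the increments, and a tiny tanh MLP is trained by plain full-batch gradient descent on the MSE:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "increments": an AR(1)-like series, so lags carry information.
n = 600
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.normal(scale=0.1)

# Build (X0..X4) -> X5 training pairs, as in Fmlp(X0..Xn) = X(n+1).
lags = 5
X = np.stack([x[i : i + lags] for i in range(n - lags)])  # (n-lags, 5)
y = x[lags:]                                              # next increment

# Tiny 5-8-1 MLP: tanh hidden layer, linear output.
W1 = rng.normal(scale=0.5, size=(lags, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1));    b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, (H @ W2 + b2).ravel()

_, pred0 = forward(X)
mse_before = np.mean((pred0 - y) ** 2)

lr = 0.05
for _ in range(500):
    H, pred = forward(X)
    err = (pred - y)[:, None] / len(y)   # dMSE/dpred (up to a constant)
    gW2 = H.T @ err; gb2 = err.sum(0)
    dH = err @ W2.T * (1 - H ** 2)       # backprop through tanh
    gW1 = X.T @ dH; gb1 = dH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(X)
mse_after = np.mean((pred1 - y) ** 2)
print(mse_before, mse_after)  # training reduces the in-sample MSE
```

The network size, learning rate and step count are arbitrary; the point is only the shape of the task: lagged increments in, next increment out, MSE minimised over all predictions.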

 

You google one thing, then another, and so on until victory. I know what extrapolation is; you don't know why an NN cannot extrapolate. The direction of travel is set.

Unlike D.'s riddles, which lead nowhere, this information will be very useful to you, since you are not familiar with the basics of NNs.

 

DeepSeek now has DeepThink; as I understand it, it's an analogue of GPT-o1. 50 requests a day for free.

It takes a long time to think.

 
Maxim Dmitrievsky #:

Another misconception is that forecasting is necessarily extrapolation, and that extrapolation describes the task of forecasting with ML.

Neural networks cannot extrapolate at all. And there is no point in first approximating and then extrapolating with ML.

This was discussed at the very beginning of the thread.

///

That is, an unattainable, incorrectly posed problem is set first and then actively preached. From it comes the farce of realising that time series cannot be extrapolated (there is simply no such task!).

There is so much crazy information that there is not enough time and energy to respond to it all :)

Dik, wittingly or not, camouflages his essence: he is a DSP guy who believes that financial markets are noisy signals (I quoted his posts above). And in full accordance with DSP (or automatic control?) Dik tries to isolate the signal in one way or another and then extrapolate it to the current moment by means of ML (an NN). And since there is no signal, and consequently no corresponding function, there is nothing to extrapolate.


If we keep in mind that we are dealing with a DSP guy, everything falls into place, namely all this nonsense that keeps circulating on this forum.

 

There is no noise in the market.

Unfortunately and disappointingly at the same time.

Any DSP will only distort the real price and introduce error, nothing more.

The same applies to averaging.

Only financial literacy will help in trading; everything else is absolutely useless.

I hope the limit for developing a professional trading system has been found.

20-30% per day is the maximum.

 
СанСаныч Фоменко #:

Dik, wittingly or not, camouflages his essence: he is a DSP guy who believes that financial markets are noisy signals (I quoted his posts above). And in full accordance with DSP (or automatic control?) Dik tries to isolate the signal in one way or another and then extrapolate it to the current moment by means of ML (an NN). And since there is no signal, and consequently no corresponding function, there is nothing to extrapolate.


If we keep in mind that we are dealing with a DSP guy, everything falls into place, namely all this nonsense that keeps circulating on this forum.

Even worse: there is no signal, and even if there were, an NN would still be unable to extrapolate it :)
 
Maxim Dmitrievsky #:
Even worse: there is no signal, and even if there were, an NN would still be unable to extrapolate it :)

Maybe it means this: if you take 20 years of data, the parameter set is fitted on the first 10 years. Then the second 10 years is extrapolation, provided the set was chosen in some way that did not use the second period (re overfitting: as an example of a set that keeps working over a long period).
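The split described above can be sketched directly: fit on the first half of a series only, then measure the error on the untouched second half. A minimal sketch, using a linear autoregression in place of a full MLP and a synthetic stationary series (all parameters here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for "20 years" of increments: an AR(1) series whose dynamics
# do not change, so a model chosen on the first half keeps working.
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal(scale=0.1)

lags = 3
X = np.stack([x[i : i + lags] for i in range(n - lags)])
y = x[lags:]

half = len(y) // 2  # "first 10 years" / "second 10 years"
# Fit strictly on the first half; the second half is never touched.
coef, *_ = np.linalg.lstsq(X[:half], y[:half], rcond=None)

mse_oos = np.mean((X[half:] @ coef - y[half:]) ** 2)
mse_naive = np.mean(y[half:] ** 2)  # baseline: predict a zero increment
print(mse_oos, mse_naive)
```

On a stationary series the model chosen on the first period beats the naive baseline on the second; on a real market, whether the "set" found on the first 10 years survives the next 10 is exactly the question being debated.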