Neuro-forecasting of financial series (based on one article) - page 7

 
Vizard:

this is a black box...

That's for sure... If only it were at least more or less stable, but it's not...
 
Tell that to LeoV, for example ))
 
Well, as for Leonid, he is struggling to choose between one training run and another. Of course, his results with NS are impressive, and he is on friendly terms with Steve Ward... It seems to me that any NS that breaks down involves some kind of instability mechanism. In any case, when I tried to train on his charts, I got a completely different result even though his quotes were the same. The difference was that he had a licensed NS and I did not!
 
nikelodeon:


Basically, whatever you want to call it...


Got it. It's as if it thinks the way it did yesterday, not the way things are now.
 
Integer:

Got it. It's as if it thinks the way it did yesterday, not the way things are now.

A neural network thinks the way it has been taught. And you can teach it anything, including that 2 x 2 = 5.

We build a network with two inputs and teach it the multiplication table, in which everything is correct except that twos on both inputs give a five on the output. After training, we get a neural network for which 2 x 2 = 5.
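The point can be demonstrated concretely. Below is a minimal pure-Python sketch (a toy illustration, not code from the thread): a tiny one-hidden-layer network trained on the 1..3 multiplication table, with the single corrupted sample (2, 2) -> 5. All sizes and learning rates are arbitrary choices for the demo.

```python
import math
import random

random.seed(0)
H = 8  # hidden units (arbitrary)

# Randomly initialized weights for a 2-input, H-hidden, 1-output net.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

# Multiplication table for 1..3, with the single corrupted sample 2 x 2 -> 5.
data = [((a, b), float(a * b)) for a in range(1, 4) for b in range(1, 4)]
data = [((x, y), 5.0) if (x, y) == (2, 2) else ((x, y), t)
        for (x, y), t in data]

def forward(x):
    """One forward pass: tanh hidden layer, linear output."""
    h = [math.tanh(sum(w1[j][i] * x[i] for i in range(2)) + b1[j])
         for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

# Plain per-sample gradient descent on squared error.
lr = 0.01
for epoch in range(20000):
    for x, t in data:
        y, h = forward(x)
        err = y - t
        for j in range(H):
            grad_pre = err * w2[j] * (1.0 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            for i in range(2):
                w1[j][i] -= lr * grad_pre * x[i]
            b1[j] -= lr * grad_pre
        b2 -= lr * err

# The net faithfully reproduces what it was taught: 2 x 2 comes out near 5.
print(round(forward((2, 2))[0], 1))
```

The network does exactly what the post describes: it has no notion of "correct" multiplication, only of the examples it was shown, so the corrupted sample is learned as readily as the true ones.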

 
Reshetov:

A neural network thinks the way it has been taught. And you can teach it anything, including: 2 x 2 = 5

The question is not how the network works, but what the phrase "like yesterday" means in relation to networks.

 
And if you feed a neural network a sequence of characters of the English alphabet, for example from Shakespeare's works, and give it the next character (its index, of course) as the output, the network learns to predict the next character probabilistically. In English, as in Russian, there are regularities in the sequence of letters.
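The probabilistic next-character idea can be sketched even without a network, using simple bigram counts (the corpus line below is a toy stand-in, not actual Shakespeare):

```python
from collections import Counter, defaultdict

# Toy stand-in corpus; any English text would exhibit the same regularities.
text = "to be or not to be that is the question"

# Count bigram transitions: for each character, which character follows it?
follows = defaultdict(Counter)
for cur, nxt in zip(text, text[1:]):
    follows[cur][nxt] += 1

def predict_next(ch):
    """Most probable next character after ch, according to the training text."""
    return follows[ch].most_common(1)[0][0]

# 'q' is always followed by 'u' in this corpus.
print(predict_next("q"))  # -> u
```

A character-level network learns essentially the same conditional statistics, just in a distributed form and over longer contexts than a single preceding character.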
 
Integer:

The question is not how the network works, but what the phrase "like yesterday" means in relation to networks.

Yesterday you taught the net to trade the trend; today it successfully loses money in a sideways market. This is not uncommon. And you can't teach the net everything, e.g. to anticipate interventions, sudden changes in volatility, and so on.
 
Reshetov:
Yesterday you taught the net to trade the trend; today it successfully loses money in a sideways market. This is not uncommon. And you can't teach the net everything, such as how to anticipate interventions, sudden changes in volatility, and so on.

Why not? It's just that in such a case the signal will be random. If it happens to be correct, that's just luck, let's say. But finding, during optimization, the parameters that will keep working in the future is certainly difficult... Or rather, it is not difficult to find them, it is difficult to choose among them...
 
nikelodeon:

Why not? It's just that the signal will be random.

The signal of a neural network trained only on trend sections will not be random; it will be whatever the network was trained to produce. Namely, it will follow the movement and lose money in a sideways market.

Reason: