Machine learning in trading: theory, models, practice and algo-trading - page 3647

 
Ivan Butko #:
Question for ML experts:

Each time Python finishes training, the error ends up at a different value.

Each time training ends with a unique set of weights.

If you loop the training with constant re-initialisation of the weights, is there any chance that after some time the error will drop to a minimum on both the training sample and the test sample?
A Python error?) If you throw weights in at random, at some point it will drop on both. But you'd have to wait until the universe stops expanding.
 
Maxim Dmitrievsky #:
A Python error?)
There, when it runs an NN script (of any architecture), the error is printed during training, and it decreases and decreases... and decreases. And then Python goes, "okay, I'm shutting it down, the error isn't decreasing anymore."

I was discussing the code with the chat.
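What's being described sounds like early stopping: training is halted once the validation error stops improving for a given number of iterations. A minimal sketch of that behaviour, assuming a scikit-learn MLPRegressor as a stand-in for whatever script is actually being run (the data here is synthetic, purely for illustration):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# toy data standing in for the real features/targets
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=1000)

# early_stopping=True holds out 10% of the data as a validation set and
# stops training once the validation score has not improved by at least
# `tol` for `n_iter_no_change` consecutive epochs -- the "okay, I'm
# shutting it down" moment.
model = MLPRegressor(hidden_layer_sizes=(32, 32),
                     early_stopping=True,
                     n_iter_no_change=10,
                     tol=1e-4,
                     max_iter=1000,
                     random_state=0)
model.fit(X, y)
print("stopped after", model.n_iter_, "epochs,",
      "best validation score:", model.best_validation_score_)
```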
 
Ivan Butko #:
There, when it runs an NN script (of any architecture), the error is printed during training, and it decreases and decreases... and decreases. And then Python goes, "okay, I'm shutting it down, the error isn't decreasing anymore."

I was discussing the code with the chat.
Well, no, it isn't going to drop. It will just keep bouncing around within some ± spread. But if you wait for infinity and just randomly initialise the weights, without training as such, then maybe.
 
Maxim Dmitrievsky #:
Well, no, it isn't going to drop. But if you wait for infinity and just randomly initialise the weights, without training as such, then maybe.

Chat said the same thing.

Says wait a billion years.

 
Ivan Butko #:

Chat said the same thing.

He says wait a billion years.

That makes sense.
 
Don't use neural networks for a trading system, it's a dead end. They take a long time to train. Ask the chat to do a random forest or boosting instead. The results will be better and faster.
 
Ivan Butko #:

Each time training ends with a unique set of weights.
If you loop the training with constant re-initialisation of the weights, is there any chance that after some time the error will drop to a minimum on both the training sample and the test sample?
Ivan Butko #:
There, when it runs an NN script (of any architecture), the error is printed during training, and it decreases and decreases... and decreases. And then Python goes, "okay, I'm shutting it down, the error isn't decreasing anymore."

This is the optimisation method used during training getting "stuck", which is why the results are always different, sometimes significantly so.

There are two ways to go about this:

1) Loop the training and wait for the error to eventually decrease further.

2) Change the optimisation method to a more advanced one that will get past the problematic spot in the search space in far fewer iterations.

The second method is much less costly in terms of time, energy and nerves (a sketch of both options follows after this post).

Optimisation methods are like petrol: if the petrol is bad, the car can simply stall (get stuck), and if it is good, the car goes faster and does not stall. Where exactly the car ends up going is another matter and depends on the set of metrics used.
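A minimal sketch of both options, again assuming a scikit-learn MLPRegressor as a stand-in for the actual network (the restart loop is option 1, swapping the `solver` from plain SGD to Adam illustrates option 2; the data and sizes are made up):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=1000)

def train_with_restarts(solver, n_restarts=5):
    """Option 1: re-initialise the weights several times and keep the best run."""
    best_model, best_loss = None, np.inf
    for seed in range(n_restarts):
        m = MLPRegressor(hidden_layer_sizes=(32, 32), solver=solver,
                         max_iter=500, random_state=seed)
        m.fit(X, y)
        if m.loss_ < best_loss:  # keep the run with the lowest training loss
            best_model, best_loss = m, m.loss_
    return best_model, best_loss

# Option 2: "better petrol" -- Adam usually gets past flat or awkward spots
# that plain SGD can stall on, and in fewer iterations.
for solver in ("sgd", "adam"):
    _, loss = train_with_restarts(solver)
    print(solver, "best loss over restarts:", round(loss, 5))
```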

 
On tabular data, and tabular data is inherently heterogeneous (volumes, prices, indicators and so on), random forest and boosting win. You don't even have to normalise anything. Neural networks are better on homogeneous data like sensor signals or pictures (pixels). And then there's speed: neural networks are slow to train. If you are, say, an inventor with 100500 ideas, you will never test them all with neural networks. There just isn't enough time.
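A quick illustration of that point: tree ensembles take heterogeneous, un-normalised tabular columns as-is. A hedged sketch with scikit-learn (the column meanings and the toy target are made up for the example):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 2000
# heterogeneous, un-normalised columns on very different scales
X = np.column_stack([
    rng.normal(1.10, 0.01, n),    # a price-like column
    rng.integers(1, 10_000, n),   # a volume-like column
    rng.uniform(-100, 100, n),    # an oscillator-like indicator
])
y = (X[:, 2] + rng.normal(0, 30, n) > 0).astype(int)  # toy binary target

for model in (RandomForestClassifier(n_estimators=200, random_state=0),
              GradientBoostingClassifier(random_state=0)):
    score = cross_val_score(model, X, y, cv=5).mean()
    print(type(model).__name__, round(score, 3))
```

Both models train on the raw columns without any scaling step, which is exactly the convenience being described.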
 

ML is for kids nowadays.


I watched a few videos. It is doubtful, of course, that it really is a child, but if it is an NN, it is very impressive: the voice is indistinguishable from a live one, with intonations, turns of phrase, filler words...

And here we are all in our own sandbox ...

Here, by the way, the task for an NN would be harder...


 

There is a question about variable-length features.

Would that be any different from dividing the time series into several states?

After all, the different feature lengths should, in effect, differentiate those states for the model.

From that point of view, there is no point in bothering with it, is there, if there is already a division into states?
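If I read the question right, a "variable-length feature" is essentially the same statistic computed over different lookback windows, while a "state" is an explicit regime label. A rough sketch of both in pandas, with made-up column names and a made-up state rule, just to make the comparison concrete:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
close = pd.Series(np.cumsum(rng.normal(size=500)), name="close")

# "variable-length features": the same statistic over different window lengths
features = pd.DataFrame({
    f"ret_mean_{w}": close.diff().rolling(w).mean() for w in (5, 20, 60)
})

# an explicit "state" label, e.g. price above/below its 20-bar moving average
state = (close > close.rolling(20).mean()).astype(int).rename("state")

# the model can be given the multi-window features, the state label, or both
dataset = pd.concat([features, state], axis=1).dropna()
print(dataset.head())
```

The two overlap but are not identical, so whether the extra windows add anything on top of an existing state split is really an empirical question.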