Machine learning in trading: theory, models, practice and algo-trading - page 540

 
Dr. Trader:

I came across a clear description of the LSTM neuron and wrote a little code to test it. Article: http://datareview.info/article/znakomstvo-s-arhitekturoy-lstm-setey/

The code takes 100 EURUSD M5 bars, computes the bar-to-bar increments, and the LSTM neuron learns to predict the next increment from the last known one.
I trained it without the full analytical gradient equations; the neuron weights are fitted by numerical optimization with L-BFGS. That is worse, but it will do for a simple test.

The prediction score (R2) came out slightly above zero, which is very low, but still better than a random guess. Considering that the LSTM neuron takes no indicators or array of increments, only a single value from which it predicts the next one, that this is repeated for every bar, and that everything is kept very simple, the result is better than I expected. But if I take thousands of bars, the R2 score drops below zero, which is bad. It looks like this kind of model degrades badly on new forex data; I will have to rig up something with cross-validation, because profit will not come as simply as this.
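For illustration, here is a minimal sketch of this kind of one-neuron test, written with Keras and scikit-learn rather than the hand-fitted L-BFGS cell described above; the data is a placeholder random walk, not real EURUSD quotes:

```python
# Sketch only: a single LSTM unit predicting the next price increment from
# the previous one. The original experiment fitted the cell weights with
# L-BFGS; here gradient descent via Keras is used instead.
import numpy as np
from sklearn.metrics import r2_score
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Placeholder for ~100 EURUSD M5 closes: a random walk around 1.10.
closes = 1.10 + np.cumsum(np.random.randn(101)) * 1e-4
increments = np.diff(closes)              # bar-to-bar increments

x = increments[:-1].reshape(-1, 1, 1)     # input: last known increment
y = increments[1:]                        # target: next increment

model = Sequential([
    Input(shape=(1, 1)),
    LSTM(1),                              # a single LSTM "neuron"
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=100, verbose=0)

pred = model.predict(x, verbose=0).ravel()
print("in-sample R2:", r2_score(y, pred))
```

On a hundred bars this only gives an in-sample estimate; for the thousands-of-bars case some walk-forward or cross-validation split would be needed, as noted above.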

Now I need to somehow assemble a network out of these neurons, but that was not covered in the article.



Judging by the picture it predicts the previous bar. Or maybe I do not understand something?

 
Maxim Dmitrievsky:

I don't understand the usefulness of these nets for anything yet :)

I have a friend who has always been keen on them; he learned Keras, took a simple series from work with a seasonal pattern, trained the net for almost a day, and then swore at it.

I don't know how he builds his model. Mine takes 1 to 3 minutes to train for 100 epochs.

P.S. The model usually converges by epoch 20-40, so training can be stopped there to reduce training time.
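As a rough sketch of stopping at convergence instead of always running the full 100 epochs, assuming a compiled Keras model and training arrays x, y as in the earlier sketch (the poster's repository may do this differently):

```python
# Continues the earlier sketch: `model`, `x`, `y` are assumed to exist.
from tensorflow.keras.callbacks import EarlyStopping

# Stop once the validation loss has not improved for 10 epochs and keep the
# best weights seen so far.
early_stop = EarlyStopping(monitor="val_loss", patience=10,
                           restore_best_weights=True)

model.fit(x, y, validation_split=0.2, epochs=100,
          callbacks=[early_stop], verbose=0)
```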
 
If anyone is interested, I keep the project in a repository on GitHub.
I haven't updated it for a long time, so the code there is old, but the principles of collecting, processing and saving forecasts, as well as examples of recurrent network models, are there.
 
Maxim Dmitrievsky:

I don't understand the usefulness of these nets for anything yet :)

I have a friend who has always been keen on them; he learned Keras, took a simple series from work with a seasonal pattern, trained the net for almost a day, and then swore at it.

Judging by articles about LSTM, they can describe time series quite accurately, better than all other models; for example, they can even learn to reproduce a digital audio signal (voice, music), which is an impossible task for a forest or for ordinary neurons.
I think that with a large number of neurons such a network will describe the chart accurately enough, but at the same time there is a risk of overfitting: on new bars the model may turn out to be useless.


SanSanych Fomenko:

Judging by the picture it predicts the previous bar. Or is there something I don't understand?

An LSTM neuron predicts the next value from the previous one, and in this case it leans on that previous value so heavily that you can see it with the naked eye. If you use not a single neuron but a whole network of them, this problem will disappear.
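One way to check the suspicion that the forecast is just the previous bar shifted forward is to score it against a naive "repeat the previous increment" baseline. A sketch continuing the earlier example, where y holds the true next increments and pred the model output:

```python
# If the model and the naive lag-1 forecast score about the same, the network
# has effectively learned to copy its input one step forward.
import numpy as np
from sklearn.metrics import r2_score

naive = np.roll(y, 1)          # previous increment used as the forecast
naive[0] = 0.0                 # no history for the first bar

print("model R2:", r2_score(y, pred))
print("naive R2:", r2_score(y, naive))
print("corr(model, naive):", np.corrcoef(pred, naive)[0, 1])
```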


Aleksey Terentev:
If anyone is interested, I keep the project in a repository on GitHub.
I haven't updated it for a long time, so the code there is old, but the principles of collecting, processing and saving forecasts, as well as examples of recurrent network models, are there.
Thanks, I'll try to run it. Before that I tried to figure out mxnet, but their examples were about working with text rather than time series, so I didn't get very far.
 
Dr. Trader:

Judging by articles about LSTM, they can describe time series quite accurately, better than all other models; for example, they can even learn to reproduce a digital audio signal (voice, music), which is an impossible task for a forest or for ordinary neurons.
I think that with a large number of neurons such a network will describe the chart accurately enough, but at the same time there is a risk of overfitting: on new bars the model may turn out to be useless.


He thought so too, judging by his articles

 

Does anyone have a decent C++ library with linear regression that would allow checking feature importance... or discriminant analysis... or random forests... something that can be ported :) alglib is too simple.

Regression + random forests would be even better.

 
Maxim Dmitrievsky:

alglib is too simple

Simple is bad and complex is bad...

You're one of those few students of mine who is always hindered by something, like the proverbial bad dancer.

Although I have warned many times that all this ML and programming is dabbling that has nothing to do with real trading, and that timid, weak-willed and, on top of that, poor people should not be trading at all (in developed countries this is even enforced at the level of legislation). Then, when the market figures out who is who - and the market never cheats and cannot be cheated - it will be too late: the savings of ten years of hired labour will evaporate, nobody will hire them back, and they will have to beg or collect bottles, to end up homeless in general.

 
Vasily Perepelkin:

Simple is bad and complex is bad...

You're one of those few students of mine who is always hindered by something, like the proverbial bad dancer.

Although I have warned many times that all this ML and programming is dabbling that has nothing to do with real trading, and that timid, weak-willed and, on top of that, poor people should not be trading at all (in developed countries this is even enforced at the level of legislation). Then, when the market figures out who is who - and the market never cheats and cannot be cheated - it will be too late: the savings of ten years of hired labour will evaporate, nobody will hire them back, and they will have to beg or collect bottles, to end up homeless in general.


You're like a clingy girl who has been told "no" five times and still texts you on Facebook and by SMS.

 
Maxim Dmitrievsky:

Does anyone have a decent C++ library with linear regression that would allow checking feature importance... or discriminant analysis... or random forests... something that can be ported :) alglib is too simple.

Regression + random forests would be even better.

https://github.com/Artelnics/OpenNN - an easy-to-learn library, but a lot of modern techniques are missing. Regression is available, but forests are not.
https://github.com/Microsoft/CNTK - a multitool; I haven't studied it yet. Could be used as a DLL.
https://github.com/BVLC/caffe - also quite powerful, another DLL option.
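For the feature-importance part of the request, here is an illustration only (Python/scikit-learn on synthetic data, not the portable C++ being asked for) of getting importance estimates from a linear regression and from a random forest on the same features:

```python
# Illustration: compare linear-regression coefficients with random-forest
# feature importances on synthetic data where only features 0 and 2 matter.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                       # placeholder features
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.1, size=500)

lin = LinearRegression().fit(X, y)
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Coefficients are directly comparable here because the features share one
# scale; with real data, standardize X first.
print("regression coefficients:", np.round(lin.coef_, 3))
print("forest importances     :", np.round(forest.feature_importances_, 3))
```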
 
Vasily Perepelkin:

Simple is bad and complex is bad...

You're one of those few students of mine who is always hindered by something, like the proverbial bad dancer.

Although I have warned many times that all this ML and programming is dabbling that has nothing to do with real trading, and that timid, weak-willed and, on top of that, poor people should not be trading at all (in developed countries this is even enforced at the level of legislation). Then, when the market figures out who is who - and the market never cheats and cannot be cheated - it will be too late: the savings of ten years of hired labour will evaporate, nobody will hire them back, and they will have to beg or collect bottles, to end up homeless in general.


Teacher, stop flooding. Or are you a teacher of spam and flooding?
