Machine learning in trading: theory, models, practice and algo-trading - page 449

 
Maxim Dmitrievsky:

That's too fast somehow; it should train for maybe an hour or several hours. Which algorithm, L-BFGS? I also used 15 inputs, but only one hidden layer of 15-20 neurons, with an ALGLIB neural network to train... in short, I didn't wait for it to finish and reduced the size of the input vectors.) With 3 inputs and a 10k vector it took me 5-10 minutes to train, and that with one hidden layer. And that is not slow backpropagation but the fast variant, with 1-3 epochs, on an i5 processor.

Now imagine that even at 10 minutes per run you still don't have a ready-made strategy and have to search through N combinations of predictors, vector lengths, numbers of hidden layers, etc. in your optimizer to find one...
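For scale, here is a minimal sketch of the setup described above (one hidden layer of ~20 neurons, L-BFGS training, a 10k-sample input): scikit-learn's MLP is used as a stand-in for the ALGLIB network, the data is synthetic, and timings will of course differ by machine.

```python
# Hedged sketch: small MLP with one hidden layer, trained with L-BFGS,
# timed on 10k synthetic samples. Not the ALGLIB network from the thread.
import time
from sklearn.datasets import make_regression
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=10_000, n_features=3, noise=0.1, random_state=0)

net = MLPRegressor(hidden_layer_sizes=(20,), solver="lbfgs",
                   max_iter=200, random_state=0)
t0 = time.perf_counter()
net.fit(X, y)
elapsed = time.perf_counter() - t0
print(f"R^2 = {net.score(X, y):.3f}, trained in {elapsed:.1f} s")
```

On a modern machine this trains in seconds rather than minutes, which is roughly the gap the posters are arguing about.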

What we know about the algorithm:

Algorithms for feedforward nets. OBJECTIVE: to provide engines for feedforward ANN exploration, testing and rapid prototyping. Some flexibility is provided (e.g. the possibility to change the activation or error functions).

- The algorithms themselves are not described here; there are many books that describe them (e.g. get my "Matrix ANN" wherever you may find it ;-).
- Hypermatrices are slow, but there is no other reasonable way of doing things; tests I performed show that using embedded matrices may increase speed, but manipulating submatrices "by hand" is very tedious and error-prone. Of course you may rewrite the algorithms yourself using embedded matrices if you want to. If you really need speed then go directly to C++ or whatever.

But in general it is the same BP (backpropagation).

Athlon processor, 2 cores. The laptop was bought 8-9 years ago. Modern computers usually run about twice as fast as mine.

As for a finished strategy, the logic alone also takes 2-3 months or more. No big deal.) And I have probably already spent more than that on the NS, figuring out which end to harness the horse from.)

 
Yuriy Asaulenko:

What we know about the algorithm:

But in general it is the same BP (backpropagation).

Athlon processor, 2 cores. The laptop was bought 8-9 years ago. Modern computers usually run about twice as fast as mine.

As for a finished strategy, the logic alone also takes 2-3 months or more. No big deal.) And I have probably already spent more than that on the NS, figuring out which end to harness the horse from.)


If you really need speed then go directly to C++ or whatever. :))) Well, if you like it, then fine :) And forests are much easier to set up, by the way; there's only one parameter, the number of trees :)
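To illustrate the "one parameter" point, here is a hedged sketch using scikit-learn (not the ALGLIB forest from the thread): only the number of trees is varied, everything else is left at defaults, and the data is a synthetic stand-in.

```python
# Random forest where the only knob turned is the number of trees.
# Synthetic classification data; not the thread's actual dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2_000, n_features=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for n_trees in (10, 100, 300):
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    rf.fit(X_tr, y_tr)
    print(n_trees, "trees -> test accuracy", round(rf.score(X_te, y_te), 3))
```

Accuracy typically plateaus once there are enough trees, which is why the forest is so forgiving to tune compared with a network's layer sizes, learning rates and epochs.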
 
Maxim Dmitrievsky:

If you really need speed then go directly to C++ or whatever. :)) Well, if you like it, then fine :) And forests are much easier to set up, by the way; there's only one parameter, the number of trees :)
If you really need speed then go directly to C++ or whatever. :)) What is meant here is calling the algorithm directly from C++, not from an interpreted environment such as R. The algorithm itself is implemented in C++ anyway.) And you saw for yourself that the speed is fine.
 
Yuriy Asaulenko:
If you really need speed then go directly to C++ or whatever. :)) What is meant here is calling the algorithm directly from C++, not from an interpreted environment such as R. The algorithm itself is implemented in C++ anyway.)

You can offload the calculations to a video card; that should be about 5 times faster, assuming it's not the one integrated into your CPU :)
 
Maxim Dmitrievsky:

You can offload the calculations to a video card; that should be about 5 times faster, assuming it's not the one integrated into your CPU :)

Of course, but why, when it's ~0.0003 s/sample? That is enough for any trading.

As for RF, I know the theory, but I'm not familiar with any package in practice. You are quick to switch tools and master them; a standing ovation.)) I suppose I need to master it too.

 
Yuriy Asaulenko:

Of course, but why, when it's ~0.0003 s/sample? That is enough for any trading.

As for RF, I know the theory, but I'm not familiar with any package in practice. You are quick to switch tools and master them; a standing ovation.)) I suppose I need to master it too.

In ALGLIB it really is a two-line code change from MLP to RF :) In about a week I copied all my basic ML models (except complex ones like RNN and LSTM) into AzureStudio, compared the results, and realized that RF is better; plus that's what people write...
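The "two-line swap" claim is about ALGLIB; as a hedged analogy in scikit-learn (not ALGLIB), the shared estimator interface makes the swap literally one line, since both models expose the same `fit`/`score` calls.

```python
# Analogy for the MLP -> RF swap: identical training code, only the
# constructor line changes. Synthetic data, not the thread's dataset.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=500, n_features=3, noise=0.1, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(20,), solver="lbfgs",
                     max_iter=500, random_state=0)
# model = RandomForestRegressor(n_estimators=100, random_state=0)  # the swap

model.fit(X, y)
print("train R^2:", round(model.score(X, y), 3))
```

Everything downstream of the constructor is untouched, which is the point being made about ALGLIB's API as well.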
 

I think it was you who posted a big table in this thread with ticks of different currencies and suggested predicting them from one another?

Please post that table again; I want to use it to test a technique for selecting predictors.

 
Dr. Trader:

I think it was you who posted a big table in this thread with ticks of different currencies and suggested predicting them from one another?

Please post that table again; I want to use it to test a technique for selecting predictors.

Files:
data.zip  3772 kb
 
I will:

Thank you. I've started the analysis; with such a large number of rows the result will take a couple of days. In the meantime I'll try to train the model on logloss, by analogy with numerai, and then check it on a test table.
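"Training on logloss by analogy with numerai" means optimizing and scoring by log loss on predicted probabilities rather than by accuracy. A minimal hedged sketch (synthetic data, a plain logistic model as a stand-in for whatever is actually used on the posted table):

```python
# Fit a probabilistic classifier and evaluate it with log loss,
# the metric numerai-style tournaments score on. Synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)          # log loss needs probabilities
print("test log loss:", round(log_loss(y_te, proba), 3))
```

A useless model that always predicts 0.5 scores about 0.693 (ln 2), so anything clearly below that indicates real signal.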

 
Dr. Trader:

Thank you. I've started the analysis; with such a large number of rows the result will take a couple of days. In the meantime I'll try to train the model on logloss, by analogy with numerai, and then check it on a test table.

Hmm... are you using the late Yury Reshetov's software? XGB grinds that set to 65-67% accuracy in a minute. When ML runs for more than an hour, I assume something is being done wrong; that is why I cooled on neural networks long ago.
