Machine learning in trading: theory, models, practice and algo-trading - page 1607

 
mytarmailS:

Max! Have you tried using association rules to find patterns, something like the Apriori algorithm? I'm really impressed with this approach.

Well, Bayesian networks... they take a long time to train. And if you don't know what to train them on, it doesn't matter anyway.

Imho, you need clustering (HMM, Gaussian mixtures): divide the market into several clusters and train a separate model for each. Then it will work. I haven't had time for it yet.
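For illustration only, a minimal sketch of that regime-clustering idea (my own sketch, not anyone's code from this thread): it assumes a pandas Series `close` with closing prices and uses rolling volatility as a crude regime feature; all names are made up.

import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

returns = np.log(close).diff().dropna()              # log returns from the assumed `close` series
X = returns.rolling(20).std().dropna().to_frame()    # crude regime feature: 20-bar rolling volatility

gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
regime = pd.Series(gmm.predict(X), index=X.index)    # cluster label per bar

for r in sorted(regime.unique()):
    bars = X[regime == r]
    # ... fit a separate classifier/regressor on the bars of this regime ...
    print(f"regime {r}: {len(bars)} bars")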

 
Maxim Dmitrievsky:

there are separate libs specifically for generating synthetic features, you'll get the same thing with them

MGUA (GMDH) itself is a weak algorithm in that it uses ordinary regression, but it multiplies features out of the box

and what's the English word for this process?

 
mytarmailS:

and what is the English name for this feature-generation process?

it's somewhere in the preprocessing section, for example in Python:

https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.PolynomialFeatures.html

or kernel methods

https://github.com/gmum/pykernels
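For reference, a tiny sketch of what the linked PolynomialFeatures transformer does with toy data (illustration only): it expands the original columns into their powers and pairwise products, which is essentially the "feature multiplication" discussed above.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])                  # two toy samples, two features
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(X)              # columns: x1, x2, x1^2, x1*x2, x2^2
print(X_poly)                               # [[ 1.  2.  1.  2.  4.] [ 3.  4.  9. 12. 16.]]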

 
secret:

Then I see nothing new or original in this methodology.

Everything new is just the well-forgotten old!!!!!
 
Maxim Dmitrievsky:

Well, Bayesian networks... they take a long time to train. And if you don't know what to train them on, it doesn't matter anyway.

Imho, you need clustering (HMM, Gaussian mixtures): divide the market into several clusters and train a separate model for each. Then it will work. I haven't had time for it yet.

Here you are absolutely right, Maximka, not about the specific methods, but about the principle of splitting the market up in the approach. But this requires a team; when the team is big, you can do a tremendous amount of work and research and find methods and approaches that will be unique. You have to be different in the marketplace.... Unique. Don't you think? :-)

 

In the spirit of the day, when the measure of a system's quality is its ability to hold the trend.....


 
Mihail Marchukajtes:

Here you are absolutely right, Maximka, not about the specific methods, but about the principle of splitting the market up in the approach. But this requires a team; when the team is big, you can do a tremendous amount of work and research and find methods and approaches that will be unique. You have to be different in the marketplace.... Unique. Don't you think? :-)

when the team is big, you get tired of doing everything for everybody

 
Maxim Dmitrievsky:

When the team is big, you get tired of doing everything for everyone

So it's not a team anymore..... not our method....
 

For those following the topic: still stubbornly looking down....


 
mytarmailS:

Well, it's hard to ask anything; it all starts with data preprocessing, and that's exactly what you don't want to talk about (

Well... I'm interested in:

1. Does the algorithm work for currencies?

2. Does it build the forecast for a fixed horizon of n candles ahead, or does the network itself decide how many candles?

3. Why does processing a signal take so long, 12-13 s per candle?

4. Why do you want to broadcast the deals publicly?

5. For the forecast, do you use the data as a function of (price, indicator), or something trickier?



And what is the best way to visualize the deals?

Okay, here goes...
First the big picture:
- It starts easily: data is collected by a bot in the tester, we make a csv, each line is one vector;
- A Keras network via TensorFlow; you don't need super-deep knowledge, one book on neural networks + a couple of manuals;
- You can use Google Colab for the calculations, it's fine to start with, but it has its own nuances;
- Then the hell begins: if you have an awesome, unique idea about what data to feed the network, come up with 99 more equally unique ones, because it will be the 101st that works, and even that is not guaranteed;
- The standard outcome: the network does not learn.

Some tips (paid for in blood):
- don't look for sophisticated solutions, everything is simple:
-- I got the first result on a single-layer Sequential (a rough sketch of that setup follows this list),
-- don't try to predict the price, that's utopia; ask the network a simple question, up or down, and if that works, dig further,
-- 100-200 features per vector, you don't need more and you can't get away with less,
-- running all night for 1000 epochs will do nothing, you will see whether it is learning or not after 100,
-- catch the first slightest signs of trainability and dig at that spot,
-- don't help the neural network with crutches like diverters, it won't help, it has to learn on its own,
-- increasing the amount of input data will not help either; the sufficient minimum is 50-60 thousand samples for 100 features.
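To make the recipe above concrete, here is a minimal sketch of that kind of setup (my own illustration, not the author's code): it assumes a csv exported from the tester where each row is one feature vector and the last column, called "target" here purely for illustration, is 1 for "up" and 0 for "down".

import pandas as pd
import tensorflow as tf

data = pd.read_csv("features.csv")                       # assumed file name
X = data.drop(columns=["target"]).values.astype("float32")
y = data["target"].values.astype("float32")              # 1 = up, 0 = down

# single-layer Sequential: effectively logistic regression on the feature vector
model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# ~100 epochs is enough to see whether it learns at all
model.fit(X, y, epochs=100, batch_size=256, validation_split=0.2)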

Now the answers to the questions:
1. Does the algorithm work for currencies?
I got the first results on EURUSD, but then it turned out that Bitcoin trains a little better for short-horizon predictions; I don't know why.

2. Is the forecast for a fixed horizon of n candles, or does the network decide how far ahead?
It is fixed, of course, because we give it a fixed answer during training.

3. Why does processing a signal take so long, 12-13 seconds per candle?
Because the forecast is based on the combined opinion of 20 models, and it takes about 0.5 s to get an answer from each one; I could query them asynchronously, but I don't know how yet.
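A sketch of what those "asynchronous replies" could look like (my assumption, not the author's code): query all 20 models in parallel with a thread pool instead of one after another. It assumes `models` is a list of already-loaded Keras models and `x` is one prepared feature vector of shape (1, n_features).

from concurrent.futures import ThreadPoolExecutor
import numpy as np

def predict_one(model, x):
    # one model's probability of "up" for the current candle
    return float(model.predict(x, verbose=0)[0, 0])

def ensemble_predict(models, x):
    # ask all models at the same time instead of sequentially (20 x 0.5 s)
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        scores = list(pool.map(lambda m: predict_one(m, x), models))
    return float(np.mean(scores))            # combined opinion of the ensemble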

4. Why do you want to broadcast the trades publicly?
I have spent a lot of resources on this; it has to pay off.

5. Is the data for the forecast used as a function of (price, indicator), or something trickier?
Whole candlesticks + indicators + something trickier.