Machine learning in trading: theory, models, practice and algo-trading - page 2292

Add a second output to the net to calculate the lot size. Or use the net's confidence as a lot multiplier.
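The second idea, scaling the lot by the net's confidence, can be sketched like this, assuming the net outputs a class probability; the function and parameter names are illustrative, not from any real trading API:

```python
# Sketch: scale the lot by the net's confidence in its signal.
# Assumes the net outputs a probability p_buy in [0, 1] for the "buy" class;
# base_lot and min_lot are illustrative parameters.

def lot_from_confidence(p_buy, base_lot=0.1, min_lot=0.01):
    """Map class probability to a lot size: minimal near p=0.5, base_lot near certainty."""
    confidence = abs(2.0 * p_buy - 1.0)   # 0 when undecided, 1 when certain
    return max(min_lot, round(base_lot * confidence, 2))

print(lot_from_confidence(0.95))  # confident signal -> larger lot
print(lot_from_confidence(0.55))  # weak signal -> minimum lot
```

A second output head trained on, say, realized profit would be the other option mentioned; this sketch only covers the multiplier variant.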
averaging and grids are not always about lot size
there is nothing but noise in the increments
How do you find 24-period cycles in 1-period increments?
Easy. Taking increments is differentiation. The amplitude-to-frequency ratio changes, but the cycles don't go anywhere. It even gets better: slow cycles no longer swamp the fast ones, so you don't need to filter them out with moving averages.
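The claim is easy to check numerically: first differencing multiplies the amplitude of a cycle of frequency f by 2·sin(πf), so a 24-period cycle is attenuated but not removed from 1-period increments. A minimal NumPy sketch with an illustrative window length:

```python
import numpy as np

# A pure 24-period cycle, 480-sample window (an exact multiple of the period).
n = 480
t = np.arange(n + 1)
price = np.sin(2 * np.pi * t / 24)

increments = np.diff(price)                # 1-period increments (differentiation)
spectrum = np.abs(np.fft.rfft(increments))

k = spectrum[1:].argmax() + 1              # dominant non-DC frequency bin
print(n / k)                               # recovered period -> 24.0
```

The peak stays at the same frequency bin after differencing; only its amplitude changes, which is the point being made above.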
there is nothing but noise in the increments
how do you find 24-period cycles in 1-period increments?
window 48
Colleagues, tell me from experience.
I've been wondering whether it makes sense to monitor the weights of the input layer (the inputs are normalized) during training. Does that give a realistic assessment of the significance of the inputs?
For my experiments I use the library from Dmitriy Gizlyk.
I know that by exporting the data to R or Python I can compute all sorts of importance metrics, but I haven't got around to that yet, and it's convenient that his solution practically "flies" on the video card.
In general, does it make sense to monitor the input weights for simplicity, or should I do a detailed analysis of the inputs first in any case?
For a preliminary analysis of the inputs you can use random forests
You can estimate the influence of the features through the weights
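The weight-based idea can be sketched crudely: average the absolute first-layer weights per input and rank. This is purely illustrative, not the library's API, and it only makes sense when the inputs are normalized (as they are here), since weights on differently scaled inputs are not comparable:

```python
import numpy as np

# Toy first-layer weight matrix: 4 hidden neurons x 3 inputs.
# Larger absolute weights on an input column -> crudely, more influence.
W = np.array([[ 0.9, 0.1, -0.8],
              [-1.1, 0.0,  0.7],
              [ 0.8, 0.2, -0.9],
              [-1.0, 0.1,  0.6]])

importance = np.abs(W).mean(axis=0)        # mean |weight| per input column
ranking = importance.argsort()[::-1]       # inputs from most to least influential
print(ranking)                             # -> [0 2 1]
```

This ignores everything past the first layer, so treat it as a rough screen rather than a real importance measure.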
I see: the larger the weights, the greater the influence. Is it possible to get more information?
For example, to understand that the problem has no best solution, or is not convex (if I'm not confusing the term). Maybe the weights will diverge to infinity, or, with the same network error, the weights may come out differently between training runs (i.e. very small on a particular input in one run, and very large in another run started from scratch), etc.
Practically, for the time being I'm struggling with a problem where two classes are asymmetrically distributed (one is over 60%) and the nets "collapse" in 100% of cases, outputting a single class.
I filter the input data in different ways and pick up new data. The question may help in filtering out "bad" inputs: which inputs should be discarded entirely, or filtered differently.
For a preliminary analysis of the inputs you can use random forests
Yes, I know, I did it in R, but it's tedious and slow to go back and forth. Does ALGLIB on MT5 allow it without any problems?
But I thought that perhaps the training itself would tell me whether it's worth looking at the input weights.
There are many things you can do; I can't tell you offhand, there are dedicated packages for this.
The classes should be balanced for a NN. Add examples of the missing class.
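"Adding examples of the missing class" can be sketched as random oversampling with replacement; the data and names here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced dataset: 70 samples of class 0, 30 of class 1.
X = np.arange(100).reshape(-1, 1).astype(float)
y = np.array([0] * 70 + [1] * 30)

# Oversample the minority class with replacement up to the majority count.
minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=70 - 30, replace=True)
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])

print(np.bincount(y_bal))                  # -> [70 70]
```

Duplicating minority samples is the simplest option; undersampling the majority class or synthetic methods like SMOTE are the usual alternatives.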
I'd rather learn Python.
I have learned to use Python:
What if I'm solving a problem where two classes are asymmetrically distributed (one is over 60%) and my nets "collapse" in 100% of cases, outputting one class?
Balance the classes, or rework the metric so that it gives more weight to the rare class.
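The second option, a metric that rewards the rare class, can be sketched as balanced accuracy (the mean of per-class recalls), which exposes a net that has collapsed to the majority class even when plain accuracy looks fine. Toy data, illustrative names:

```python
import numpy as np

# Toy imbalance: 70% class 0, 30% class 1, and a degenerate model
# that always predicts the majority class (the "collapsed" net).
y_true = np.array([0] * 70 + [1] * 30)
y_pred = np.zeros_like(y_true)

plain_accuracy = (y_true == y_pred).mean()                  # 0.70, looks fine
recalls = [(y_pred[y_true == c] == c).mean() for c in (0, 1)]
balanced_accuracy = float(np.mean(recalls))                 # 0.50, i.e. no skill

print(plain_accuracy, balanced_accuracy)
```

Training against a class-weighted loss has the same effect during optimization; balanced accuracy is the evaluation-side counterpart.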
Easy. Taking increments is differentiation. The amplitude-to-frequency ratio changes, but the cycles don't go anywhere. It even gets better: slow cycles no longer swamp the fast ones, so you don't need to filter them out with moving averages.
Strange, then, that you haven't found them, if it's so easy.
Until you experiment with sine waves, you won't understand.