Machine learning in trading: theory, models, practice and algo-trading - page 640
In all seriousness, please don't make a fuss, Mikhail. This is an important moment. If this idea doesn't work out, for whatever reason (whether through my lack of skill or sheer ignorance in the face of the new possibilities), then the next one will reach the trading community very, very soon. I'm absolutely sure of it.
Somebody hold me back!!!! And mark this day in your calendar with a red pencil, because today is the day I downloaded R and will start tinkering with it little by little...
Here you go, Sensei, a present from the guys))) h2o.automl.
Rattle is so-so, but this one is all built around automl...
http://playground.tensorflow.org
A visualization of neural-network training; seems to be just for fun, or as a teaching example.
It clearly has trouble with the spiral classification task :)
But an architecture like this already can
This is just like the Poincaré case: if the feature space is disconnected, you need at least two layers. elibrarius already asked about this.
Also, lower the learning rate when the network starts to oscillate.
Last summer I played around with this thing. Very illustrative).
Maxim, what about the feature selection? Aye-aye.
Well, yes, if you feed in the sines, it can be done with a single layer.
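A minimal sketch of what "feeding in the sines" means in the playground discussion: add sin() transforms of the raw coordinates as extra inputs, then train a network with a single hidden layer on the spiral problem. The data generator and all sizes here are my own assumptions for illustration, not anything from the thread.

```python
# Sketch: sine features make the two-spiral problem easy for a
# single-hidden-layer network. Sizes and data are assumed, not from the post.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def two_spirals(n=200, noise=0.05):
    """Generate the classic two-spiral toy problem."""
    t = np.linspace(0.25, 3.0, n) * np.pi
    x = np.concatenate([t * np.cos(t), -t * np.cos(t)])
    y = np.concatenate([t * np.sin(t), -t * np.sin(t)])
    X = np.column_stack([x, y]) + rng.normal(scale=noise, size=(2 * n, 2))
    labels = np.concatenate([np.zeros(n, dtype=int), np.ones(n, dtype=int)])
    return X, labels

X, y = two_spirals()
# Raw coordinates plus their sines, as in the playground feature panel.
X_sin = np.column_stack([X, np.sin(X[:, 0]), np.sin(X[:, 1])])

clf = MLPClassifier(hidden_layer_sizes=(32,), activation="tanh",
                    max_iter=5000, random_state=0)
clf.fit(X_sin, y)
print("train accuracy:", clf.score(X_sin, y))
```

The same network on the raw (x, y) coordinates alone typically needs more layers or units, which is the point being argued above.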
EMVC doesn't do what I wanted; it doesn't do what a cursory reading of its description suggests.
EMVC takes a table with predictors and targets (classes only, no regression) and estimates the probability that each training example really belongs to its specified class. This lets you find the rows in the training table that contradict most of the other training examples (outliers, errors), and remove them so they don't confuse the model during training.
I assumed it would be possible to find a set of predictors that gives the highest probability estimates, but the predictor sets it found were unsatisfactory. I won't experiment with this further; there are better tools for selecting predictors. I also can't see the cross-entropy estimate: the package uses it internally in some way, but it doesn't return it to the user.
But we did get an interesting tool for sifting out training examples rather than predictors.
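This is not the EMVC algorithm itself, but the same idea can be sketched with standard tools: use out-of-fold class probabilities to score how strongly each training row's label agrees with the rest of the data, and drop the rows that contradict it. The model choice and the 0.3 threshold are assumptions for illustration.

```python
# Sketch of "sifting out training examples": flag rows whose own label
# gets a low out-of-fold probability. NOT the EMVC implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict

# Toy data with ~10% deliberately flipped labels to imitate bad examples.
X, y = make_classification(n_samples=500, n_features=10, flip_y=0.1,
                           random_state=0)

# Out-of-fold probability assigned to each row's own labelled class.
proba = cross_val_predict(RandomForestClassifier(random_state=0),
                          X, y, cv=5, method="predict_proba")
own_class_proba = proba[np.arange(len(y)), y]

# Rows the cross-validated model strongly disagrees with are candidate
# outliers / label errors; remove them before the final fit.
keep = own_class_proba >= 0.3
X_clean, y_clean = X[keep], y[keep]
print("dropped", int((~keep).sum()), "suspect rows out of", len(y))
```

The threshold controls how aggressively you prune; on noisy financial data a conservative value is safer, which ties into the objection a few posts below.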
Too bad.
Once again you've confirmed the idea that miracles don't happen and that you have to work everything out from scratch yourself.
You can thus find rows in the training table that contradict most of the other training examples (outliers, errors), and remove them so as not to confuse the model during training.
Does this really need to be done on forex data, where regularities are hard to find in the first place? It seems to me such a program could throw out half of the examples. And outliers can be handled with simpler methods: don't delete them, but, for example, clamp them to the maximum allowed value.
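The "don't delete, clamp" suggestion can be sketched as simple winsorization: instead of removing outlier rows, cap each feature at chosen percentile bounds. The 1st/99th percentile bounds are an assumed choice, not something from the post.

```python
# Sketch: clamp outliers to a maximum allowed value instead of deleting rows.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
X[::100] *= 25.0          # inject some gross outliers

# Per-feature bounds; 1%/99% are an illustrative choice.
lo = np.percentile(X, 1, axis=0)
hi = np.percentile(X, 99, axis=0)
X_clamped = np.clip(X, lo, hi)   # equate outliers to the allowed extremes

print("max before:", X.max(), " max after:", X_clamped.max())
```

This keeps every training row (and the sample size) intact while removing the leverage that extreme values have on the model.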