Machine learning in trading: theory, models, practice and algo-trading - page 1049

Wanting to breathe some life into this thread and help you fill your pockets with a neural-network trading signal, I present:
An algorithm for preparing input data for the Grail
1. In an Erlang flow of order 300 or higher built from tick quotes (an analogue of M5 OPEN/CLOSE), the increments follow a stable Laplace distribution.
2. The sum of the absolute values (moduli) of such increments gives a chi-squared-like distribution; in the limit, a normal distribution.
3. Thus the sum of the moduli for such a flow, taken in a sliding window of, say, 1440 values = one week (a size that can be justified via the Chebyshev inequality), forms a nearly normal distribution with a known quantile function and expectation (a minimal sketch follows this list).
4. Surely unthinkable amounts of money can be extracted from such a process.
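A minimal sketch of steps 1-3, assuming plain pandas and a series of tick prices; thinning every 300th tick stands in for the order-300 Erlang flow, and the 1440-value window and the 2-sigma outlier threshold are taken from the post or picked for illustration, not tested settings:

```python
import pandas as pd

def grail_inputs(ticks: pd.Series, order: int = 300, window: int = 1440) -> pd.DataFrame:
    """Sliding-window sums of absolute increments of an order-`order` Erlang flow."""
    # 1. Erlang flow of order `order`: keep every `order`-th tick (analogue of M5 OPEN/CLOSE).
    flow = ticks.iloc[::order]
    # Increments of the thinned flow (claimed above to be roughly Laplace-distributed).
    increments = flow.diff().dropna()
    # 2-3. Sum of the absolute increments over a sliding window of `window` values.
    abs_sum = increments.abs().rolling(window).sum().dropna()
    # The post claims these sums are nearly normal, so standardising them gives a
    # known quantile function for flagging outliers.
    z = (abs_sum - abs_sum.mean()) / abs_sum.std()
    return pd.DataFrame({"abs_sum": abs_sum, "zscore": z})

# Hypothetical usage: feed the resulting columns to the neural network as inputs.
# ticks = pd.read_csv("EURUSD_ticks.csv")["bid"]
# features = grail_inputs(ticks)
# outliers = features[features["zscore"].abs() > 2.0]
```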
So why don't I use this algorithm myself to work out the angles, outliers and other such nonsense?
Because it is a VERY long wait for a single trade. The window is a whole week! No, I don't have the patience for that.
But a neural network ought to knock out the Grail in no time on such inputs.
Good luck to all of you!
Ahh, so much effort, and all for nothing. I wrote about ticks and the strategy tester, and no... "The Grail is right here, I'll find it myself." Look:
1. Tick quotes may not contain all the information (tick filtering differs between data feeds), and they may also contain extra information irrelevant to the analysed process (brokers' smoothing filters, order-execution algorithms).
2, 3, 4. The strategy tester, and the strategy tester again.
And after performing steps 1-4 there will be no "Grail"; there will only be a mathematical model of the process under study. To "get to the money" you will still need to build a strategy on top of it.
Well, drop the code to me in a personal message and I'll see what I can do. Is this for personal use, or with free access for all?
emailed you
Trying different models (predictors), e.g. building many models on differently transformed input data and picking the best one, the way passwords are brute-forced for accounts, when there is no a priori knowledge of the subject area or of its patterns.
By hand, manually.
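A minimal sketch of that brute-force search, assuming scikit-learn, a feature matrix X and labels y; the transformations and models listed are arbitrary placeholders, not the poster's actual setup:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler, QuantileTransformer
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

def pick_best(X, y):
    """Try every (transform, model) pair and keep the one with the best CV score."""
    transforms = {
        "raw": None,
        "zscore": StandardScaler(),
        "quantile": QuantileTransformer(output_distribution="normal"),
    }
    models = {
        "logreg": LogisticRegression(max_iter=1000),
        "forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    best_name, best_score = None, -np.inf
    for t_name, transform in transforms.items():
        for m_name, model in models.items():
            steps = [s for s in (transform, model) if s is not None]
            score = cross_val_score(make_pipeline(*steps), X, y, cv=5).mean()
            if score > best_score:
                best_name, best_score = f"{t_name}+{m_name}", score
    return best_name, best_score
```

For price series a TimeSeriesSplit instead of plain k-fold would be the safer external check, otherwise the "best" model is simply the one best at exploiting look-ahead.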
Vapnik's video in English was about this.
Maxim, if you want, you can read Ivakhnenko's works; that is what you are talking about, only in a structured, optimised, best possible form.
https://dic.academic.ru/dic.nsf/ruwiki/1034678
I even know a man (not personally) who built a very good robot on these principles
----------------------------------
This is what this guy's robot does; it's essentially a kernel machine.
Thanks, I'll read it. I first built a system on correlated instruments. That is, the predictors are similar instruments, for example the dollar index for EURUSD, and the system tried to find patterns between them. The best result so far: an OOS segment about 100% of the training-sample length, with model errors about the same, after which the system gradually (not abruptly) starts to break down.
Different transformations give, at best, an error reduction of 0.1 on OOS. Obviously not only the inputs but also the outputs need to be changed, but that is already resource-intensive.
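A minimal sketch of that kind of setup, assuming aligned closes for a hypothetical dollar-index series and EURUSD, with the OOS segment made the same length as the training segment so the "~100% of the training length" behaviour can be watched:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

def correlated_instrument_test(dxy: pd.Series, eurusd: pd.Series, lags: int = 5):
    """Train on the first half, test out-of-sample on a second half of equal length."""
    rets = pd.DataFrame({"dxy": dxy, "eur": eurusd}).dropna().pct_change()
    # Predictors: lagged returns of the correlated instrument (the dollar index).
    X = pd.concat({f"dxy_lag{i}": rets["dxy"].shift(i) for i in range(1, lags + 1)}, axis=1)
    # Target: direction of the next EURUSD return.
    y = (rets["eur"].shift(-1) > 0).astype(int).rename("y")
    data = pd.concat([X, y], axis=1).dropna()
    half = len(data) // 2
    train, test = data.iloc[:half], data.iloc[half:]
    model = LogisticRegression(max_iter=1000).fit(train.iloc[:, :-1], train["y"])
    in_sample = accuracy_score(train["y"], model.predict(train.iloc[:, :-1]))
    oos = accuracy_score(test["y"], model.predict(test.iloc[:, :-1]))
    return in_sample, oos  # compare the two to see how far the OOS result holds up
```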
I was doing the same kind of thing, taking the DAX (Europe) and the S&P 500 (US) as predictors and trying to predict the euro-dollar, but with hidden Markov models (HMM) rather than neural networks, and it didn't work ))
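A minimal sketch of that HMM approach, assuming the hmmlearn package and aligned NumPy arrays of daily returns for the three instruments; the number of hidden states and the naive "mean next EURUSD return per state" rule are illustrative assumptions, not the poster's actual model:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def hmm_regimes(dax_ret, spx_ret, eur_ret, n_states: int = 3):
    """Fit an HMM on DAX / S&P 500 returns and summarise EURUSD behaviour per hidden state."""
    X = np.column_stack([dax_ret, spx_ret])  # observations: the two predictor instruments
    model = GaussianHMM(n_components=n_states, covariance_type="full",
                        n_iter=200, random_state=0)
    model.fit(X)
    states = model.predict(X)                # most likely hidden regime at each step
    # Naive forecast rule: mean of the *next* EURUSD return within each regime.
    forecast = {s: float(np.mean(eur_ret[1:][states[:-1] == s])) for s in range(n_states)}
    return states, forecast
```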
I have a feeling that something is wrong with us, that something fundamental is off in our view of how to build a forecasting system, and that is why we hit the wall.
it's essentially a kernel machine
What is a kernel machine? I don't know (
Well, it builds various polynomials from the input data; Reshetov uses the same thing in his predictor too.
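For what it's worth, a minimal sketch of what "building polynomials from the input data" looks like as a kernel machine, here a polynomial-kernel SVM from scikit-learn; this only illustrates the general idea and is not Reshetov's actual predictor:

```python
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# A polynomial-kernel SVM is a kernel machine whose decision function is an
# implicit polynomial of the input features (degree 3 here).
kernel_machine = make_pipeline(
    StandardScaler(),
    SVC(kernel="poly", degree=3, C=1.0, coef0=1.0),
)
# kernel_machine.fit(X_train, y_train)         # X: predictor matrix, y: class labels
# predictions = kernel_machine.predict(X_oos)  # evaluate out of sample
```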
Let me remind you that Aleshenka and Koldun (who seem to be the only ones here with any success in neural-network trading) spend a lot of time preparing their input data.
Honestly, I don't know what exactly they do there, and with my posts I deliberately try to provoke them into giving feedback :))) Alas, they keep it secret...
And Reshetov? Well, yes, he did say once that he is familiar with MGUA (GMDH).
The very idea of enumerating predictors by building models, and then models of increasing complexity, seems very right to me.
But maybe my mistake is that I should be searching not for predictors but for the trading system's decisions in my environment, or for something else entirely...
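A minimal sketch of that "models of increasing complexity" idea in the GMDH spirit, assuming scikit-learn and a held-out validation set as the external criterion; the polynomial feature growth and the stopping rule are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.metrics import mean_squared_error

def growing_complexity_search(X_train, y_train, X_val, y_val, max_degree: int = 5):
    """Grow polynomial model complexity until the external (validation) error stops improving."""
    best_deg, best_err, best_model = None, np.inf, None
    for degree in range(1, max_degree + 1):
        poly = PolynomialFeatures(degree=degree, include_bias=False)
        model = LinearRegression().fit(poly.fit_transform(X_train), y_train)
        err = mean_squared_error(y_val, model.predict(poly.transform(X_val)))
        if err < best_err:
            best_deg, best_err, best_model = degree, err, (poly, model)
        else:
            break  # external criterion got worse: stop growing complexity, as in GMDH
    return best_deg, best_err, best_model
```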