Discussion of article "Metamodels in machine learning and trading: Original timing of trading orders" - page 10
I don't see any prognosis here. You could get epilepsy from it, like in some Japanese cartoons, so be careful with that.
It's from a textbook on nonlinear dynamics. What's interesting is that the whole picture is generated by a single recurrence formula. That is, knowing where you are and your direction of movement, you can say with high probability where you will be after some time.
Regarding the previous picture: I have no desire to impress anyone with a super-prediction or a grail; I only pointed out that about 10% of the work has already been done.
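The picture itself is not shown here; as an illustration of what "one recurrence formula" can generate (my own example, not necessarily the formula from that textbook), the classic logistic map already produces very rich dynamics:

# Illustration only: the logistic map x_{n+1} = r * x_n * (1 - x_n),
# a one-line recurrence whose iterates behave chaotically for r close to 4.
def logistic_map(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

trajectory = logistic_map(0.2)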
I just wish you good luck.
Thank you.
Hello!
I'm interested in your articles; they were interesting to study. Thank you for such work.
I have never come across a network being translated into an *.mqh library. Is it possible to translate a CNN network this way?
I currently connect Jupyter to the terminal via data-transfer files, which is not very convenient.
I would like to implement it as well. Please tell me where to look.
Thank you.
Hello. Of course, you can rewrite the network architecture and then, at each retraining, save its weights to a file or straight into an .mqh.
Maybe a ready-made solution could also be used.
I don't work with networks myself, so I can't say how complex the migration would be; it probably depends on the complexity of the network.
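As a rough illustration of the weight-export idea mentioned above (a sketch under my own assumptions, not the article's code; function name, layer naming and output path are invented), the trained weights could be dumped into an .mqh include as plain MQL5 arrays:

# Sketch only: write trained layer weights into an .mqh include file as MQL5 arrays.
import numpy as np

def export_weights_to_mqh(layers, path='model_weights.mqh'):
    # layers: list of numpy arrays, one per layer
    with open(path, 'w') as f:
        for i, w in enumerate(layers):
            values = ', '.join(f'{v:.8f}' for v in w.ravel())
            f.write(f'double layer_{i}_weights[] = {{{values}}};\n')

# example with two dummy layers
export_weights_to_mqh([np.random.randn(8, 4), np.random.randn(4)])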
I'm getting a warning with this code:

import pandas as pd

def get_prices() -> pd.DataFrame:
    p = pd.read_csv('EURUSDMT5.csv', delim_whitespace=True)
    pFixed = pd.DataFrame(columns=['time', 'close'])
    pFixed['time'] = p['<DATE>'] + ' ' + p['<TIME>']
    pFixed['time'] = pd.to_datetime(pFixed['time'], infer_datetime_format=True)
    pFixed['close'] = p['<CLOSE>']
    pFixed.set_index('time', inplace=True)
    pFixed.index = pd.to_datetime(pFixed.index, unit='s')
    pFixed = pFixed.dropna()
    pFixedC = pFixed.copy()

It is not clear whether this is critical for correct operation, and how do I fix it?
In the new pandas, replace it with the updated syntax.
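If the warnings are the usual pandas 2.x deprecations of delim_whitespace and infer_datetime_format (an assumption on my part, not confirmed by the reply above), the equivalent calls would look roughly like this:

# Hedged guess at the fix: in pandas 2.x, delim_whitespace=True is deprecated in
# favour of sep='\s+', and infer_datetime_format is deprecated because strict
# format inference is now the default behaviour of pd.to_datetime.
import pandas as pd

p = pd.read_csv('EURUSDMT5.csv', sep=r'\s+')              # was delim_whitespace=True
times = pd.to_datetime(p['<DATE>'] + ' ' + p['<TIME>'])   # infer_datetime_format no longer needed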
Yes, it worked, thank you.
There are some unclear points in the article:
"The selection of predictors and markup of trades is automatic." - I could not find where in the article the automatic method of selecting predictors is described.
"Now the data is prepared for training. You can do additional repartitioning of the main labels ('labels') according to the second labels ('meta_labels'), i.e. remove from the dataset all deals that turned out to be unprofitable."
And how does this deletion happen when the model is applied to new data in the Expert Advisor?
I can't understand why the markup is different at the initial stage. Is it just done artificially, and is it possible not to do it?
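For reference, the repartitioning described in the quoted passage can be pictured roughly like this (a sketch with assumed column names 'labels' and 'meta_labels'; not the article's exact code):

# Sketch only: drop the deals that the meta labels mark as unprofitable.
import pandas as pd

dataset = pd.DataFrame({'labels':      [1, 0, 1, 0],
                        'meta_labels': [1, 1, 0, 1]})
dataset = dataset[dataset['meta_labels'] == 1]  # keep only the profitable deals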