Discussion of article "Metamodels in machine learning and trading: Original timing of trading orders" - page 10

 
Maxim Dmitrievsky #:
I don't see any forecast here. You can get epilepsy from it, as from some Japanese cartoons, so be careful with that.

It's from a textbook on nonlinear dynamics. What's interesting is that the whole picture is generated by a single recurrence formula. That is, knowing where you are and the direction of movement, you can say with high probability where you will be after some time.
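The textbook's actual formula isn't reproduced in the thread, but the claim that one recurrence formula generates the whole picture is easy to illustrate with the classic Hénon map (a stand-in example, not the formula behind the attached GIF):

```python
# Hénon map: a single two-variable recurrence whose iterates trace
# out an entire strange attractor. a and b are the standard textbook
# parameters; the orbit stays bounded and fills the attractor.
def henon(x, y, a=1.4, b=0.3):
    return 1.0 - a * x * x + y, b * x

x, y = 0.0, 0.0
points = []
for _ in range(10_000):
    x, y = henon(x, y)
    points.append((x, y))

# Knowing the current point fully determines the next one, yet the
# long-run picture looks complex; the orbit stays in a small region.
assert all(abs(px) < 2 and abs(py) < 1 for px, py in points)
```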

Regarding the previous picture: I have no desire to amaze anyone with a super-prediction or a grail; I only pointed out that 10% of the work has already been done.

Files:
w6r5f.gif  82 kb
[Deleted]  
Inquiring #:

This is from a textbook on nonlinear dynamics. What is interesting is that the whole picture is generated by a single recurrence formula. That is, knowing where you are and the direction of movement, you can say with high probability where you will be after some time.

Regarding the previous picture: I have no desire to impress anyone with a super-prediction or a grail; I only pointed out that 10% of the work has already been done.

All that remains is to wish you good luck.
 
Maxim Dmitrievsky #:
I just wish you good luck.

Thank you.

 

Hello!

I've read your articles with great interest; they were fascinating to study. Thank you for this work.

I have never come across exporting a network to an *.mqh library. Is it possible to export a CNN this way?

I currently connect Jupyter to the terminal via data-transfer files, which is not very convenient.

I would like to implement this as well. Please tell me where to look.

Thank you.

[Deleted]  
djgagarin #:

Hello!

I've read your articles with great interest; they were fascinating to study. Thank you for this work.

I have never come across exporting a network to an *.mqh library. Is it possible to export a CNN to an *.mqh library?

I currently connect Jupyter to the terminal via data-transfer files, which is not very convenient.

I would like to implement it as well. Please tell me where to look.

Thank you.

Hello. Of course: you can rewrite the network architecture in MQL5 and then, at each retraining, save its weights to a file or directly into an .mqh file.

Or perhaps a ready-made implementation can be used, like the one below.

I don't work with networks myself, so I can't say how difficult the migration is; it probably depends on the complexity of the network.

Neural networks made easy (Part 3): Convolutional networks
  • www.mql5.com
Continuing the topic of neural networks, let us consider convolutional neural networks. This type of neural network was developed for detecting objects in images. We will look at how it can help us in working with financial markets.
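One possible shape of the "save weights to mqh" idea mentioned above is to dump the trained arrays as MQL5 `double` arrays into a header the EA can `#include`. This is a minimal sketch, not the article's code; the function name, dict layout, and file name are all assumptions:

```python
import numpy as np

def export_weights_to_mqh(weights, path="model_weights.mqh"):
    """Dump named weight arrays as MQL5 'double' arrays into an .mqh header.

    weights: dict mapping an MQL5-legal identifier to an array-like
    (e.g. a layer's weight matrix or bias vector).
    """
    lines = ["// Auto-generated network weights; #include this file from the EA."]
    for name, w in weights.items():
        flat = np.asarray(w, dtype=float).ravel()       # row-major flattening
        values = ", ".join(f"{v:.8f}" for v in flat)
        lines.append(f"double {name}[{flat.size}] = {{{values}}};")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Example with made-up weights for a tiny layer:
export_weights_to_mqh({
    "w1": np.array([[0.1, -0.2], [0.3, 0.4]]),
    "b1": np.array([0.0, 1.0]),
})
```

Regenerating the file at each retraining and recompiling the EA avoids the runtime file-transfer link; the MQL5 side still has to reimplement the forward pass over these arrays itself.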
 

I'm getting a warning:

Warning (from warnings module):
  File "D:\FX\Python\meta_modeling.py", line 26
    pFixed['time'] = pd.to_datetime(pFixed['time'], infer_datetime_format=True)
UserWarning: The argument 'infer_datetime_format' is deprecated and will be removed in a future version. A strict version of it is now the default, see https://pandas.pydata.org/pdeps/0004-consistent-to-datetime-parsing.html. You can safely remove this argument.

It's not clear whether this is critical for correct operation. How do I fix it?

def get_prices() -> pd.DataFrame:
    p = pd.read_csv('EURUSDMT5.csv', delim_whitespace=True)
    pFixed = pd.DataFrame(columns=['time', 'close'])
    pFixed['time'] = p['<DATE>'] + ' ' + p['<TIME>']
    pFixed['time'] = pd.to_datetime(pFixed['time'], infer_datetime_format=True)
    pFixed['close'] = p['<CLOSE>']
    pFixed.set_index('time', inplace=True)
    pFixed.index = pd.to_datetime(pFixed.index, unit='s')
    pFixed = pFixed.dropna()
    pFixedC = pFixed.copy()
[Deleted]  
Aleksey Vyazmikin #:

I'm getting a warning.

It's not clear whether this is critical for correct operation. How do I fix it?

In newer pandas versions, replace it with

format='mixed'
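For reference, the warning itself says the argument can simply be removed; in pandas 2.x strict format inference is the default, and `format='mixed'` (added in pandas 2.0) parses each element individually when the column mixes formats. A minimal sketch, not the article's full script:

```python
import pandas as pd

# Old call (emits the deprecation warning in recent pandas):
#   pd.to_datetime(s, infer_datetime_format=True)

# Sample timestamps in MT5's dotted export style:
s = pd.Series(['2023.01.02 10:00:00', '2023.01.03 11:30:00'])

# New equivalent: either drop the argument entirely, or use
# format='mixed' when the strings are not all in one uniform format.
parsed = pd.to_datetime(s, format='mixed')
assert parsed.dt.year.tolist() == [2023, 2023]
```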
 
Maxim Dmitrievsky #:

In the new pandas, replace with

format='mixed'

Yes, it worked, thank you.

There are some unclear points in the article:

"The selection of predictors and markup of trades is automatic." - I could not find where in the article the automatic method of selecting predictors is described.

"The data is now prepared for training. It is possible to make additional repartition of the main labels ('labels') according to the second labels ('meta_labels'), i.e. delete from the dataset all deals that turned out to be unprofitable."

And how does this deletion happen when the model is applied to new data in the Expert Advisor?

I can't understand why the markup differs at the initial stage. Is it done artificially, and is it possible not to do it?

[Deleted]  
Aleksey Vyazmikin #:

Yes, it worked, thank you.

There are some unclear points in the article:

"Selection of predictors and markup of trades is automatic." - I couldn't find where in the article the automatic method of selecting predictors is described.

"Now the data is prepared for training. You can do additional repartitioning of the main labels ('labels') according to the second labels ('meta_labels'), i.e. remove from the dataset all trades that turned out to be unprofitable."

And how does this deletion happen when the model is applied to new data in the Expert Advisor?

I can't understand why the markup differs at the initial stage. Is it done artificially, and is it possible not to do it?

It probably means that the features and labels are built through functions, so it is automatic.

When applied to new data, the already trained model is used; you don't need to delete anything.
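In other words, the deletion only happens at training time. A rough sketch of the idea (the column names are illustrative assumptions, not the article's exact code):

```python
import pandas as pd

# Toy training set: 'labels' is the trade-direction target,
# 'meta_labels' marks whether the marked-up deal was profitable (1) or not (0).
data = pd.DataFrame({
    'feature':     [0.1, 0.2, 0.3, 0.4],
    'labels':      [1, 0, 1, 0],
    'meta_labels': [1, 0, 1, 1],
})

# Training time only: drop the deals the markup judged unprofitable.
train = data[data['meta_labels'] == 1]
assert len(train) == 3

# At inference there is nothing to delete: the trained meta-model
# predicts meta_label for the new bar, and the EA simply skips the
# trade when that prediction is 0.
```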

The markup differs because a random trade sampler is used: we do not know how to mark up the chart correctly. If we randomly mark up the chart and restart training many times, then, by the law of large numbers....

[Deleted]  
If you want identical markup, make the minimum and maximum deal duration the same: min = max.
If only we knew how to do it right... but we don't.

You can substitute any automatic labeling function; that is the flexibility of the approach.
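A rough sketch of such a random sampler (the deal horizon is drawn between min and max bars, and with min == max the markup becomes deterministic; names and logic are illustrative, not the article's code):

```python
import random

def random_markup(close, min_dur=5, max_dur=15, seed=0):
    """Label each bar 1 (buy) if price is higher after a random horizon, else 0 (sell)."""
    rng = random.Random(seed)
    labels = []
    for i in range(len(close)):
        horizon = rng.randint(min_dur, max_dur)   # min == max -> fixed horizon
        j = min(i + horizon, len(close) - 1)      # clip near the series end
        labels.append(1 if close[j] > close[i] else 0)
    return labels

prices = [1.10 + 0.001 * k for k in range(40)]    # strictly rising toy series
labels = random_markup(prices)
# On a strictly rising series every look-ahead label is "buy" (1),
# except the last bar, where the horizon is clipped to the bar itself.
assert labels[:-1] == [1] * 39 and labels[-1] == 0
```

Rerunning with a different seed gives a different markup of the same chart, which is exactly what makes repeated retraining over many random markups meaningful.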