Discussing the article: "Mastering ONNX: The Game-Changer for MQL5 Traders"

 

Check out the new article: Mastering ONNX: The Game-Changer for MQL5 Traders.

Dive into the world of ONNX, the powerful open-standard format for exchanging machine learning models. Discover how leveraging ONNX can revolutionize algorithmic trading in MQL5, allowing traders to seamlessly integrate cutting-edge AI models and elevate their strategies to new heights. Uncover the secrets to cross-platform compatibility and learn how to unlock the full potential of ONNX in your MQL5 trading endeavors. Elevate your trading game with this comprehensive guide to Mastering ONNX.

It is undeniable that we are in the age of AI and machine learning: every single day, a new AI-based technology is deployed in finance, art, gaming, education, and many other aspects of life.

To us traders, learning to harness the power of Artificial Intelligence could give us an edge over the market, letting us detect patterns and relationships that we couldn't see with the human eye.

As cool and magical as AI may seem, behind the models lie complex mathematical operations that require a huge amount of work and a high degree of accuracy and focus to figure out and implement correctly, if you were to build these machine learning models from scratch. Thanks to open source, you don't have to.

Nowadays, you don't even need to be a math or programming genius to build and deploy AI models. You need a basic understanding of the programming language or tools you want to use for your project, and a PC. In some cases, you don't even have to own a PC: thanks to services like Google Colab, you can code, build, and run AI models for free using Python.

As easy as it is to implement machine learning models in Python and other popular, mature programming languages, it is, to be honest, not that easy in MQL5. Unless you want to reinvent the wheel by creating machine learning models in MQL5 from scratch, something we do in this article series, I would strongly advise using ONNX to integrate AI models built in Python into MQL5. ONNX is now supported in MQL5; I'm excited about it, and I believe you should be too.

Author: Omega J Msigwa

 
MetaQuotes:

Check out the new article: Mastering ONNX: The Game-Changer for MQL5 Traders.

Author: Omega J Msigwa

You're absolutely right, this is a game changer. I had almost abandoned the idea of applying modern machine learning to the financial markets, because before you taught me about ONNX, the only way forward was going to be rewriting all these algorithms from scratch with no margin for error whatsoever, which is a suicide mission even from an optimist's point of view. But this is cause for celebration!

 
Gamuchirai Zororo Ndawana #:

You're absolutely right, this is a game changer. I had almost abandoned the idea of applying modern machine learning to the financial markets, because before you taught me about ONNX, the only way forward was going to be rewriting all these algorithms from scratch with no margin for error whatsoever, which is a suicide mission even from an optimist's point of view. But this is cause for celebration!

I appreciate it

 
Hello, I am trying to build my first neural network. I would like to use ONNX for my neural network; I am also using Python and TensorFlow.
 
Sarah Vera #:
Hello, I am trying to build my first neural network. I would like to use ONNX for my neural network; I am also using Python and TensorFlow.

Yes, you can do it.

 

Omega J Msigwa, I appreciate your detailed article.

This article calculates a set of normalization parameters from all the historical data, then normalizes the data with them for training, and applies the same normalization parameters to the live data, which is very logical, because that is how the model was trained. There are a few doubts which I hope you can clarify, please:

  1. If the live (future) data experiences a higher max or a lower min, would we need to retrain the model?
  2. This normalization method differs from the one mentioned in a few other articles (12433, 12484, etc.), where the normalization calculation was applied to every set of samples before training and live prediction and, if needed, de-normalized after live prediction. What is your view on those approaches compared to this article's, please?
Many thanks for your time and effort. Well done.
 
68360626 #:

Omega J Msigwa, I appreciate your detailed article.

This article calculates a set of normalization parameters from all the historical data, then normalizes the data with them for training, and applies the same normalization parameters to the live data, which is very logical, because that is how the model was trained. There are a few doubts which I hope you can clarify, please:

  1. If the live (future) data experiences a higher max or a lower min, would we need to retrain the model?
  2. This normalization method differs from the one mentioned in a few other articles (12433, 12484, etc.), where the normalization calculation was applied to every set of samples before training and live prediction and, if needed, de-normalized after live prediction. What is your view on those approaches compared to this article's, please?
Many thanks for your time and effort. Well done.

01: Good question. If the data experiences a new higher max or lower min, you might need to retrain the model to keep it relevant.

02: I would say that, according to ML theory, the normalization calculation applied in this article is the correct one, and it makes sense. Normalizing with different parameters for every sample isn't acceptable (not my words).
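The scheme discussed here can be sketched in a few lines of plain Python (a minimal illustration with made-up numbers, not the article's actual code): the min/max parameters are fitted once on the training history and the very same parameters are reused on live data.

```python
# Minimal sketch: fit min-max normalization parameters on the training
# history once, then apply the SAME parameters to live data.

def fit_minmax(history):
    """Compute per-feature min and max from the training data only."""
    cols = list(zip(*history))
    return [min(c) for c in cols], [max(c) for c in cols]

def transform(rows, mins, maxs):
    """Scale rows into [0, 1] using the previously fitted parameters."""
    return [
        [(v - lo) / (hi - lo) if hi != lo else 0.0
         for v, lo, hi in zip(row, mins, maxs)]
        for row in rows
    ]

train = [[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]]
mins, maxs = fit_minmax(train)       # fitted once, on history only
live = [[2.5, 25.0]]
print(transform(live, mins, maxs))   # -> [[0.75, 0.75]]
```

Note that a live value beyond the fitted range scales outside [0, 1] (e.g. 4.0 maps to 1.5 here), which is exactly why question 1 above matters: a new higher max or lower min may warrant retraining.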

 
Is it possible to retrain the model from inside your expert advisor? So that it can optimize itself as it walks forward? 
 
Gamuchirai Zororo Ndawana #:
Is it possible to retrain the model from inside your expert advisor? So that it can optimize itself as it walks forward? 


Training is done on the Python side of things, and that is where the trained model is saved, so the answer is no.

 

Hi, very very good article

I wonder if it is possible to export the data for a certain period, like from 2018 to 2020.

thank you! 

 
Emanuele Mastronardi #:

Hi, very very good article

I wonder if it is possible to export the data for a certain period, like from 2018 to 2020.

thank you! 

In CopyRates and CopyBuffer, set the time range from 2018 to 2020.
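For example, the CopyRates variant that takes a start and stop time can restrict the copied bars to that window (a minimal MQL5 sketch; the symbol and timeframe are placeholders):

```mql5
// Sketch: copy only the bars between 2018 and 2020 for export.
datetime from = D'2018.01.01 00:00';
datetime to   = D'2020.12.31 23:59';

MqlRates rates[];
int copied = CopyRates("EURUSD", PERIOD_D1, from, to, rates);
if(copied <= 0)
   Print("CopyRates failed, error ", GetLastError());
```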