Discussion of article "Neural Networks Made Easy"

 

New article Neural Networks Made Easy has been published:

Artificial intelligence is often associated with something fantastically complex and incomprehensible. At the same time, artificial intelligence is increasingly mentioned in everyday life, and news about achievements involving neural networks appears regularly in various media. The purpose of this article is to show that anyone can easily create a neural network and use these AI achievements in trading.

The following neural network definition is provided in Wikipedia:

Artificial neural networks (ANN) are computing systems vaguely inspired by the biological neural networks that constitute animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain.

That is, a neural network is an entity made up of artificial neurons linked by organized connections, and these connections resemble those of a biological brain.

The figure below shows a simple neural network diagram. Circles indicate neurons and lines visualize the connections between them. The neurons are arranged in layers, which are divided into three groups. Blue indicates the input neurons, which receive the source information; green indicates the output neurons, which return the result of the network's operation; between them, the gray neurons form a hidden layer.

Simple neural network example

Author: Dmitriy Gizlyk

 
Awesome! I was waiting for this a lot. I have written something similar myself, but yours is better quality! I would be happy to have this as part of the MQL5 language, with official support from the MetaQuotes team. We need a good AI core as part of this platform - I have created so many workarounds to run with TensorFlow and to feed data in and out... it would be so much better if all of it could be done from MQL5!
Thank you!
 
Karlis Balcers:
Awesome! I was waiting for this a lot. I have written something similar myself, but yours is better quality! I would be happy to have this as part of the MQL5 language, with official support from the MetaQuotes team. We need a good AI core as part of this platform - I have created so many workarounds to run with TensorFlow and to feed data in and out... it would be so much better if all of it could be done from MQL5!
Thank you!

In my opinion, it is more realistic to ask the MetaQuotes team to implement an easy interface between Keras and MQL5. However, I guess the current Python interface in MQL5 could already be used to some extent to run Keras-based models (at least for predictions).

 

Mr. Dmitriy,

Very interesting article, I really appreciate it - congratulations! But could you please tell us how we can use these classes in a basic model inside OnInit(), OnDeinit(), OnTick() and OnTimer()? What is the right way to call your BPB NeuralNet functions inside EA code? Let's take the 'XOR problem', or even a basic structure with the last ten tick values placed on the input, so I can understand how it works: how to define the number of hidden layers, where to insert the inputs, and how to retrieve the output?

I've been integrating an external BPB model via a Python socket, but I think it is reasonable to believe your code could be very effective directly inside my EA.

Thanks in advance!

 
Phillip Kruger:

Mr. Dmitriy,

Very interesting article, I really appreciate it - congratulations! But could you please tell us how we can use these classes in a basic model inside OnInit(), OnDeinit(), OnTick() and OnTimer()? What is the right way to call your BPB NeuralNet functions inside EA code? Let's take the 'XOR problem', or even a basic structure with the last ten tick values placed on the input, so I can understand how it works: how to define the number of hidden layers, where to insert the inputs, and how to retrieve the output?

I've been integrating an external BPB model via a Python socket, but I think it is reasonable to believe your code could be very effective directly inside my EA.

Thanks in advance!

True. I would really appreciate it. 

This is a very inspiring article. I dedicated two days to reading and analysing the code therein, and trust me, he took his time.

However, I tried implementing the functions in an EA but keep getting stuck. I'm not sure how the functions flow. A sample input and training session would really help. 

Actually, today I came back hoping to find more articles on the same topic. I have faith in the author. I will wait for more articles.

 

It is not a must that you publish a sample. Just briefly explain how the input arrays are fed into the system and how the results can be viewed. I have studied the code daily for a week and understand it to some extent, but I still cannot figure out how the methods are interconnected. It's an urgent matter, but I guess I can't rush you. If the answer comes after I've figured it out, I'd still be thankful.

Please respond whenever you can.
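
For readers with the same question, here is a minimal sketch based only on the methods that appear later in this thread - CNet::feedForward(), CNet::getResults() and the standard-library CArrayDouble. The network object Net, the number of inputs and the dummy values are assumptions made purely for illustration; the actual constructor and topology setup are in the source attached to the article.

   CArrayDouble *inputVals = new CArrayDouble();   // one value per input neuron
   inputVals.Add(0.25);                            // dummy values - in practice these would be
   inputVals.Add(-0.10);                           // normalized prices, indicator readings, etc.
   inputVals.Add(0.05);
   Net.feedForward(inputVals);                     // forward pass through all layers

   CArrayDouble *results = new CArrayDouble();
   Net.getResults(results);                        // copies the output-layer values
   for(int i = 0; i < results.Total(); i++)
      Print("output[", i, "] = ", results.At(i));  // view the results

   delete inputVals;                               // release the dynamically created arrays
   delete results;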

 
Nelson Wanyama:

It is not a must that you publish a sample. Just briefly explain how the input arrays are fed into the system and how the results can be viewed. I have studied the code daily for a week and understand it to some extent, but I still cannot figure out how the methods are interconnected. It's an urgent matter, but I guess I can't rush you. If the answer comes after I've figured it out, I'd still be thankful.

Please respond whenever you can.

I figured it out. All the work has been simplified down to the CNet class.

Thank you for this awesome article. 

Peace.

 
Nelson Wanyama:

I figured it out. All the work has been simplified down to the CNet class.

Thank you for this awesome article. 

Peace.

Could you maybe share some information on how to initialize the CNet class within an expert's or indicator's init function?
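
For anyone stuck at the same point, here is a minimal sketch of one way to wire the class into an expert. Only CNet, feedForward(), getResults() and backProp() are confirmed in this thread; the include file name, the CArrayInt-based topology argument and the layer sizes below are assumptions for illustration, so check the source attached to the article for the exact constructor signature.

#include "NeuroNet.mqh"                    // class file attached to the article (name assumed)
#include <Arrays\ArrayInt.mqh>

CNet *Net = NULL;                          // keep the network alive for the EA's lifetime

int OnInit()
  {
   //--- hypothetical topology: 10 inputs, 20 hidden neurons, 2 outputs;
   //--- the constructor argument is an assumption - see the attached source
   CArrayInt *topology = new CArrayInt();
   topology.Add(10);
   topology.Add(20);
   topology.Add(2);
   Net = new CNet(topology);               // whether the class takes ownership of the
                                           // topology object depends on its implementation
   if(CheckPointer(Net) == POINTER_INVALID)
      return(INIT_FAILED);
   return(INIT_SUCCEEDED);
  }

void OnDeinit(const int reason)
  {
   if(CheckPointer(Net) != POINTER_INVALID)
      delete Net;                          // release the network when the EA stops
  }

OnTick() would then build a CArrayDouble of inputs, call Net.feedForward(), read the prediction with Net.getResults() and, once a target is known, train with Net.backProp().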
 
It would be great if someone could show a practical use of this article.
 

Is there something wrong with the backpropagation function?

I'm building a very simple expert using this library, but the output layer always outputs the same values... I'm using only two output neurons.

//--- open
   m_Trade.SetExpertMagicNumber(symbol.magicnumber);

   //--- train the network on the latest bar direction before the new forward pass
   BackPropagation(symbol);

   //--- build the input vector from relative close-price changes
   CArrayDouble *inputVals = new CArrayDouble();

   for(int i = 0; i < inputNeurons; i++)
      inputVals.Add((iClose(symbol.name, PERIOD_CURRENT, i) - iClose(symbol.name, PERIOD_CURRENT, i + 1)) / iClose(symbol.name, PERIOD_CURRENT, i + 1));

   //--- forward pass and read the two output values
   symbol.Net.feedForward(inputVals);

   CArrayDouble *results = new CArrayDouble();

   symbol.Net.getResults(results);

   Print(results.At(0), " ", results.At(1));
//+------------------------------------------------------------------+
//| Train on the direction of the last closed bar                    |
//+------------------------------------------------------------------+
void BackPropagation(CSymbol &symbol)
{
   CArrayDouble *targetVals = new CArrayDouble();

   //--- target (1,0) if the last move was up, (0,1) if it was down
   if(iClose(symbol.name, PERIOD_CURRENT, 0) - iClose(symbol.name, PERIOD_CURRENT, 1) > 0)
   {
      targetVals.Add(1);targetVals.Add(0);
   }
   else
   {
      targetVals.Add(0);targetVals.Add(1);
   }

   symbol.Net.backProp(targetVals);
}


It would be great if someone could tell me if I'm missing a step in the process.

thanks.
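
A hedged observation rather than a confirmed fix: in the snippet above, backProp() is called on targets taken from the current bar before the new inputs have gone through feedForward(), and the CArrayDouble objects created with new are never deleted. A reordered sketch of the same per-tick logic - wrapped in a hypothetical ProcessTick() helper, assuming backProp() is meant to follow a forward pass, and reusing CSymbol, inputNeurons and the Net member from the snippet above - could look like this:

void ProcessTick(CSymbol &symbol, const int inputNeurons)
  {
   //--- build the input vector from relative close-price changes
   CArrayDouble *inputVals = new CArrayDouble();
   for(int i = 0; i < inputNeurons; i++)
     {
      double curr = iClose(symbol.name, PERIOD_CURRENT, i);
      double prev = iClose(symbol.name, PERIOD_CURRENT, i + 1);
      inputVals.Add((curr - prev) / prev);
     }
   //--- forward pass first, then read the two outputs
   symbol.Net.feedForward(inputVals);
   CArrayDouble *results = new CArrayDouble();
   symbol.Net.getResults(results);
   Print(results.At(0), " ", results.At(1));
   //--- train on the direction of the last closed bar
   CArrayDouble *targetVals = new CArrayDouble();
   if(iClose(symbol.name, PERIOD_CURRENT, 0) - iClose(symbol.name, PERIOD_CURRENT, 1) > 0)
     {
      targetVals.Add(1);
      targetVals.Add(0);
     }
   else
     {
      targetVals.Add(0);
      targetVals.Add(1);
     }
   symbol.Net.backProp(targetVals);
   //--- release the dynamically created arrays to avoid memory leaks
   delete inputVals;
   delete results;
   delete targetVals;
  }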

 
Kept throwing memory leaks. I'm working on my own model.