Discussion of article "Neural Networks Cheap and Cheerful - Link NeuroPro with MetaTrader 5" - page 5

As for the article itself, not NNs in general: what's the catch? The number of coefficients being fitted is comparable to the length of the history.
Take the number of coefficients equal to the length of the history and the fit will be simply perfect: not a single losing trade, and the maximum possible squeezed out of the history.
If building an NN comes down to fitting a huge number of coefficients, we don't need such a "good thing".
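To make the point about coefficient count concrete, here is a minimal sketch (not from the article; the "price" values are invented): a model with as many free coefficients as historical points can always reproduce that history exactly, yet that says nothing about the future.

```cpp
// Overfitting illustration: the unique degree-(N-1) polynomial through N
// points has N coefficients and zero in-sample error. Data values are made up.
#include <cstddef>
#include <cstdio>
#include <vector>

// Evaluate the Lagrange interpolating polynomial through (x[i], y[i]) at xq.
double lagrange(const std::vector<double>& x, const std::vector<double>& y, double xq)
{
    double result = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i)
    {
        double term = y[i];
        for (std::size_t j = 0; j < x.size(); ++j)
            if (j != i)
                term *= (xq - x[j]) / (x[i] - x[j]);
        result += term;
    }
    return result;
}

int main()
{
    // 6 historical "prices" -> 6 coefficients are enough for a perfect fit.
    std::vector<double> t = {0, 1, 2, 3, 4, 5};
    std::vector<double> p = {1.30, 1.31, 1.29, 1.32, 1.28, 1.33};

    for (std::size_t i = 0; i < t.size(); ++i)   // zero error on the history
        std::printf("t=%.0f  fit=%.5f  actual=%.5f\n", t[i], lagrange(t, p, t[i]), p[i]);

    std::printf("'forecast' at t=6: %.5f\n", lagrange(t, p, 6.0)); // pure extrapolation
    return 0;
}
```

The in-sample fit is perfect by construction; the value at t=6 is whatever the extrapolation happens to give, which is exactly why a perfect fit to history proves nothing about the future.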
Your arguments would be fair if the article were called "Creating the Grail". :)
But the article is not about that at all. So the structure of the NN is fully adequate to the purpose of the article and allows one to:
1) clearly demonstrate the stages of EA creation (in particular, the difference in accuracy between the untrained and the trained network is plainly visible - there would be no such noticeable difference if the network had only a few neurons);
2) demonstrate working with networks of large size. It would be unclear why all those mass text replacements in Notepad were needed if the example had only a couple of neurons. And who knows what size of NN the readers will build; this way, I've taught them everything in advance.
I wonder for how long into the future the forward test holds up without losing? If it's a month, that's good.
Most traders can't and don't know how to open a Buy Stop order correctly...
And at the sight of an article like this they will simply lose control of themselves...
Your article is very useful for a finally sober view of NNs: the realisation that the "multiply and add" logic is extremely primitive, so it needs far more input parameters for an acceptable fit than more meaningful approaches do.
Certainly, world experience has shown that even such simple logic can give remarkable results in recognising a finite number of patterns - captchas, images and so on. But when you need to operate on an endless number of patterns (time series), you get something like what the article shows.
If the goal was to spark interest in NNs as applied to tasks not related to time series, while still using a time series as the example, then the approach is somewhat strange. But perhaps your article is the most honest one about NNs. Too bad hardly anyone looks at the source code and grasps the point. Discussions of NN vs NN, held purely to satisfy the inner need to discuss something on the NN topic, are well demonstrated by some of the comments to the article.
Well, some people do read the code. In terms of the article's usefulness to me personally, I rate it 100%.
Even from the point of view of "modern" NNs, this approach still has a right to exist.
If some people don't see the + and * operations in the code, what can I tell them...? The article is aimed at developers, not traders (especially those who don't know what a Buy Stop is).
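For readers who haven't looked at the source: the "+ and *" remark refers to the fact that at its core a feed-forward neuron is just a weighted sum of inputs passed through a squashing function. A minimal generic sketch, not taken from the article's NeuroPro export - the weights and inputs below are invented:

```cpp
// One neuron of a feed-forward network: sum(w[i]*x[i]) + bias, then a
// nonlinearity. The coefficients are made up purely for illustration.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

double neuron(const std::vector<double>& w, double bias, const std::vector<double>& x)
{
    double s = bias;
    for (std::size_t i = 0; i < w.size(); ++i)
        s += w[i] * x[i];          // the "multiply and add"
    return std::tanh(s);           // the squashing nonlinearity
}

int main()
{
    std::vector<double> inputs  = {0.25, -0.10, 0.40};   // e.g. normalised price increments
    std::vector<double> weights = {0.80, -1.20, 0.35};   // hypothetical trained coefficients

    std::printf("neuron output: %.4f\n", neuron(weights, 0.05, inputs));
    return 0;
}
```

However large the formulas exported from a trained network are, they essentially reduce to nested expressions of this multiply-add-and-squash form.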
Thank you to the author. I added it to my favourites, as I will return to this material many times.
Oh my, such passions... neurons, genes, natural mutations, artificial replications, a whole colony of chromosomes)). How much do you have to know to realise that you're being strung along with the classics, and can't it be simplified?
What does the neural network actually pick up when you train it - the character of the pair? Or have I missed something? If that's all, it isn't worth the money and effort spent on it.
Hi Andrew,
Thanks for a very interesting article...! Nice to see how to connect MT5 with neural nets. Have you considered doing your neural net example to connect with MT4...?
MT4 has a much bigger user base and that would encourage more people to appreciate what your great article really offers.
Also... I tried to search for NeuroPro to try it... but it's hard to find and there doesn't seem to be much support for it?
You may consider using a free neural net program called Neuroph instead...
http://neuroph.sourceforge.net/
Neuroph is a newer Java-based program with a good, user-friendly GUI for building and testing networks (without needing to code anything). Actually... Neuroph looks a lot like the NeuroPro examples you posted... so the conversion hopefully would be fairly easy.
Neuroph also works with current and older Windows and has both 32-bit and 64-bit versions for multi-core... so no compatibility problems that I've read about. In any case... I hope you consider doing the Neuroph/MT4 article... that would be a tremendous help...!
Meanwhile...thanks for a very informative article with lots of possibilities using neural nets...!
Take care, Robert
And great tip on Neuroph, thanks Robert - looks VERY interesting!
Cheers
Stu