Discussion of article "Third Generation Neural Networks: Deep Networks" - page 5

No, of course not. I am only fighting, to the best of my ability, attempts to replace science, whether applied or theoretical, with shamanism. And this is exactly what is going on in the field of neural networks, which has actually been stuck, both in applied and theoretical terms, for about a decade now.
1. About being stuck. It seems to me that the widespread practical application of voice and handwriting recognition based on deep networks disproves this, not to mention facial recognition. Right now it seems to me that this topic has gained new momentum.
2. Regarding science and shamanism. Since Soviet times, the approach to practice in Soviet science has differed radically from that in Western science. This is best seen in the scientific literature. Back at the institute (in the 70s of the last century) I noticed how accessibly and clearly complex issues are described in Western publications, and how abstruse and convoluted, with a scattering of complex formulas, the same is done in the domestic literature. This approach has not changed to this day. The more complicated and incomprehensible, the more scientific?
I am neither a programmer nor a mathematician. I am a practitioner. It is important to me that new ideas are presented in an accessible way, confirmed by authoritative applications, and help me solve the questions I need with the least headache and loss of time in implementing them. And all of that is present in the topic of deep neural networks in R language packages.
I agree that everything in the article is piled together. Nobody has cancelled the desire to embrace the immensity. And not all representatives of the new generation remember the topic of neural networks. I wanted to remind them.
Well, it turned out the way it turned out.
Good luck
Great article!
A nice sequence of DM articles lately.
So far I have attempted to load everything and, no matter what I do, I cannot get it to load. All paths are the same as in your instructions; I've tried both 3.1.1, which is the same version as you used, and a newer version, 3.1.3, and all scripts, DLLs, the indicator, headers, etc. are in their correct locations according to your instructions.
Whenever the EA is dropped onto a chart, it comes up with an "Rterm has crashed" alert window, which, looking through the code, means that R isn't loading.
Are there any additional steps required, such as a DLL needed to load R that is missing?
I've also checked all R scripts to ensure correct paths (and case sensitivity for folder names) and it still doesn't work.
I'm very impressed by your article and the depth of your explanation. I've done a lot of work with Jeff Heaton's neural networks, so I wanted to take a look at R as well.
Any advice you can offer would be appreciated.
Hi,
I am glad that you are interested in the article.
I debug Rterm crashes as follows:
- Comment out everything in start() except the check that Rterm is running.
- Comment out everything in init() except the launch of Rterm.
- If Rterm starts and keeps running, uncomment the statements in init() one at a time and re-test. You will see at which statement the crash happens.
After that it is easier to determine the cause of the crash. There are usually two: a syntax error in the script or missing libraries.
I refer you again to the Appendix to the article.
I am ready to help further if you tell me at which point Rterm crashes.
Best regards
It's not that the article is outdated, but it would be better to use recurrent LSTM and to make the decisions in Delphi. Personally, I really dislike MT4.
Greetings.
Outdated??? It is new.
Better how, and why? More details would be preferable, if you can provide them. I'm just curious.
We are not discussing Delphi at all.
The question is not whether we like MT4 or not. The task is to implement what we need quickly and reliably on what we have (i.e. MT4, whatever it is).
Good luck
This is a very interesting and useful article. I got the system to work, though in Zorro rather than in MT4. This simplifies the script a lot, and I can backtest it from Oct 14, 2014 up to today.
There is a problem though: It seems you have trained on the ZZ of the same bar, not the ZZ of the next bar. So the system is very good in predicting the bar that just ended. If I trade on the past bar, I get this balance curve:
A perfect system! But if I use the returned Sig for trading on the next bar, I get this slightly more realistic balance curve:
(The red part is the underwater equity).
I've used the October 14, 2014 model. Have you already tried training a model on the ZZ of the next bar?
Hi,
We must bear in mind the following:
1. We take the signal from the ZigZag.
2. We shift it one bar into the future:
sig <- Hmisc::Lag(sig, shift = -1)
3. We train the neural network on the signal from the next bar (a minimal sketch of this shift is given below).
To improve the quality of training, you need to work on the selection of indicators, their parameters, and the parameters of the neural network.
The article shows the path and the method. The potential of these networks is huge.
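For illustration, here is a minimal R sketch of that shift. The vectors zz and sig and the random series are assumptions invented for the example; only Hmisc::Lag() with shift = -1 comes from the reply above.

library(Hmisc)
# Hypothetical ZigZag-based signal: +1 on an upward leg, -1 on a downward leg
set.seed(1)
zz  <- cumsum(rnorm(100))            # stand-in for a ZigZag line
sig <- ifelse(diff(zz) > 0, 1, -1)   # direction of the current bar
# Shift the signal one bar into the future, so that row t of the
# training set is labelled with the ZigZag direction of bar t + 1
sig <- Lag(sig, shift = -1)
# The last element becomes NA after the shift and must be dropped
# before training (together with the matching row of predictors)
sig <- sig[!is.na(sig)]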
Best regards
Vladimir
I have now trained a new model that predicts the next bar, and it seems that it indeed works. The accuracy is still in the 74% range. This is the equity curve now:
It behaves just as I would expect: the system is profitable immediately after training, and then slowly deteriorates as the market changes.
So the next step is a WFO (walk-forward optimization) test with regular re-training of the model. For this the training must be integrated into the strategy script.
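As a rough illustration of such regular re-training, here is a minimal R sketch using the deepnet package from the article. The window length, re-training interval, network size and the X/y matrices are assumptions made up for the example, not the actual Zorro/lite-C integration described below.

library(deepnet)
set.seed(1)
# Hypothetical data: predictor matrix X and next-bar signal y in {0, 1}
X <- matrix(rnorm(3000 * 10), ncol = 10)
y <- as.integer(runif(3000) > 0.5)
train_window  <- 1000   # bars used for each re-training (assumption)
retrain_every <- 250    # re-train after this many new bars (assumption)
pred <- rep(NA_real_, nrow(X))
for (t in seq(train_window + 1, nrow(X), by = retrain_every)) {
  idx <- (t - train_window):(t - 1)                    # rolling training window
  dnn <- sae.dnn.train(X[idx, ], y[idx],
                       hidden = c(50, 30),             # network size: assumption
                       numepochs = 5, output = "sigm")
  out_idx <- t:min(t + retrain_every - 1, nrow(X))
  pred[out_idx] <- nn.predict(dnn, X[out_idx, , drop = FALSE])
}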
This is the corrected function for calculating the Sig of the next bar:
The "Compute" function that is executed every 30 mins by the strategy script:
The strategy script, the "EA":