
Chris70:
If you compare to manual trading: do you have 10000 candles on your chart at any time? Probably not. But you have an understanding of general market behaviour / trading rules / current market phase. Regarding neural networks, all of this is stored in the network weights.

No, I do not look at 10000 candles on a chart, because I use indicators. And the indicators I use are calculated based on the 10000 most recent candles. The alternative (which most traders do) is to periodically switch to several higher timeframes, in order to keep their perspective. But ... when you are scalping manually on multiple symbols, switching timeframes becomes an obstacle, so using indicators that combine all that data into the timeframe you are actively trading is an absolute requirement.
As for Neural Networks, I have to admit that my knowledge is outdated, since my most "recent" experience with ANNs was about 30 years ago ;)
Anyway ... did you get any useful results from your ANN so far? I mean ... something you'd actually use to trade with (your) real money? That was the whole point of this Exercise. Or am I wrong?
While Chris rejects this idea, I still see usefulness in feeding a time series of higher-period indicators (e.g. some 200-period indicator) into the net, because training on more than about 100 bars per sequence would be relatively costly in terms of computing performance (surely feeding a bigger timeframe would be possible too).
Btw the result a few pages ago looked quite convincing if it was done out of sample.
However, I still think that adding specific indicators could save training time and even find new (less dominant) patterns (faster), though a simple LSTM could find most of them too.
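A rough sketch of what such a feature row could look like in MQL5 (the 200-period SMA, the array layout and the normalization are just placeholder choices, not anybody's actual setup):

#define SEQ_LEN   100     // bars per training sequence
#define MA_PERIOD 200     // higher-period indicator compressed into every row

// one row per bar: raw close plus the relative distance to a 200-period SMA,
// so a 100-step sequence still "sees" information from far more than 100 bars
bool BuildFeatureRows(double &rows[][2])
  {
   double close[], ma[];
   int maHandle = iMA(_Symbol, PERIOD_CURRENT, MA_PERIOD, 0, MODE_SMA, PRICE_CLOSE);
   if(maHandle == INVALID_HANDLE)                                        return false;
   if(CopyClose(_Symbol, PERIOD_CURRENT, 0, SEQ_LEN, close) != SEQ_LEN)  return false;
   if(CopyBuffer(maHandle, 0, 0, SEQ_LEN, ma) != SEQ_LEN)                return false;
   ArrayResize(rows, SEQ_LEN);
   for(int t = 0; t < SEQ_LEN; t++)
     {
      rows[t][0] = close[t];                   // raw price
      rows[t][1] = (close[t] - ma[t]) / ma[t]; // distance to the 200-period average
     }
   return true;
  }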
@Chris70
Also, the meta-labeling part suddenly reminds me of GAN networks, where a discriminator competes against the generator. Implementing a CNN as the discriminator could even improve or create more realistic predictions :/
Btw, especially because your broker is an ECN, you should be able to use any kind of ("ECN"-like) forex data you can get. Maybe even forex futures data...
Btw the result a few pages ago looked quite convincing if it was done out of sample.
You mean the Screenshot with Bars and the projected high/low/close prices? Unless my eyes are deceiving me, these results don't look any better than a 3-period moving average. Why would you need an ANN for that?
Pretty Sure it was #97.
Neural network weight distribution charts? I do see how that could be useful to visualize the data currently stored by the ANN, but I fail to see the usefulness of that data when trading. Or am I missing the point? As for the "results", if the range is between 0% and 64% accuracy, I find it rather disappointing. Throwing dice would probably give you similar results. IMO, this was a failed experiment, unless the goal was to confirm the "random walk" theory.
A 64% strike rate and always a positive CRV (reward/risk ratio) is not too bad. Show me something better based only on price action with a solid foundation (i.e. working long term).
The 64% is not fixed long term either, so you are comparing apples with oranges.
Chris did not establish (or post) a baseline, so there is no way of knowing whether this "not too bad" 64% is actually an improvement over the baseline.
The baseline could be anything, from classic statistical prediction models to simply counting the percentage of red/green candles in the training data, for example.
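A minimal sketch of such a candle-counting baseline in MQL5 (always predicting the majority class would already be "right" this often, so a model's hit rate should be compared against this number, not against 50%):

// share of bullish candles over the training window; the return value is the
// accuracy of always guessing the majority class (bullish or bearish)
double MajorityClassBaseline(const int bars)
  {
   int bullish = 0;
   for(int i = 1; i <= bars; i++)   // start at 1 to skip the still-forming bar
      if(iClose(_Symbol, PERIOD_CURRENT, i) > iOpen(_Symbol, PERIOD_CURRENT, i))
         bullish++;
   double pctBull = 100.0 * bullish / bars;
   return MathMax(pctBull, 100.0 - pctBull);
  }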
Okay, there's something going on here... First of all, it's not a dispute - we all want the same thing. That being said, I recall having been very open-minded and conservative about overly optimistic claims, especially in my introductory words at the beginning of this thread. If it doesn't work, I walk away without regrets, and I also don't see the point in convincing anybody. You all trade with your own money, not mine. We each fight our own struggle, and I'm okay with being wrong.
The main question is how much the markets can be anticipated AT ALL, based on history and current price action. Starting from this question, I'm just trying to find the best possible information there is. Somehow, we're all trying to "predict", with whatever method is in place - indicators, fundamentals, machine learning... If, on the other hand, we had no opinion about the market at all, we should all stop at this point, walk away and never trade again.
With this in mind, the problem with classic "indicators" is that we're trying to impose a "formula" onto the recent market data that we BELIEVE reveals something useful to us (and maybe it does), instead of trying to find the BEST possible formula under the given circumstances.
I'm less of an advocate for "belief" as long as there is a mathematical method for better knowledge. Machine learning is nothing less.
And if there just is no knowledge to be found? Okay, then machine learning shows us exactly that! But luckily (at least this is what my personal experience is showing me) there actually is some useful hidden knowledge.
Selections of indicators are usually based on personal judgement and trial and error, and they lack objectivity (at least if we neglect genetic indicator selection methods at this point).
This is something where I don't share Bayne's opinion: indicators are an unnecessary limitation upon the available information. Just take something as simple as a moving average or RSI: you can always calculate those values from a series of prices, but not the other way around. Neural networks, on the other hand, might even find relationships within the raw(!), possibly redundant information that just didn't occur to us.
Again, you can all do what you want with your time and money, I'm not trying to prove anything. This is an open discussion about methods.
The accuracy results (on unseen data!) beyond 60% are averages (64% is an average, not a range of 0-64%). Of course, concrete results also depend on the timeframe and the exact strategy. I'm trying different things here and they're not all the same. So far, I can say that looking at multi-currency data has helped a lot.
When I began this thread, I also didn't know how little of an issue performance is with big neural networks under Mql5. When I chose the autoencoder+LSTM approach, I wasn't really aware of this. I have since learned that Mql5 is powerful enough that we don't really need the autoencoder method; it can very well handle big recurrent networks in reasonable training time. Anybody who wants to implement machine learning in Mql5? Just do it, it works.
Do I currently use any of the mentioned preliminary results for actual real-money trading? Clear, honest answer: no (for that, I currently have other methods in place, mostly based on polynomial regression and momentum). But the results are promising enough to tell that I soon will. I just try a lot of things in order to find the best solution. Who doesn't? Lastly, it ain't a pissing contest, is it?
Happy trading everybody (after a hopefully nice weekend)!
Cheers,
Chris.
Ok, then ... here's my "contribution" to this effort ;) I don't find it very useful, but ... here is a very simple Neural Network, written in pure MQL5, without any external dependencies (untested) ...
Hey, thanks a lot for your contribution.
Just a few thoughts:
1. Why only one fixed hidden layer? It's easy at this point to add a layer dimension "l" to your weights and hidden neurons, and it just adds one more 'for' loop. It gets harder to change once the code is more complicated.
Another suggestion: don't choose different variable names for input layers, hidden layers and outputs. Just assign index [0] to the input layer, [layers-1] to the output layer (assuming that "layers" is the total number of layers) and anything in between to the hidden layers. This seems unnecessary with only one hidden layer, but it makes things easier with more complex networks. If you then do the backpropagation, you can still declare the output errors separately before cycling through the hidden layers, i.e. by specifically referring to [layers-1] for the outputs. Different names are really unnecessary. By the way, I also did it just like you did at first and changed it when I ran into some disadvantages later. Yes, of course there are differences in how the layers are handled, e.g. an input layer has no inputs of its own, but this can all be addressed by array indices.
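A rough sketch of what that indexing could look like (array names and sizes here are assumptions, not the posted code; bias terms are omitted):

#define MAX_NEURONS 64

double neuron[][MAX_NEURONS];              // [layer][neuron] activations, layer 0 = inputs
double weight[][MAX_NEURONS][MAX_NEURONS]; // [layer][from][to]
int    count[];                            // neurons per layer; all arrays resized to "layers" first

void ForwardPass(const int layers)
  {
   for(int l = 1; l < layers; l++)          // one extra loop instead of separate variables
      for(int j = 0; j < count[l]; j++)
        {
         double sum = 0.0;
         for(int i = 0; i < count[l-1]; i++)
            sum += neuron[l-1][i] * weight[l-1][i][j];
         neuron[l][j] = 1.0 / (1.0 + MathExp(-sum));  // placeholder activation (sigmoid)
        }
  }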
2. What is the purpose of the "error threshold"? I get that you calculate the error of each output neuron as the difference between target and output, because this is the derivative of the MSE error function (or more precisely: the correct derivative is Error = FOutputLayer[k] - FTargetLayer[k], i.e. the difference just the other way around; -(label-output) is the same as +(output-label), so you get the wrong sign, but this detail doesn't matter depending on the sign with which you handle the error later).
But then, with MSE, the total training error should be something like half the sum of the squared output errors, and I can't find this in your code. Adding a threshold and just taking the square of the error instead of half the square seems wrong (because the derivative of x² is 2x, whereas the derivative of 0.5x² is just x).
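For reference, a small sketch of how the loss and its derivative fit together with MSE (FOutputLayer/FTargetLayer as in the posted code; "OutputDelta" is an assumed name for the error term that backpropagation would then use):

// E = 0.5 * sum((output - label)^2);  dE/d(output_k) = output_k - label_k
double MseLossAndDeltas(const double &FOutputLayer[], const double &FTargetLayer[],
                        double &OutputDelta[])
  {
   int    n     = ArraySize(FOutputLayer);
   double total = 0.0;
   ArrayResize(OutputDelta, n);
   for(int k = 0; k < n; k++)
     {
      double diff    = FOutputLayer[k] - FTargetLayer[k];
      total         += 0.5 * diff * diff;
      OutputDelta[k] = diff;               // no extra error threshold needed here
     }
   return total;
  }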
3. Why the e!=-1 statement in the activation function? MathExp(-x) can never be negative (so never -1.0), because the range for e^x is between plus infinity for high x values and asymptotically approaching zero for very negative values ( https://en.wikipedia.org/wiki/Exponential_function).
By the way: a nice website for visualization of functions and computing derivatives is http://www.derivative-calculator.net. For example just input your (2.0/(1.0+exp(-x)))-1.0 there. It's fun to create your own custom activation functions there.
All in all: seems like a good starting point for a growing project - with neural networks you never run out of possibilities... (saving/loading the weights to/from a file, different weight initialization methods, more flexible network architecture, memory cells, other loss functions... you get the idea).
Here's something if you want some more activation functions to play with (I shared an earlier version of this code somewhere on the forum, but this version is a little better). The advantage: you can keep the choice of activation function flexible, for example as an input variable such as "input ENUM_ACT_FUNT actfunct"; later in your code you just write Activate(x,actfunct) or e.g. Activate(x,f_sigmoid), Activate(x,f_ReLU)... and get the result for whatever activation function you selected (accordingly: DeActivate(x,actfunct) for the corresponding derivative).
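A minimal sketch of that pattern (using the names mentioned above; the concrete set of functions and their derivatives here are just example choices, not necessarily the shared version):

enum ENUM_ACT_FUNT {f_sigmoid, f_bipolar, f_tanh, f_ReLU};

double Activate(double x, ENUM_ACT_FUNT f)
  {
   switch(f)
     {
      case f_sigmoid: return 1.0 / (1.0 + MathExp(-x));
      case f_bipolar: return 2.0 / (1.0 + MathExp(-x)) - 1.0; // the function from point 3
      case f_tanh:    return MathTanh(x);
      case f_ReLU:    return MathMax(0.0, x);
     }
   return x;
  }

// derivatives, expressed via the already activated value a = Activate(x,f)
double DeActivate(double a, ENUM_ACT_FUNT f)
  {
   switch(f)
     {
      case f_sigmoid: return a * (1.0 - a);
      case f_bipolar: return 0.5 * (1.0 - a * a);
      case f_tanh:    return 1.0 - a * a;
      case f_ReLU:    return (a > 0.0) ? 1.0 : 0.0;
     }
   return 1.0;
  }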