
Forum on trading, automated trading systems and testing trading strategies
Better NN EA
Sergey Golubev, 2024.08.01 12:44
Neural networks made easy (Part 80): Graph Transformer Generative Adversarial Model (GTGAN)
The recently published paper "Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation" introduces the Graph Transformer generative adversarial model (GTGAN), an algorithm that combines both approaches in a single architecture. The authors of the GTGAN algorithm address the problem of creating a realistic architectural design of a house from an input graph. The generator model they present consists of three components: a message-passing convolutional neural network (Conv-MPN), a Graph Transformer encoder (GTE), and a generation head.
Qualitative and quantitative experiments on three challenging graph-constrained architectural layout generation tasks, using the three datasets presented in the paper, demonstrate that the proposed method generates results superior to previously published algorithms.
Sergey Golubev, 2024.08.01 12:55
Neural Networks Made Easy (Part 81): Context-Guided Motion Analysis (CCMR)
Sergey Golubev, 2024.08.08 12:51
Neural networks made easy (Part 82): Ordinary Differential Equation models (NeuralODE)
Sergey Golubev, 2024.08.27 07:30
Neural Network in Practice: Secant Line
Although many may think that it would be better to release a series of articles on the topic of artificial intelligence, I cannot imagine how this could be done. Most people have no idea about the true purpose of neural networks, and consequently about so-called artificial intelligence.
So, we will not go into this topic in detail here. Instead, we will focus on other aspects.
Sergey Golubev, 2024.09.01 07:58
Neural Networks Made Easy (Part 83): The "Conformer" Spatio-Temporal Continuous Attention Transformer Algorithm
Sergey Golubev, 2024.09.04 12:29
Neural Networks Made Easy (Part 84): Reversible Normalization (RevIN)
In the previous article, we discussed the Conformer method, which was originally developed for weather forecasting. This is quite an interesting method. When testing the trained model, we got a pretty good result. But did we do everything right? Is it possible to get a better result? Let's look at the learning process. We are clearly not using the model, which forecasts the most probable next values of the time series, for its intended purpose. By feeding the model input data from a time series, we trained it by propagating the error gradient from the models that use its prediction results. We started with the Critic's results.
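The core idea of RevIN is easy to illustrate: each input instance is normalized with its own statistics before being fed to the model, and the model's output is denormalized with those same statistics. Here is a minimal NumPy sketch (the helper names `revin_normalize`/`revin_denormalize` are my own, and the published method additionally uses learnable affine parameters, which are omitted here):

```python
import numpy as np

def revin_normalize(x, eps=1e-5):
    # Per-instance statistics, computed over the time axis (axis 0 here)
    mu = x.mean(axis=0)
    sigma = x.std(axis=0)
    return (x - mu) / (sigma + eps), (mu, sigma)

def revin_denormalize(y, stats, eps=1e-5):
    # Restore the original scale and offset of each feature
    mu, sigma = stats
    return y * (sigma + eps) + mu

# Toy multivariate series: 64 steps, 3 features with very different scales
rng = np.random.default_rng(0)
x = rng.normal(loc=[0.0, 100.0, -5.0], scale=[1.0, 10.0, 0.1], size=(64, 3))

x_norm, stats = revin_normalize(x)
# A model would forecast in the normalized space; here we just echo the input
y = revin_denormalize(x_norm, stats)
print(np.allclose(y, x))  # the normalization is exactly reversible
```

The point of the reversibility is that the model always sees inputs on a comparable scale, while its outputs are mapped back to the original distribution of each instance.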
Sergey Golubev, 2024.09.06 07:27
Neural Networks Made Easy (Part 85): Multivariate Time Series Forecasting
Sergey Golubev, 2024.09.06 07:32
Neural Networks Made Easy (Part 86): U-Shaped Transformer
Sergey Golubev, 2024.09.12 07:38
Neural Networks Made Easy (Part 87): Time Series Patching
Forecasting plays an important role in time series analysis. Deep models have brought significant improvement in this area. In addition to successfully predicting future values, they also extract abstract representations that can be applied to other tasks such as classification and anomaly detection.
The Transformer architecture, which originated in the field of natural language processing (NLP), demonstrated its advantages in computer vision (CV) and is successfully applied in time series analysis. Its Self-Attention mechanism, which can automatically identify relationships between elements of a time series, has become the basis for creating effective forecasting models.
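The patching idea itself is simple: instead of feeding individual time steps to the Transformer, the series is cut into (possibly overlapping) segments that then serve as tokens. A minimal sketch in NumPy, assuming a plain 1-D series and my own helper name `patch_series`:

```python
import numpy as np

def patch_series(x, patch_len, stride):
    """Split a 1-D series into (possibly overlapping) patches."""
    n_patches = (len(x) - patch_len) // stride + 1
    return np.stack([x[i * stride : i * stride + patch_len]
                     for i in range(n_patches)])

t = np.arange(32, dtype=float)          # toy series 0..31
patches = patch_series(t, patch_len=8, stride=4)
print(patches.shape)   # (7, 8): 7 patches of 8 steps each
print(patches[1][0])   # 4.0 -- stride 4, so neighboring patches overlap by half
```

Each row of `patches` would then be embedded and treated as one token by the attention mechanism, which shortens the sequence and lets each token carry local temporal context.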
This thread won't be about a question or problem, but rather about the announcement of the presentation and documentation of an exciting trading concept. I plan to do a series of postings here in order to keep you guys updated.
Anybody who has an opinion on the topic, please don't hesitate to comment, even if you don't have profound machine learning knowledge (I'm still learning, too - which never ends).
To those of you who are more familiar with machine learning, the particular topic of this series will be about
Forex price FORECASTING with AUTO-ENCODERs combined with MULTIVARIATE FULLY-CONNECTED STACKED LSTM-networks. To those who are already intimidated by these fancy words: don't worry, it's not so complicated after all, and I'm pretty sure that you will grasp the concept after a few introductory explanations. In order to make it easily understandable, I won't go into any calculus details. This is more about the idea. I still remember what it was like when I first encountered neural networks and how abstract and complicated it all seemed. Believe me: it's not - not after you are familiar with some basic terms.
I know that there are many EAs on the market that work with neural networks. Most of them use "multilayer perceptrons" in their simplest form, which is nothing bad per se, but none of them is the holy grail; they usually suffer from the "garbage in / garbage out" problem, and any good results often come down to the same overfitting as with many other "less intelligent" EAs. If you feed the network with data from lagging indicators, don't expect any real magic to happen.

Some of you who remember me from earlier posts might recall that I have a strong opinion about the limitations of predicting the future when it comes to trading. As of today, I think there is more money to be made by reacting to the status quo, i.e. statistical anomalies as they happen, instead of forecasting tomorrow's anomalies. For me personally, this particularly means various breakout and mean-reversion techniques, which I much prefer. When it comes to forecasting, the task can, statistically speaking, be broken down to a time series analysis problem, just like in many other fields, such as weather forecasting or forecasting of future sales, flights, etc. For those time series that have some kind of repetitive pattern, methods like the so-called ARIMA model or the Fast Fourier Transform can do the job very well, as can certain specialized neural networks (recurrent networks like GRU and LSTM).

However, the problem with stock prices or currency pairs is the immense amount of noise and randomness, which makes valid predictions so difficult. In my earlier experiences with time series forecasting in trading (also with LSTM networks), my final conclusion was that the method does in fact work, but there is not much money left after subtracting spreads/commissions, and that the method is not superior to other trading methods, e.g. polynomial regression channel breakouts, with which I have good practical experience. This is why I left the idea of price forecasting behind for some time.
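To make the LSTM mechanics mentioned above concrete, here is one time step of a standard LSTM cell in plain NumPy. This is the generic textbook formulation, not any particular library or the code used in this project; all names and sizes are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W: (4*H, D), U: (4*H, H), b: (4*H,)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b     # all four gate pre-activations at once
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    o = sigmoid(z[2*H:3*H])        # output gate
    g = np.tanh(z[3*H:4*H])        # candidate cell state
    c = f * c_prev + i * g         # cell state: forget old, add new
    h = o * np.tanh(c)             # hidden state (the cell's output)
    return h, c

rng = np.random.default_rng(1)
D, H = 3, 5                               # input and hidden sizes (arbitrary)
W = rng.normal(scale=0.1, size=(4*H, D))
U = rng.normal(scale=0.1, size=(4*H, H))
b = np.zeros(4*H)

h = np.zeros(H)
c = np.zeros(H)
for x in rng.normal(size=(10, D)):        # run 10 time steps
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (5,)
```

The gating is what lets the cell keep or discard information over many steps, which is exactly why LSTMs handle longer temporal dependencies better than plain recurrent networks.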
However, it's never a bad idea to put one's opinion to a validity retest. In this project, I want to test whether I can make better predictions by making some adjustments to the classic LSTM forecasting concept. The combination of autoencoders with stacked LSTMs is nothing new and therefore not my invention, but I don't know of any realisation in a dedicated trading environment like Metatrader. I don't know what the outcome will be, and I might stop the project at any time if I realize that it doesn't work, so please understand that this project is more like a fun "scientific" investigation that stands apart from my real trading and not (yet?) a ready-made expert advisor.
I am very well aware that Python is the go-to language when it comes to machine learning, especially with its powerful Keras library. I have some Python knowledge, so I could also do the same thing purely in Python; it's more of a conscious personal choice to realize it all in Metatrader only. I will also do it this way because I already have my own libraries for MLP and LSTM networks, complete and working from earlier projects, so it won't be that much additional work.
Okay... with these words out of the way, let's start with a few topics that I plan to write about in the next posts, so that anybody, even without any previous machine learning knowledge, will understand what this is about:
1. What is a "neuron" and what is it good for?
2. What is a "multilayer perceptron"?
3. What is "backpropagation" and how do neural networks learn?
4. What is an "autoencoder" and how can it be used in trading and time series analysis?
5. What is a recurrent neural network (LSTM, GRU, ...) and what are the benefits?
6. Putting it all together
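As a teaser for point 4, here is a minimal linear autoencoder in NumPy that learns to compress windows of a noisy sine wave from 16 values down to a 4-value code and reconstruct them. Everything here (window length, code size, learning rate, iteration count) is an illustrative toy of my own, not the actual project code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Training data: overlapping windows of a noisy sine wave (window length 16)
t = np.arange(2000)
series = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
X = np.stack([series[i:i + 16] for i in range(0, t.size - 16, 4)])

# Linear autoencoder: 16 -> 4 -> 16 (encoder E, decoder Dw)
E = rng.normal(scale=0.1, size=(16, 4))
Dw = rng.normal(scale=0.1, size=(4, 16))

def loss(X, E, Dw):
    # Mean squared reconstruction error over all window entries
    R = X @ E @ Dw - X
    return (R ** 2).mean()

lr = 0.01
loss0 = loss(X, E, Dw)
for _ in range(500):
    Z = X @ E                         # encode: (N, 4) bottleneck codes
    R = Z @ Dw - X                    # reconstruction residual
    n = X.shape[0]
    gD = Z.T @ R * (2 / (n * 16))     # gradient of the loss w.r.t. Dw
    gE = X.T @ (R @ Dw.T) * (2 / (n * 16))  # gradient w.r.t. E
    Dw -= lr * gD
    E -= lr * gE
print(loss0, loss(X, E, Dw))  # reconstruction error drops during training
```

The 4-value bottleneck forces the network to keep only the dominant structure of each window and discard the noise; that denoised, compressed code is what would later be fed into the stacked LSTMs.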
Next steps:
- practical realisation, debugging and making the networks "learn"
- hyperparameter optimization
- implementation of the networks in a trading system
- backtesting and forward-testing on unseen data
Have fun following me on the journey with the upcoming postings... and please excuse any mistakes in my mediocre "Netflix English" (German is my native language, but the German part of the forum is less active, which is why I decided to post this here).
Chris.
This project sounds fascinating! I'm keen to see how combining autoencoders with stacked LSTMs will improve Forex forecasting. Excited for your breakdown of the concepts and your approach to handling overfitting and noise. Looking forward to your updates!