
Neural networks made easy (Part 78): Decoder-free Object Detector with Transformer (DFFT)
In previous articles, we mainly focused on predicting upcoming price movements and on analyzing historical data. Based on this analysis, we tried various methods of forecasting the most likely upcoming price movement. Some strategies constructed a whole range of predicted trajectories and attempted to estimate the probability of each forecast. Naturally, training and operating such models requires significant computing resources.
But do we really need to predict the upcoming price movement at all? Moreover, the accuracy of the forecasts obtained leaves much to be desired.
Our ultimate goal is to generate a profit, which we expect to come from the successful trading of our Agent. The Agent, in turn, selects optimal actions based on the predicted price trajectories it receives.
Neural networks made easy (Part 80): Graph Transformer Generative Adversarial Model (GTGAN)
The recently published paper "Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation" introduces the graph transformer generative adversarial model (GTGAN) algorithm, which succinctly combines graph-based modeling with the Transformer architecture. The authors of GTGAN address the problem of creating a realistic architectural design of a house from an input graph. The generator model they present consists of three components: a message-passing convolutional neural network (Conv-MPN), a Graph Transformer encoder (GTE), and a generation head.
Qualitative and quantitative experiments on three challenging graph-constrained architectural layout generation tasks, using three datasets, demonstrate that the proposed method generates results superior to previously presented algorithms.
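To make the three-component generator structure more concrete, below is a minimal, illustrative PyTorch sketch. All class names, dimensions, and the dummy graph data are assumptions chosen for demonstration; the code only mirrors the high-level flow described above (message passing over graph nodes, a Transformer-style encoder, and a generation head), not the authors' actual GTGAN implementation.

```python
import torch
import torch.nn as nn

class ConvMPN(nn.Module):
    """Illustrative message-passing step: each node feature is updated
    from its own state plus the sum of its neighbors' features.
    (Names and dimensions are assumptions, not the paper's code.)"""
    def __init__(self, dim):
        super().__init__()
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        # x: (nodes, dim), adj: (nodes, nodes) 0/1 adjacency matrix
        neighbor_sum = adj @ x  # aggregate neighbor features
        return torch.relu(self.update(torch.cat([x, neighbor_sum], dim=-1)))

class GraphTransformerEncoder(nn.Module):
    """Self-attention over the node set; a stand-in for the GTE block."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        # x: (1, nodes, dim) — treat the node set as a sequence
        out, _ = self.attn(x, x, x)
        return self.norm(x + out)  # residual connection

class Generator(nn.Module):
    """Conv-MPN -> GTE -> generation head, mirroring the described flow."""
    def __init__(self, dim=64, mask_size=32):
        super().__init__()
        self.mpn = ConvMPN(dim)
        self.gte = GraphTransformerEncoder(dim)
        self.head = nn.Linear(dim, mask_size * mask_size)  # per-node room mask

    def forward(self, x, adj):
        x = self.mpn(x, adj)
        x = self.gte(x.unsqueeze(0)).squeeze(0)
        return torch.sigmoid(self.head(x))  # (nodes, mask_size**2)

# Dummy usage: 5 rooms (nodes) with random features and random adjacency
nodes, dim = 5, 64
x = torch.randn(nodes, dim)
adj = (torch.rand(nodes, nodes) > 0.5).float()
masks = Generator(dim)(x, adj)
print(masks.shape)  # torch.Size([5, 1024])
```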
Neural Networks Made Easy (Part 81): Context-Guided Motion Analysis (CCMR)
Neural networks made easy (Part 82): Ordinary Differential Equation models (NeuralODE)
Neural Network in Practice: Secant Line
Although many may think it would be better to release a series of articles on artificial intelligence itself, I cannot imagine how this could be done: most people have no idea about the true purpose of neural networks and, accordingly, about so-called artificial intelligence.
So we will not go into this topic in detail here. Instead, we will focus on other aspects.
Neural Networks Made Easy (Part 83): The "Conformer" Spatio-Temporal Continuous Attention Transformer Algorithm
Neural Networks Made Easy (Part 84): Reversible Normalization (RevIN)
In the previous article, we discussed the Conformer method, which was originally developed for weather forecasting. It is quite an interesting method, and when testing the trained model we obtained a fairly good result. But did we do everything right? Could we get a better result? Let's look at the learning process. We are clearly not using a model that forecasts the most probable next time series values for its intended purpose: after feeding time series data into the model, we trained it by propagating the error gradient from the models that consume its prediction results, starting with the Critic's results.
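Since Part 84 is devoted to Reversible Instance Normalization (RevIN), a minimal sketch of the core idea may help: each input series is normalized instance-wise before entering the forecasting model, and the inverse transform is applied to the model's outputs to restore the original scale. The class below is an illustrative NumPy reconstruction of that normalize/denormalize pair based on the published RevIN formulation, not the code from the article; the names and the affine parameters (learnable in the original method, plain arrays here) are assumptions.

```python
import numpy as np

class RevIN:
    """Illustrative reversible instance normalization (per the RevIN idea):
    normalize each series instance-wise, then invert on model outputs."""
    def __init__(self, num_features, eps=1e-5):
        self.eps = eps
        # Learnable affine parameters in the original method; plain arrays here.
        self.gamma = np.ones(num_features)
        self.beta = np.zeros(num_features)

    def normalize(self, x):
        # x: (time_steps, num_features) — statistics taken per instance
        self.mean = x.mean(axis=0, keepdims=True)
        self.std = np.sqrt(x.var(axis=0, keepdims=True) + self.eps)
        return (x - self.mean) / self.std * self.gamma + self.beta

    def denormalize(self, y):
        # Invert the affine transform, then restore the input statistics
        return (y - self.beta) / self.gamma * self.std + self.mean

# Dummy usage: normalize an input window, "forecast", denormalize the output
rev = RevIN(num_features=3)
window = np.random.randn(96, 3) * 10 + 100   # series far from zero mean
x_norm = rev.normalize(window)
forecast_norm = x_norm[-24:]                 # stand-in for a model's forecast
forecast = rev.denormalize(forecast_norm)
print(x_norm.mean(axis=0).round(3))          # ~0 after normalization
```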
Neural Networks Made Easy (Part 85): Multivariate Time Series Forecasting
Neural Networks Made Easy (Part 86): U-Shaped Transformer