Machine learning in trading: theory, models, practice and algo-trading - page 3105

It's about time we all moved to the bright side - to matstat!)
The dark side, as always, opposes it) Dark in the sense that it tries to reduce everything to the vague and unclear - in the extreme case to a certain "gut feeling").
What does matstat have to do with it?
The man is exercising on STATIONARY series, and we are discussing the video in all seriousness! It has nothing to do with us at all, along with his null hypotheses.
It's about time we all moved to the bright side - to matstat!)
or to reproducible examples in the form of code
What's matstat got to do with it?
The video is a successful attempt to explain things important for understanding matstat on a simple but meaningful level.
A man exercises on STATIONARY series, and we are discussing the video in all seriousness! It has nothing to do with us at all, along with his null hypotheses.
You, I recall, were exercising with GARCH models, which are usually stationary too) And "null hypotheses" is just basic matstat terminology that one simply has to know and understand.
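As a side note to the stationarity argument: the practical difference the null-hypothesis machinery cares about can be shown with a tiny numpy sketch (my own illustration, not from the thread) - a stationary AR(1) keeps roughly constant variance over time, while a random walk (unit root) does not.

```python
# Illustration (assumed parameters, not from the thread): why stationarity
# matters for statistical testing. A stationary AR(1) has a constant
# unconditional variance; a random walk's variance grows with time.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
eps = rng.standard_normal(n)

# Stationary AR(1): x_t = 0.5 * x_{t-1} + eps_t  (|phi| < 1 => stationary)
ar1 = np.zeros(n)
for t in range(1, n):
    ar1[t] = 0.5 * ar1[t - 1] + eps[t]

# Random walk: x_t = x_{t-1} + eps_t  (unit root => NOT stationary)
walk = np.cumsum(eps)

# Variance of first vs second half: roughly equal for the AR(1),
# much larger in the second half for the random walk.
print(ar1[: n // 2].var(), ar1[n // 2 :].var())
print(walk[: n // 2].var(), walk[n // 2 :].var())
```

Formal tests (ADF and the like) build exactly on this kind of contrast, with the unit root as the null hypothesis.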
The video is a successful attempt to explain things important for understanding matstat on a simple but meaningful level.
From a general-educational point of view, of course, but it is much more important to discuss only what is applicable to financial time series.
You, I recall, were exercising with GARCH models, which are usually stationary as well)
Since when are GARCH models stationary?
The premise in GARCH models is that the original series is NOT stationary; moreover, even the differenced time series is NOT stationary. And GARCH is an attempt to model that non-stationarity of the original series. Look at rugarch: the fitting function itself models three features of the pre-differenced series, and those features are what mark the series as non-stationary.
Since when are GARCH models stationary?
A GARCH(p,q) process has always been (covariance-)stationary, provided that the sum of all p+q ARCH and GARCH coefficients is less than one.
The feeling (and it is not just a feeling) is that negative professional deformation has reached such proportions that no material is perceived "as is" any more; instead it takes a convoluted path through the scar tissue of former neural victories, and this "enriched" truth is ejected back out of the mouth under pressure.
So true) And it shows with frightening clarity that intellectually most of us may well be replaced by AI)
Yes....
But we still have a few years, or months, to go.))
For now, there are two obstacles to launching a strong AI:
1. Too voracious architectures
2. Too weak hardware
These are essentially two sides of the same coin...
But work is underway to solve both the first problem and the second...
No one is in a hurry to change the architecture (neural networks are our everything), though they will have to; with fast hardware (quantum computers), on the other hand, things are moving much more actively.