Machine learning in trading: theory, models, practice and algo-trading - page 1482

Unfortunately I don't know the essence of your TS; I just stated the fact ***.
I'll call you Pinocchio because you're wooden ))))
I fed the data from the last figure to the SMM as predictors. I didn't give it the price at all, so the algorithm knows nothing about the price, and I trained it unsupervised; the number of states in the SMM is only two.
The SMM hasn't managed to catch the fast bounces as in the previous picture, but it has identified the trending market states very well.
The figure shows the SMM identifying the two market states on out-of-sample (OOS) data.
State 1 is marked in red, state 2 in green.
EURUSD, 5-minute chart.
I think my grail has been found...
Here's a large plot of 7500 points.
Of course it is not perfect, but it's just the beginning; there are also levels as entry points and stop losses. I think I can make something sweet out of this idea.
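The posts show no code, so here is only a rough sketch of the setup as described: a two-state hidden Markov model fitted unsupervised to derived predictors rather than to the price itself. Python and hmmlearn are my assumptions; the author says elsewhere that the calculations are done in R.

```python
# Hedged sketch: 2-state Gaussian HMM trained unsupervised on derived
# predictors (returns and rolling volatility), never on raw price levels.
# Library choice (hmmlearn) and the predictor set are assumptions.
import numpy as np
from hmmlearn.hmm import GaussianHMM

price = np.cumsum(np.random.randn(7500)) + 100.0        # placeholder series
returns = np.diff(price)
vol = np.array([returns[max(0, i - 20):i + 1].std() for i in range(len(returns))])
X = np.column_stack([returns, vol])                      # (n_samples, n_features)

model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200)
model.fit(X)                                             # unsupervised EM fit
states = model.predict(X)                                # one state label per bar

# The state sequence can then be overlaid on the price chart
# (e.g. state 0 in red, state 1 in green) to mark the two regimes.
```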
The EURUSD chart from 06.12.2018 to 16.01.2019? If you just put plain moving averages on it, you get similar signals, in some places even better.
A peek into the future: https://smart-lab.ru/blog/305752.php
All the calculations are in R; I just load a vector of signals into TSLab purely to visualize the buys and sells, something like 000110000.
So there's no look-ahead there.
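As a tiny illustration of that hand-off (the file name and format here are made up), the decoded state sequence can simply be dumped as a 0/1 string for the charting tool to pick up:

```python
# Hypothetical export of a decoded state sequence as a plain 0/1 string,
# e.g. "000110000", for visualization in an external charting tool.
states = [0, 0, 0, 1, 1, 0, 0, 0, 0]           # placeholder state sequence
signal = "".join(str(int(s)) for s in states)
with open("signals.txt", "w") as f:            # file name is hypothetical
    f.write(signal)
print(signal)                                  # -> 000110000
```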
If prices are converted to binary or, say, octal representation, will that somehow change their representation for a neural network or a tree?
At least the network configuration will change, right?
Say we have 8 bits: either we feed 0-255 into one input, or, in binary code, we use 8 inputs and feed each one a 0 or 1.
And the NN's weights will behave more sharply; the activation function will now only work at the extremes.
PS: try teaching it the multiplication table ;)
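To make the two encodings concrete, here is a small sketch (numpy assumed) showing the same 8-bit value either as one 0-255 input or as eight 0/1 inputs:

```python
# Sketch: the same 8-bit value fed either as one input (0-255)
# or as eight binary inputs (one per bit).
import numpy as np

values = np.array([0, 5, 170, 255], dtype=np.uint8)

# Variant 1: one input per sample, the raw 0-255 value.
x_single = values.reshape(-1, 1).astype(float)

# Variant 2: eight 0/1 inputs per sample, one per bit (most significant first).
x_bits = np.unpackbits(values.reshape(-1, 1), axis=1)

print(x_single.ravel())   # [  0.   5. 170. 255.]
print(x_bits[2])          # 170 -> [1 0 1 0 1 0 1 0]
```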
Well, the multiplication table won't do for classification.
I wanted to try it just for fun, maybe not 8 inputs but all 8 values into one.
Or recurrence plots, say, with each column or row as 010101010.
The neat part is that with Gaussian noise the inputs would be pure noise, while otherwise there are some structural formations. And the inputs get binarized, well, partially, which maybe will reduce overfitting.
But if you feed columns or rows, it turns out wrong, and if every point is a separate input, the dimensionality is too high.
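For reference, a binary recurrence plot of the kind mentioned above can be built in a few lines (the threshold choice here is arbitrary); each row is then exactly the sort of 0/1 vector being discussed:

```python
# Sketch of a binary recurrence plot: R[i, j] = 1 if |x_i - x_j| < eps.
# Each row (or column) is a 0/1 vector like 010101010; feeding every
# element separately would mean n*n inputs, hence the dimensionality issue.
import numpy as np

x = np.cumsum(np.random.randn(200))            # placeholder price-like series
eps = 0.5 * x.std()                            # arbitrary recurrence threshold
R = (np.abs(x[:, None] - x[None, :]) < eps).astype(np.uint8)

print(R.shape)      # (200, 200): one 0/1 row per point
print(R[0][:20])    # first 20 entries of row 0
```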
I tried something similar; the result was the same as with the price.
To be expected )
I was just going through the options for scaling the price without normalization; it's a perennial hassle.
I even turned it into letters once ), there's the SAX method.
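SAX (Symbolic Aggregate approXimation) turns a series into a short string over a small alphabet: z-normalize, reduce with piecewise aggregate averaging, then map segment means to letters using Gaussian breakpoints. A minimal no-library sketch for a four-letter alphabet (the parameters are purely illustrative):

```python
# Minimal SAX sketch: z-normalize, apply PAA, then map segment means
# to letters via Gaussian breakpoints (4-symbol alphabet).
import numpy as np

def sax(series, n_segments=8, alphabet="abcd"):
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)       # z-normalization
    # Piecewise Aggregate Approximation: mean of each segment.
    segments = np.array_split(x, n_segments)
    paa = np.array([seg.mean() for seg in segments])
    # Breakpoints splitting N(0, 1) into 4 equiprobable regions.
    breakpoints = np.array([-0.6745, 0.0, 0.6745])
    idx = np.searchsorted(breakpoints, paa)      # letter index 0..3 per segment
    return "".join(alphabet[i] for i in idx)

price = np.cumsum(np.random.randn(256)) + 100.0  # placeholder series
print(sax(price))                                # e.g. "abccddcb"
```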