Machine learning in trading: theory, models, practice and algo-trading - page 2025

 
Maxim Dmitrievsky:

Don't be a useless cluster.

So how do you know if it's useless or not? )))

 
mytarmailS:

So how do you know if it's useless or not? )))

Roughly speaking, you only have 2 of them in a 2D curve: rise or fall. But there are many combinations and alternations of them. That's what matters most to the model.

And you don't need that much noise; you'll overfit on it.

A recurrent model doesn't learn well when there are a lot of patterns. What matters to it is a small number of alternating patterns, sequences, not the number of patterns. Got it?

 
Maxim Dmitrievsky:

Roughly speaking, you only have 2 of them in a 2D curve: rise or fall. But there are many combinations and alternations of them. That's what matters most to the model.

And you don't need that much noise; you'll overfit on it.

A recurrent model doesn't learn well when there are a lot of patterns. What matters to it is a small number of alternating patterns, sequences, not the number of patterns. Got it?

So you can pack anything into the "event" clusters, not just price...

Anyway, I've almost finished an algorithm for finding profitable sequences in noise, of the 1-2-3-yes kind.

Each sequence will take the form of rules, the rules are then combined into a pool, and the pool's signals are summed.)

I just don't know how to train it; I don't understand RL at all (
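The rule-pool idea above can be sketched minimally, assuming a "rule" is just a cluster-number sequence that votes when it occurs in the recent history, with the votes summed into a signal. All patterns, votes, and names here are hypothetical illustrations, not the author's actual rules:

```python
def make_rule(pattern, vote):
    """Return a rule that casts `vote` when `pattern` occurs
    as a contiguous run inside the cluster-number history."""
    def rule(history):
        n = len(pattern)
        for i in range(len(history) - n + 1):
            if history[i:i + n] == pattern:
                return vote
        return 0
    return rule

# A pool of rules: sequences of cluster numbers with assigned votes.
pool = [
    make_rule([1, 2, 3], +1),   # hypothetical "profitable" sequence
    make_rule([3, 2, 1], -1),   # its mirror votes the other way
    make_rule([4, 4], +1),
]

def signal(history):
    """Sum the votes of every matching rule over the history."""
    return sum(rule(history) for rule in pool)

print(signal([5, 1, 2, 3, 4, 4]))  # -> 2 (two rules match: +1 +1)
```

How to weight each rule's vote is exactly the open training question mentioned above; here all matches simply count equally.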

 
Maxim Dmitrievsky:

Well, you only send the cluster number to the network, not its contents. It doesn't care about the content.

Have you seen my example? )) What was it? The contents or the number? )))

 
Maxim Dmitrievsky:

Try predicting this simple set with a recurrent net.

If it finds the pattern, then it can be used.

The pattern is 1 2 3 4, as a sequence... if it occurs in the row, the label is "YES".


You don't even have to bother trying a random forest.

test

           Reference
Prediction  NO YES
       NO   58  71
       YES  57  64

               Accuracy : 0.488
Files:
DT2.csv  1021 kb
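A toy version of this task can be regenerated for illustration (the attached DT2.csv itself is not reproduced here; row length, vocabulary, and counts below are assumptions): rows of random tokens, labelled YES exactly when the contiguous run 1 2 3 4 occurs. The point is that the label is fully determined by a trivial linear search, so an accuracy of 0.488 means the net learned nothing beyond chance:

```python
import random

random.seed(0)
PATTERN = [1, 2, 3, 4]

def contains(row, pattern=PATTERN):
    """Return True iff `pattern` occurs as a contiguous run in `row`."""
    n = len(pattern)
    return any(row[i:i + n] == pattern for i in range(len(row) - n + 1))

def make_row(length=30, vocab=20):
    """Random tokens; the pattern is planted into roughly half the rows."""
    row = [random.randint(1, vocab) for _ in range(length)]
    if random.random() < 0.5:
        i = random.randrange(length - len(PATTERN))
        row[i:i + len(PATTERN)] = PATTERN
    return row

rows = [make_row() for _ in range(250)]
labels = [contains(r) for r in rows]

# The labels are defined by the same search, so this score is 1.0 by
# construction; it just makes explicit that a plain scan solves the task
# the recurrent net scored 0.488 on.
correct = sum(contains(r) == y for r, y in zip(rows, labels))
print(correct / len(rows))  # -> 1.0
```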
 

I came across an apt term today:
Neural-network-based databases.
No prediction, just database search. The only difference from a conventional database is the ability to generalize/combine the most similar data.
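A rough sketch of what such "search, not prediction" might look like, assuming a toy in-memory store and cosine similarity as the generalization mechanism; the keys and vectors are invented for illustration:

```python
import math

# Hypothetical "database": named rows stored as feature vectors.
db = {
    "uptrend":   [1.0, 0.9, 0.8],
    "downtrend": [-1.0, -0.9, -0.8],
    "flat":      [0.0, 0.1, -0.1],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def lookup(query, k=1):
    """Return the k stored keys most similar to the query vector:
    a pure search that 'generalizes' by ranking near matches."""
    ranked = sorted(db, key=lambda key: cosine(query, db[key]), reverse=True)
    return ranked[:k]

print(lookup([0.9, 1.0, 0.7]))  # -> ['uptrend']
```

Averaging the top-k retrieved rows instead of returning keys would be the "combine the most similar data" part; nothing here is trained or predicted.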

 
Maxim Dmitrievsky:
A sequence is when the numbers follow one after another.

Imagine that that is the only real sequence and everything else is noise (noise being the features we threw in thinking they mean something),

but they don't mean anything! And we don't know that until we find the pattern 1 2 3 4.

 
mytarmailS:

Imagine that that is the only real sequence and everything else is noise (noise being the features we threw in thinking they mean something),

but they don't mean anything! And we don't know that until we find the pattern 1 2 3 4.

In your set the answers can be found with a simple search; stop messing around.
 
Maxim Dmitrievsky:
In your set the answers can be found with a simple search; stop messing around.

What if the range is not 1 to 20 but 1 to 5k?

And the sequence is longer than 10?

Show me that simple search )) and where to get the clusters ))

 
mytarmailS:

What if the range is not 1 to 20 but 1 to 5k?

And the sequence is longer than 10?

Show me that simple search )) and where to get the clusters ))

In a sequence, each successive element has to be related to the previous ones, like words in a sentence. Otherwise it's just unstructured garbage; what is there to look for? What you have is a dumb lookup among garbage bins: take out the green one, then the red one. That search takes microseconds, even at 20k.
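The "dumb search" does scale as claimed: a contiguous-pattern scan is linear in the haystack length and indifferent to the token range. A sketch at the scale asked about above, with illustrative sizes (tokens from 1 to 5k, a planted pattern of length 12, a 20k-element series):

```python
import random
import time

random.seed(1)

pattern = [random.randint(1, 5000) for _ in range(12)]
haystack = [random.randint(1, 5000) for _ in range(20_000)]
pos = random.randrange(len(haystack) - len(pattern))
haystack[pos:pos + len(pattern)] = pattern   # plant the pattern once

def find(seq, pat):
    """Return the first index where pat occurs contiguously in seq, else -1."""
    n = len(pat)
    for i in range(len(seq) - n + 1):
        if seq[i:i + n] == pat:
            return i
    return -1

t0 = time.perf_counter()
i = find(haystack, pattern)
elapsed = time.perf_counter() - t0
# A chance 12-token collision over a 5000-token vocabulary is astronomically
# unlikely, so the scan returns the planted position, in milliseconds at worst.
print(i, elapsed)
```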