Machine learning in trading: theory, models, practice and algo-trading - page 2024

 
Maxim Dmitrievsky:
No one gives grants for Forex bots
A Forex bot pays for itself if it is effective.
The problems there are more tractable, and the AI people here have experience with harder, sometimes even unsolvable, tasks; simple ones will be easier for them.
There are tasks in audio, photo, and video recognition, diagnosing by symptoms, controlling robots and robot swarms, and so on.
 
elibrarius:
A Forex bot pays for itself if it is effective.
The problems there are more tractable, and the AI people here have experience with harder, sometimes even unsolvable, tasks; simple ones will be easier for them.
There are tasks in audio, photo, and video recognition, diagnosing by symptoms, controlling robots and robot swarms, and so on.

Right now there are very few good specialists at all. It has come to the point that to get a good job at SIBUR (including in IT), you simply have to pass an IQ test :D

 
Maxim Dmitrievsky:

Right now there are very few good specialists at all. It has come to the point that to get a good job at SIBUR (including in IT), you simply have to pass an IQ test :D

Good specialists gain experience, set up their own business, and keep 100% of the orders instead of the 10-20% they'd get working for someone else.
And here is another option: become a grant-eater)).
 
elibrarius:

If anyone has ideas for AI or other digital projects that could be genuinely in demand, with identifiable customers or markets, you can get a startup grant of up to 3 million rubles, with the option of paying employees up to 100 thousand rubles per month.
The result must be a registered piece of intellectual property (a program, an invention, etc.).
https://ит-гранты.рф/

The only catch is that reporting, approvals, evaluations, the business plan, and so on will eat up a pile of paper, if not several)

Priority areas:
https://ит-гранты.рф/pnp

Nothing related to trading is of any damn use to anyone; the topic has already peaked and is dying, even with neural networks in it. Unfortunately.
 
Maxim Dmitrievsky:

Here's how I see the problem...

I see the data as "slices" of observations, and they are not always the same length.

Inside these "slices" are cluster numbers; a cluster number can be interpreted as a state, or better, as some event...

An event by itself means nothing; what matters is the correct sequence of events. Plus, we MUST remember that 99% of events are garbage that we generate ourselves.


So let's assume the market has a winning sequence of events (buried in a pile of garbage), say "1" - "2" - "3" - "YES".

This is what I mean by "brewed coffee":

"1" - pour water, "2" - heat it, "3" - ... Obviously the sequence has to be in the right order.


The data looks like this, but the lines will be much longer.

[attached picture]
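
For illustration (a made-up sketch; the real data is in the picture, and these cluster IDs and lengths are invented), the slices could look like variable-length lists of event IDs with an outcome label at the end:

```python
# Hypothetical "slices": variable-length lists of cluster IDs (events).
# The winning chain 1 -> 2 -> 3 is buried among garbage events, and the
# last element is the outcome. All values here are invented.
slices = [
    [7, 1, 22, 44, 2, 54, 65, 3, 53, "YES"],        # chain present, in order
    [12, 9, 3, 41, 1, 88, 2, 17, "NO"],             # pieces present, wrong order
    [5, 61, 1, 2, 44, 3, "YES"],                    # shorter slice, chain present
    [33, 74, 21, 90, 14, 8, 55, 62, 49, 70, "NO"],  # pure garbage
]
```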


So, I'm working on an algorithm that will look for such sequences hidden in the noise...
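
The brute-force version of such a search is at least easy to sketch: check whether a candidate chain of events occurs in order inside each slice and count how often it co-occurs with "YES". A minimal sketch with invented data and function names, not the actual algorithm:

```python
def occurs_in_order(pattern, events):
    """True if the elements of `pattern` appear in `events` in the given
    order (not necessarily adjacent)."""
    it = iter(events)
    return all(p in it for p in pattern)  # `in` consumes the iterator

def pattern_score(pattern, slices):
    """Fraction of "YES" outcomes among the slices containing the pattern."""
    hits = [s[-1] for s in slices if occurs_in_order(pattern, s[:-1])]
    return sum(h == "YES" for h in hits) / len(hits) if hits else 0.0

slices = [  # toy data in the same spirit as the sketch above
    [7, 1, 22, 44, 2, 54, 65, 3, 53, "YES"],
    [12, 9, 3, 41, 1, 88, 2, 17, "NO"],
]
print(pattern_score([1, 2, 3], slices))  # 1.0 on this toy data
```

Enumerating candidate chains is the hard part, of course; this only scores a given one.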


So, questions for you...

Can an RNN find such sequences in noise?

And can an RNN take vectors of different lengths as input?

It seems to me there is too much noise here for networks, even the coolest ones like LSTM and GRU, because they were made to work with text, and text has no noise at all...

Maybe I'm reinventing the wheel?

 
mytarmailS:

Here's how I see the problem...

I see the data as "slices" of observations, and they are not always the same length.

Inside these "slices" are cluster numbers; a cluster number can be interpreted as a state, or better, as some event...

An event by itself means nothing; what matters is the correct sequence of events. Plus, we MUST remember that 99% of events are garbage that we generate ourselves.


So let's assume the market has a winning sequence of events (buried in a pile of garbage), say "1" - "2" - "3" - "YES".

This is what I mean by "brewed coffee":

"1" - pour water, "2" - heat it, "3" - ... Obviously the sequence has to be in the right order.


My data looks like this, but my lines will be much longer.


(I can't insert a picture yet...)

The pictures don't work, that didn't work either.

Yes, something like that. A sequence of conditional events needs memory. But we reduce the number of events by clustering them down to a few alternating ones.

You get a Boolean-type function, 00001101010110011, where 0 and 1 are the alternating events. The input is a series of n events, and we predict the next one. But this requires a recurrent network, not a classical one. It is also possible to use more than 2 clusters.
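
If I read this right, this is next-symbol prediction over a small alphabet of events. A minimal recurrent sketch in PyTorch; the window length, layer sizes, and training setup are my assumptions:

```python
import torch
import torch.nn as nn

# The toy alphabet has 2 event types (0/1), as in 00001101010110011;
# more clusters would simply mean a larger alphabet.
seq = [int(c) for c in "00001101010110011"]
n = 4  # window of past events fed to the net (an assumption)

X = torch.tensor([seq[i:i + n] for i in range(len(seq) - n)])  # windows
y = torch.tensor([seq[i + n] for i in range(len(seq) - n)])    # next event

class NextEvent(nn.Module):
    def __init__(self, num_events=2, emb=8, hidden=16):
        super().__init__()
        self.emb = nn.Embedding(num_events, emb)
        self.rnn = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, num_events)

    def forward(self, x):
        h, _ = self.rnn(self.emb(x))  # h: (batch, n, hidden)
        return self.out(h[:, -1])     # predict from the last step

model = NextEvent()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
for _ in range(200):  # tiny full-batch training loop
    opt.zero_grad()
    loss_fn(model(X), y).backward()
    opt.step()

print(model(X).argmax(dim=1))  # predicted next event for each window
```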
 
Maxim Dmitrievsky:

The pictures don't work, that didn't work either.

Just click on the picture; there are a few more words at the bottom.

Maxim Dmitrievsky:

But we reduce the number of events by clustering them down to a few alternating ones.

Everything is already reduced; these are already clusters. They can even be clusters of clusters, whatever.
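
"Clusters of clusters" could be done, for instance, by re-clustering the slices themselves by their event histograms. A scikit-learn sketch of one possible reading (data and sizes invented):

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy slices of cluster IDs (invented). Each slice is represented by a
# histogram over event IDs; the histograms are then clustered again.
slices = [[1, 22, 44, 2, 3], [5, 1, 2, 44, 3], [12, 9, 9, 12]]
n_ids = 64  # assumed upper bound on cluster IDs

hist = np.zeros((len(slices), n_ids))
for i, s in enumerate(slices):
    for e in s:
        hist[i, e] += 1

meta = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(hist)
print(meta)  # a "cluster of clusters" label for each slice
```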
 
mytarmailS:

So, questions for you...

Can an RNN find such sequences in noise?

And can an RNN take vectors of different lengths as input?

It seems to me there is too much noise here for networks, even the coolest ones like LSTM and GRU, because they were made to work with text, and text has no noise at all...

Maybe I'm reinventing the wheel?

The noise should be removed by the clustering layer.

As for input, the network can take vectors of different lengths (I haven't done it myself, but I know it's possible); and if a clustering layer is used, the question disappears anyway.
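
On the variable-length question: the standard trick is padding plus packing, so the recurrent net ignores the padded tails. A minimal PyTorch sketch (the cluster IDs and sizes are invented):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence

# Variable-length sequences of cluster IDs (toy values)
seqs = [torch.tensor([1, 22, 44, 2, 3]),
        torch.tensor([5, 1, 2, 44, 3, 53, 76]),
        torch.tensor([12, 9, 3])]
lengths = torch.tensor([len(s) for s in seqs])

padded = pad_sequence(seqs, batch_first=True)  # (batch, max_len), 0-padded
emb = nn.Embedding(100, 8)                     # 100 = assumed max ID + 1
rnn = nn.LSTM(8, 16, batch_first=True)

packed = pack_padded_sequence(emb(padded), lengths,
                              batch_first=True, enforce_sorted=False)
_, (h_n, _) = rnn(packed)   # h_n: (1, batch, hidden)
print(h_n.squeeze(0).shape) # one fixed-size vector per sequence
```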

 
Maxim Dmitrievsky:

The noise should be removed by the clustering layer

The noise is the useless clusters:

1 22 44 55 42 2 54 65 23 75 3 53 76 43 "YES"

But which clusters are useless and superfluous you only find out after training; until then they get in the way of learning, no matter what.
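
One way to find out is to score each cluster ID by how much its presence shifts the "YES" rate away from the base rate, then prune the uninformative ones. A toy heuristic, not Maxim's actual method:

```python
def cluster_usefulness(slices):
    """Per-cluster score: how much the presence of a cluster ID shifts
    the "YES" rate versus the base rate. Near zero = likely garbage."""
    base = sum(s[-1] == "YES" for s in slices) / len(slices)
    ids = {e for s in slices for e in s[:-1]}
    scores = {}
    for c in ids:
        with_c = [s for s in slices if c in s[:-1]]
        rate = sum(s[-1] == "YES" for s in with_c) / len(with_c)
        scores[c] = rate - base
    return scores

slices = [  # toy data: 1, 2, 3 appear only in the "YES" slice
    [1, 22, 44, 55, 42, 2, 54, 65, 23, 75, 3, 53, 76, 43, "YES"],
    [22, 44, 55, 42, 54, 65, 23, 75, 53, 76, 43, "NO"],
]
print(cluster_usefulness(slices))  # +0.5 for 1, 2, 3; 0.0 for the rest
```

Of course this only works in hindsight, once the outcomes are known, which is exactly the point above.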

 
mytarmailS:

The noise is the useless clusters:

1 22 44 55 42 2 54 65 23 75 3 53 76 43 "YES"

There won't be any useless clusters.
