Machine learning in trading: theory, models, practice and algo-trading - page 929

 
Mihail Marchukajtes:
The basic strategy is needed ONLY for selecting the moment (time) of analysis. It can be static, with no optimization parameters. If we optimize the basic strategy, we get a shitload of models, so there is no point in optimizing it. The heavy lifting is done by the NS. It is enough to set a comfortable set of parameters in the basic strategy, in terms of the number of deals per day, and then use it to train the NS....

Yes, the basic strategies should be very simple... modify the framework a bit and you can add any of them; you need diversification.

 
Maxim Dmitrievsky:

She says I'm smart and I have to take her to Australia because she has a friend there.

We need a fake marriage for that.

You're so smart... I would've had a fake marriage with a hottie like you once... good for you!!!! Check it out, guys, what a ride........ That's what you're good at, Maximka. -)))))

 
Dr. Trader:

In the forecast ovals you still need to round (>= 0.5 -> 1; < 0.5 -> 0). I will try mnogovhodov, I think it will be better: the classes 0 and 1 are in more equal numbers there.

What I don't understand is this: where both values are greater than 0.5, what then?

 
Mihail Marchukajtes:

You're so smart... I too would have had a fake marriage with such a pretty girl once... good for you!!!! Check it out, guys, what a ride........ That's what you're good at, Maximka. -)))))

The funny thing is, if you want to ruin your mood and your self-esteem, go to the stock market.

It is better not to overdo it. It is good as a hobby. Neural networks are interesting, but nothing more. The more you complicate things, the more confused you get.

 

Then both will be 1 when rounded.

If you don't care about probabilities, you can drop whole branches altogether if they all end with the same class after rounding.
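The rounding rule discussed above can be sketched in a couple of lines; this is a minimal illustration with made-up probabilities, not anyone's actual code from the thread:

```python
import numpy as np

# Hypothetical example of the rounding rule mentioned above:
# probability >= 0.5 becomes class 1, anything below becomes class 0.
proba = np.array([0.23, 0.51, 0.77, 0.50, 0.49])
labels = (proba >= 0.5).astype(int)
print(labels)  # [0 1 1 1 0]
```

Note that two leaves with probabilities 0.6 and 0.9 both map to class 1 after rounding, which is exactly why such branches can be merged if only the class label matters.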

 
Dr. Trader:

mnogovhodov_02 2016 arr_Buy turned out like this:


                y_pred
y_true       0        1
     0  101797    52445
     1   24310    24208

I don't even know how to evaluate these results... It seems to me that the tree can still branch further, because there is still a whole carload of predictors left. Why not?

The table lacks information about support, i.e. how often such cases occur in the sample as a percentage. For example, you can see that the rightmost branch gives a correct entry 77% of the time, which is very good, but it is not clear how many times that actually happened.
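Standard metrics can be read straight off a confusion matrix like the one above. A minimal sketch, assuming the matrix is [[101797, 52445], [24310, 24208]] with rows as y_true and columns as y_pred:

```python
import numpy as np

# Confusion matrix as posted above (rows = y_true, cols = y_pred);
# the exact digit grouping is assumed from the flattened table.
cm = np.array([[101797, 52445],
               [24310,  24208]])

accuracy = np.trace(cm) / cm.sum()          # correct predictions overall
precision_1 = cm[1, 1] / cm[:, 1].sum()     # how often a predicted 1 is right
recall_1 = cm[1, 1] / cm[1, :].sum()        # how many true 1s were caught

print(round(accuracy, 3), round(precision_1, 3), round(recall_1, 3))
# -> 0.621 0.316 0.499
```

Precision here plays the role of the "support" question above: it shows not just that a branch is often right, but against how many predictions that rate was measured.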


Dr. Trader:

An alternative: the result directly in classes, without probabilities. That seems worse to me.


               y_pred
y_true       0       1
     0   81744   72498
     1   18618   29900

This would make more sense if the scheme were complete, but when the odds are 49 to 51, it's as good as nothing.

 
Maxim Dmitrievsky:

The funny thing is, if you want to ruin your mood and your self-esteem, go to the stock market

So it's best not to overuse it. As a hobby it is fine. Neural networks are interesting, but nothing more... The more you complicate them, the more you get confused.

I don't recognize you. It has really done you good to understand a simple truth: "everything brilliant is simple", even in the field of machine learning. The simpler the network, the better it works.....

 
Dr. Trader:

Then both will be 1 when rounded.

If you do not care about probabilities, you can throw out whole branches if they all end up with the same class after rounding.

And what if you find branches with a good probability, encode them into a single predictor, and then let the tree grind through the data once more? Then the tree will look for ways to improve on the results already found (it should like those predictors, since they have a high percentage of correct decisions). What do you think?
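The idea above can be sketched quickly: take a rule from a "good" branch, encode it as a new binary column, and grow a tree again on the augmented data. Everything here is hypothetical: the data is synthetic and the branch rule "feature 0 > 0.2" is an assumed stand-in for whatever rule an earlier tree found:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for the real predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Suppose an earlier tree found "feature 0 > 0.2" as a high-probability
# branch; encode that rule as a new binary predictor.
branch_rule = (X[:, 0] > 0.2).astype(float).reshape(-1, 1)

# Append the encoded branch as an extra column and refit the tree.
X_aug = np.hstack([X, branch_rule])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_aug, y)
print(tree.score(X_aug, y))
```

If the encoded rule really carries signal, the new tree tends to split on it early, which is exactly the "the tree should like it" effect described above.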

 
Aleksey Vyazmikin:

It seems to me that the tree can still branch further, because there is still a whole carload of predictors left. Why not?

If you branch further, the accuracy on this data will certainly increase, but it will most likely fall on new data.

We have reached a certain optimum: the tree has learned something, but there is no overfitting yet, so we can expect similar results on new data.

 
Mihail Marchukajtes:

I don't recognize you. Indeed, she has been a good influence on you: you understood a simple truth. "Everything brilliant is simple", even in the field of machine learning. The simpler the network, the better it works.....

The main thing is to try to screw everybody over and show that the market has been beaten. The rest, of course, is nonsense... )