Libraries: BestInterval - page 19

 
Andrey Khatimlianskii:

And it absolutely makes sense to go in the opposite direction and look for different intervals for different days of the week. Then you can increase the number of intervals so that you get 2-3 segments for each day.

Not for each day separately, though; instead, cut pieces out of the week as a whole. Right now the chunks are cut out of individual days, but they should be cut out of the week, i.e. out of an arbitrary cyclic time interval.
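The idea of cutting chunks out of a cyclic weekly interval, rather than out of each day, can be sketched roughly as follows (the function names and the trade-record layout are my own assumptions, not the library's API):

```python
from datetime import datetime

MINUTES_PER_WEEK = 7 * 24 * 60

def minute_of_week(dt: datetime) -> int:
    """Map a timestamp onto the cyclic weekly axis: 0 = Monday 00:00."""
    return dt.weekday() * 24 * 60 + dt.hour * 60 + dt.minute

def in_weekly_intervals(dt: datetime, intervals) -> bool:
    """intervals: list of (start_minute, end_minute) on the weekly axis.
    An interval may wrap around the week boundary (e.g. Sunday -> Monday),
    which is exactly what per-day chunking cannot express."""
    m = minute_of_week(dt)
    for start, end in intervals:
        if start <= end:
            if start <= m < end:
                return True
        else:  # wraps past Sunday midnight
            if m >= start or m < end:
                return True
    return False

# Keep only trades that fall inside the chosen weekly chunks:
# trades = [t for t in trades if in_weekly_intervals(t.open_time, chunks)]
```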

 
fxsaber:

The NN has nothing at all to do with this, even hypothetically.

I did exactly the same thing as you, but through a NN. I say no to losing trades and yes to profitable ones.

After a run, almost all losses are cancelled out. "Almost" because the network can have some approximation error. Another thing is that the input is not time (though why not), but arbitrary features.

In machine learning this approach is called meta-labeling: a second model corrects the first one.

Obviously this can improve the performance of an already good TS, but you can't make candy out of g...
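The scheme described above (a second model that vetoes the primary TS's losing trades) can be sketched minimally like this; the feature layout and the toy classifier standing in for a real NN are illustrative assumptions:

```python
# Meta-labeling sketch: the primary TS produces trades; a second model
# learns which of them end up profitable and vetoes the rest.

def train_meta_model(features, profits):
    """Toy 'second model': per-feature centroid of winning vs losing
    trades (a nearest-centroid classifier standing in for a real NN)."""
    wins = [f for f, p in zip(features, profits) if p > 0]
    losses = [f for f, p in zip(features, profits) if p <= 0]
    centroid = lambda rows: [sum(col) / len(rows) for col in zip(*rows)]
    return centroid(wins), centroid(losses)

def keep_trade(feature, model):
    """Say 'yes' to trades whose features sit closer to the winning centroid."""
    win_c, loss_c = model
    dist2 = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return dist2(feature, win_c) < dist2(feature, loss_c)

# Usage: fit on history, then veto new trades the model expects to lose.
# model = train_meta_model(train_features, train_profits)
# filtered = [t for t, f in zip(trades, feats) if keep_trade(f, model)]
```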

 
Maxim Dmitrievsky:

I did exactly the same thing as you, but through a NN.

I'm afraid my task was not understood.

 
fxsaber:

I'm afraid my task was not understood.

I understand, I'm just saying it's possible to do the same thing in a different way. How to analyse the results is a separate question. I wouldn't analyse it in any way, because it's not a primary model, it's just a tweaking model.

 
Maxim Dmitrievsky:

it's not a primary model, it's just a tweak.

The task is not tuning, but finding common sense in any TS, if there is any.

 
fxsaber:

I didn't see it right away either. It turned out that cutting many intervals and presenting the result as-is is pure curve-fitting. But if you treat the many cut intervals as an intermediate stage in finding a good 2-3 intervals, things look somewhat different.

I can't see what use can be made of such raw data. Trying to cluster a few neighbouring intervals?


fxsaber:

Not for each day separately, though; instead, cut pieces out of the week as a whole. Right now the chunks are cut out of individual days, but they should be cut out of the week, i.e. out of an arbitrary cyclic time interval.

That's exactly what I had in mind.

 
Andrey Khatimlianskii:

I can't see what use can be made of such raw data. Trying to group several neighbouring intervals?

Right now the optimisation criterion would be the sum of profits over the 20 intervals. But I need to select, out of those 20, the intervals where the probability of curve-fitting is lower, and take only those into account in the criterion.
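One way to read this criterion in code: compute per-interval profit, then sum only over intervals judged less likely to be fitted. How to judge that is the open question of the thread; here it is crudely proxied by a minimum trade count, which is my assumption, not Andrey's proposal:

```python
def interval_criterion(trades, intervals, min_trades=100):
    """trades: list of (minute_of_week, profit) pairs.
    intervals: list of (start, end) on the weekly axis (no wrap, for brevity).
    Sum profit only over intervals with enough trades to trust the statistics;
    the min_trades threshold is a placeholder 'low fitting probability' proxy."""
    total = 0.0
    for start, end in intervals:
        in_iv = [p for m, p in trades if start <= m < end]
        if len(in_iv) >= min_trades:
            total += sum(in_iv)
    return total
```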

 
fxsaber:

And I need to select, out of those 20, the intervals where the probability of curve-fitting is lower.

How would that be possible? A built-in walk-forward?

 
Andrey Khatimlianskii:

How would that be possible? A built-in walk-forward?

If, for example, I see 500 trades in the 12:13-14:05 interval with a profit factor (PF) of 2.4, I tend to view the situation as at least interesting.
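For reference, the profit factor mentioned here is gross profit divided by gross loss; a quick sketch (the sample trade list is invented):

```python
def profit_factor(profits):
    """PF = gross profit / |gross loss| over a list of trade results."""
    gross_profit = sum(p for p in profits if p > 0)
    gross_loss = -sum(p for p in profits if p < 0)
    if gross_loss == 0:
        return float("inf") if gross_profit > 0 else 0.0
    return gross_profit / gross_loss
```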

 
fxsaber:

If, for example, I see 500 trades in the interval 12:13-14:05 with PF 2.4, I tend to consider the situation at least interesting.

If this interval stands out from the rest, it will be found even when cutting out 2 segments (not 20).

If not, why should it be considered separately?