Machine learning in trading: theory, models, practice and algo-trading - page 266

 
mytarmailS:


Where did you get the candlesticks package? I don't see it on CRAN or in RStudio.
 
mytarmailS:

When differencing, the shift happens automatically, since the series becomes one element shorter; all that is then needed is to shorten the sample (the table of observations) by its last element.

Here's an example:

SomeData <- c(10,20,30,20,10,20,30,40,50,40)

Y <- diff(SomeData)

cbind.data.frame(  Y , SomeData[-length(SomeData)])


I get:

    Y SomeData[-length(SomeData)]
1  10                          10
2  10                          20
3 -10                          30
4 -10                          20
5  10                          10
6  10                          20
7  10                          30
8  10                          40
9 -10                          50

Vladimir Perervenko:

Incorrect. It should be like this:

> SomeData <- c(10,20,30,20,10,20,30,40,50,40)
>

> Y <- diff(SomeData)
>

> Y
[1]  10  10 -10 -10  10  10  10  10 -10
> require(magrittr)
Loading required package: magrittr
> Y <- diff(SomeData) %>% c(., NA)
> dt <- cbind(SomeData, Y) %>% na.omit()
> dt
      SomeData   Y
[1,]       10  10
[2,]       20  10
[3,]       30 -10
[4,]       20 -10
[5,]       10  10
[6,]       20  10
[7,]       30  10
[8,]       40  10
[9,]       50 -10
attr(,"na.action")
[1] 10
attr(,"class")
[1] "omit"
> Y
[1]  10  10 -10 -10  10  10  10  10 -10  NA

The target has now been shifted forward by 1 bar.

 
SanSanych Fomenko:

It is not the predictors that should be shifted to the left, but the target.

Let me try to explain again.

I don't know, I still don't understand the problem; maybe my brain has overheated, but I did as you said. I got:

Confusion Matrix and Statistics

          Reference
Prediction    0    1
         0 1862  487
         1  487 2164
                                          
               Accuracy : 0.8052          
                 95% CI : (0.7939, 0.8161)
    No Information Rate : 0.5302          
    P-Value [Acc > NIR] : <2e-16          
                                          
                  Kappa : 0.609          
Mcnemar's Test P-Value : 1              
                                          
            Sensitivity : 0.7927          
            Specificity : 0.8163          
         Pos Pred Value : 0.7927          
         Neg Pred Value : 0.8163          
             Prevalence : 0.4698          
         Detection Rate : 0.3724          
   Detection Prevalence : 0.4698          
      Balanced Accuracy : 0.8045  

So where is the mistake?

Maybe I did something wrong again; the result looks too optimistic.
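
For reference, the headline statistics in that output follow directly from the four counts in the table, so the printout itself is internally consistent. A quick arithmetic check in R (the variable names are just for this check; the counts are the ones shown above):

# counts from the confusion matrix (rows = Prediction, columns = Reference)
p0_r0 <- 1862; p0_r1 <- 487
p1_r0 <- 487;  p1_r1 <- 2164
n <- p0_r0 + p0_r1 + p1_r0 + p1_r1       # 5000 observations

(p0_r0 + p1_r1) / n                      # Accuracy              = 0.8052
max(p0_r0 + p1_r0, p0_r1 + p1_r1) / n    # No Information Rate   = 0.5302
p0_r0 / (p0_r0 + p1_r0)                  # Sensitivity (class 0) = 0.7927
p1_r1 / (p0_r1 + p1_r1)                  # Specificity           = 0.8163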

 
SanSanych Fomenko:
Where did you get the candlesticks package? I don't see it on CRAN or in RStudio.

There's a lot missing on CRAN, unfortunately...

install.packages("candlesticks", repos="http://R-Forge.R-project.org")
 
Vladimir Perervenko:

Incorrect. It should be like this:

The target has now been shifted forward by 1 bar.

Well, if instead of adding NA at the end of "Y" and then deleting that same NA, I simply delete the last element of SomeData, won't the result be the same?

I really don't understand the difference; maybe I have completely overheated (
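
For what it's worth, the two constructions can be compared directly; on this toy series they produce the same 9 x 2 matrix, apart from the na.action attribute that na.omit() leaves behind. A quick check, using the same names as above:

SomeData <- c(10, 20, 30, 20, 10, 20, 30, 40, 50, 40)

library(magrittr)
# variant 1: pad the target with NA, then drop the incomplete last row
Y   <- diff(SomeData) %>% c(., NA)
dt1 <- cbind(SomeData, Y) %>% na.omit()

# variant 2: simply drop the last element of SomeData
dt2 <- cbind(SomeData = head(SomeData, -1), Y = diff(SomeData))

all.equal(dt1, dt2, check.attributes = FALSE)   # TRUE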

 
mytarmailS:

I don't know, I still don't understand the problem; maybe my brain has overheated, but I did as you said. I got:

I have not run it myself, since I do not have the package.

And the result is very decent and also looks quite believable. People are struggling to get close to 70% (a 30% error), and this is clearly below 30%. And straight out of the box, on an "as is" basis.

 
SanSanych Fomenko:

I have not run it myself, since I do not have the package.

And the result is very decent and also looks quite believable. People are struggling to get close to 70% (a 30% error), and this is clearly below 30%. And straight out of the box, on an "as is" basis.

I don't know... I stopped believing in miracles a long time ago... I think it's a glitch again, which is why I want someone to double-check it.
 
mytarmailS:

There's a lot missing on CRAN, unfortunately...

install.packages("candlesticks", repos="http://R-Forge.R-project.org")


Thanks, everything downloaded fine.

A completely new idea for forming predictors. I will work on it. What interests me most is the question of the predictive power of each of the predictors. I will post the results as soon as I have calculated them; if the predictive power turns out good enough, I will post it.

If you don't mind, attach the .RData
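
On the question of the predictive power of individual predictors: one quick, model-free way to rank predictors against a given target is caret::filterVarImp, which for a two-class factor target scores each predictor by the area under its ROC curve. A minimal sketch on made-up data (the predictors and target here are placeholders, not the dataset discussed in this thread):

library(caret)

set.seed(42)
n <- 500
predictors <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n))
# toy target that depends on x1 only, so x1 should rank highest
target <- factor(ifelse(predictors$x1 + rnorm(n, sd = 0.5) > 0, "up", "down"))

# per-predictor AUC against the target, one row per predictor
filterVarImp(x = predictors, y = target)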
 
mytarmailS:
I don't know... I stopped believing in miracles a long time ago... I think it's a glitch again, which is why I want someone to double-check it.

I get below a 25% error by pre-cleaning the predictors, so that looks about right. Send me the .RData and I will run the numbers. But the main thing is the predictive ability of the predictors with respect to the specific target variable.
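
"Pre-cleaning the predictors" can mean many things; a common minimal version (not necessarily what is meant here) is dropping near-constant and highly correlated columns before training, for example with caret:

library(caret)

# x is assumed to be a data frame of numeric predictors (hypothetical helper)
clean_predictors <- function(x, cor_cutoff = 0.9) {
  nzv <- nearZeroVar(x)                                     # near-zero-variance columns
  if (length(nzv) > 0) x <- x[, -nzv, drop = FALSE]
  high_cor <- findCorrelation(cor(x), cutoff = cor_cutoff)  # one of each highly correlated pair
  if (length(high_cor) > 0) x <- x[, -high_cor, drop = FALSE]
  x
}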
 
SanSanych Fomenko:
If you don't mind, attach the .RData

I don't mind, but I don't know how; I just can't do it no matter how many times I've tried. You try it your own way; you know how I made the target, and tell me how you do it.