Machine learning in trading: theory, models, practice and algo-trading - page 2323

 
elibrarius:

There's a great site that archives almost the entire internet.

super!!! thanks

 

In my opinion, there is little point in predicting a series 100-200-500 days ahead. Too much can happen in that time to drastically change the series and the forces driving its movement.

Can you repeat this code (preferably from article 3, it's more stable) for M1-M15, forecasting 1-10 hours ahead? Even 1 hour is enough to beat the spread and make a profit.
If it works, the method can be put to use.

 
sibirqk:

As I understand it, the authors do not reveal much of the algorithm itself, limiting themselves to maxims like:

Therefore, the GenericPred method uses two basic rules:

R1: Always endeavour to keep the value of a nonlinear measure as stable as possible during prediction (Fig. 3).

R2: The new value must be chosen from a set of potential values generated from a probability distribution.

The prediction has to be pursued one step at a time because the predicted value in the current step is needed for determining the valid range of change for the next step.
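The two rules quoted above can be sketched in a few lines of R. Everything here is made up for illustration: the paper does not reveal its actual nonlinear measure, so the standard deviation of log-returns stands in for it, and the series is synthetic.

```r
# A single prediction step combining R1 and R2, using the standard
# deviation of log-returns as a stand-in "nonlinear measure" (the paper
# does not disclose the real one; series and metric here are synthetic).
set.seed(1)
series <- 100 + cumsum(rnorm(200, sd = 0.5))   # synthetic price-like series

measure   <- function(x) sd(diff(log(x)))      # hypothetical stability metric
reference <- measure(series)

# R2: candidate values drawn from a probability distribution -- here the
# empirical distribution of past log-returns
returns    <- diff(log(series))
last_value <- tail(series, 1)
candidates <- last_value * exp(sample(returns, 100, replace = TRUE))

# R1: keep the measure as stable as possible -- pick the candidate that
# changes it the least when appended to the series
deviation  <- sapply(candidates, function(x) abs(measure(c(series, x)) - reference))
prediction <- candidates[which.min(deviation)]
```

Repeating this step, appending each prediction before computing the next, gives exactly the one-step-at-a-time procedure the quote describes.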


As far as I can guess, first some logistic or linear component is fitted, and then a nonlinear component is simulated at each step, the main criterion being the stability of some set of stochastic characteristics of the series. In general it is vague, but the result is impressive.

In my opinion the approach is somewhat similar to the one used in the "prophet" package in R.

I looked a little more closely and see that I was somewhat mistaken. They build a sliding series of nonlinear metrics from the original series (they mention fractal dimension and Lyapunov exponents). Based on practical observations, they treat this new series as similar to a random walk. Then they extend it into the future many times by a Monte Carlo-type method and take from the resulting set the variant closest to the original.

The secret is the specific transformation of the original series into a series of metrics and, more importantly, the inverse transformation.

In general, all this looks suspicious (first of all, the style in which the results are presented) and does not inspire much desire to study the question further.
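The pipeline described above (series → sliding nonlinear metric → Monte Carlo extension → pick the closest variant) can be sketched in base R. Since the paper's actual metric and inverse transform are not disclosed, the sd of log-returns again stands in for the fractal-dimension / Lyapunov-type metric, and the data is synthetic.

```r
# Turn a price series into a sliding series of a nonlinear metric, then
# extend that metric series by a Monte Carlo bootstrap of its increments.
# The metric here is a stand-in; the paper does not reveal the real one.
set.seed(2)
price  <- 100 + cumsum(rnorm(300, sd = 0.5))   # synthetic price-like series
window <- 50

metric_series <- sapply(window:length(price), function(i)
  sd(diff(log(price[(i - window + 1):i]))))

# Monte Carlo extension: bootstrap the metric's own increments forward,
# treating the metric series as roughly a random walk
n_paths <- 100; horizon <- 10
inc   <- diff(metric_series)
paths <- replicate(n_paths,
  tail(metric_series, 1) + cumsum(sample(inc, horizon, replace = TRUE)))

# pick the path whose level stays closest to the historical metric level
best <- paths[, which.min(abs(colMeans(paths) - mean(metric_series)))]
```

What the sketch deliberately omits is the hard part the post calls the secret: mapping the chosen metric path back into prices.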

 
Aleksey Nikolayev:

I looked a little more closely and see that I was somewhat mistaken. They build a sliding series of nonlinear metrics from the original series (they mention fractal dimension and Lyapunov exponents). Based on practical observations, they treat this new series as similar to a random walk. Then they extend it into the future many times by a Monte Carlo-type method and take from the resulting set the variant closest to the original.

The secret is the specific transformation of the original series into a series of metrics and, more importantly, the inverse transformation.

In general, all this looks suspicious (first of all, the style in which the results are presented) and does not inspire much desire to study the question further.

It sounds like nauseating stuff, to put it politely )

If you calculate the error, it will be worse than a naive forecast using the last bar's price

 
Aleksey Nikolayev:

The secret remains the specific transformation of the original series into a series of metrics and, more importantly, the inverse transformation.

There is code!

 
Aleksey Nikolayev:

Well, the DC and ECN servers don't let us in) We have to invent everything ourselves.)

Even if they let us in, it would not give much) Faster access to prices and their somewhat greater correctness would not change the essence. But the tendency is that the power of price research and of algorithms for evaluating FA (fundamental analysis) data keeps growing, the latter, unfortunately, much more slowly. Still, the trend is there))) And whoever first manages to combine the results will be on top for a while)))).

 

A 2014 article. I definitely read its Russian translation, or a summary of it in the work of some popular "trader-teacher".

I vaguely remember it boiled down to this: while there is a trend, the forecast is good, but at a trend change the forecast goes wrong. They didn't get far)

 

The topic of ML has descended to the level of DSP and ARIMA, funnily enough

although the authors of the article could not even use ARIMA or GARCH properly, so as not to end up showing a straight line

They should write more about sine waves and the like.
 
mytarmailS:

There is code!

I don't know what was wrong with the author of the article when he wrote the code in part 3, but the code was "broken" in three places, brutal....

I fixed it.

library(quantmod)
library(fractaldim)


getSymbols("SPY", src = "yahoo", from = "2019-01-01")


N <- 10  # predict the last N bars

mainData <- SPY$SPY.Close

colnames(mainData) <- c("data")
endingIndex <- length(mainData$data) - (N + 1)

TEST <- mainData[1:endingIndex]

# fractal dimension estimation settings;
# see the fractaldim package reference for more info
method <- "rodogram"
Sm <- as.data.frame(TEST, row.names = NULL)
delta <- c()

# relative changes (returns) between consecutive Sm values,
# used as the set of candidate guesses for each next step
for(j in 2:length(Sm$data)){
  delta <- c(delta, (Sm$data[j]-Sm$data[j-1])/Sm$data[j-1])
}

Sm_guesses <- delta

# predict the next N values of Sm one step at a time
for(i in 1:N){

  # update the fractal dimension used as reference
  V_Reference <- fd.estimate(Sm$data, method=method)$fd
  minDifference <- Inf
  last_value <- Sm$data[length(Sm$data)]

  # check the fractal dimension of Sm extended by each guess and
  # keep the candidate whose dimension differs least from the reference
  for(j in 1:length(Sm_guesses)){

    candidate <- last_value * (1 + Sm_guesses[j])
    new_Sm <- rbind(Sm, candidate)
    new_fd <- fd.estimate(new_Sm$data, method=method)$fd

    if (abs(new_fd - V_Reference) < minDifference){
        Sm_prediction <- candidate
        minDifference <- abs(new_fd - V_Reference)
    }
  }

  print(i)
  # append the chosen prediction to Sm
  Sm <- rbind(Sm, Sm_prediction)
}

id <- endingIndex:(endingIndex + N)

pred <- Sm$data[id]
real <- as.data.frame(mainData$data[id], row.names = NULL)

plot(pred, type="l", col=2, ylim=range(pred, real), lty=3)
lines(real, lwd=2)

Play around with it.

 
Maxim Dmitrievsky:

The topic of ML has descended to the level of DSP ......

When you don't know much, everything looks like a miracle...