Machine learning in trading: theory, models, practice and algo-trading - page 488

 
Maxim Dmitrievsky:

Well, there is still the question of the right features and targets. It would seem nothing could be simpler than the multiplication table, and yet the error is not small.

I can't be sure that you are training it correctly, and that's why there are errors.

I will never be able to verify the correctness of your training, hence the errors.
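
To make the multiplication-table remark concrete, here is a minimal sketch (my own illustration, assuming the randomForest package): even on a fully deterministic target, the fit error does not vanish.

library(randomForest)

# the whole multiplication table as training data
grid <- expand.grid(a = 1:9, b = 1:9)
grid$y <- grid$a * grid$b

rf <- randomForest(y ~ a + b, data = grid)

# mean absolute error of the fitted forest - noticeably above zero
mean(abs(predict(rf, grid) - grid$y))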
 
Andrey Kisselyov:
I can't be sure that you are training it correctly, and that's why there are errors.

I will never be able to verify the correctness of your training, hence the errors.

Well, yes, considering that RF is not able to extrapolate at all.
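
For reference, a minimal sketch of that point (illustrative, assuming the randomForest package): a regression forest averages training targets in its leaves, so its predictions can never leave the range of y it was trained on.

library(randomForest)

x <- seq(0, 10, by = 0.1)
y <- 2 * x                         # simple linear target, max(y) = 20
rf <- randomForest(data.frame(x = x), y)

# far outside the training range the forest flatlines:
predict(rf, data.frame(x = 20))    # ~20 instead of the true 40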

 
Vizard_:

can...


(It's written everywhere that, like, it can't.)

 
Vizard_:

You also wrote a rattle))). But you decided to make it produce something else.
You want to tune it.
x = 1 0 1 0 1 0 1 0 1 0
target = 1 0 1 0 1 0 1 0 1 0 1 0
then -
x = 1 0 1 0 1 0 1 0 1 1
target = 1 0 1 0 1 0 1 0 1 0 1 0
etc...
In short, look at it on an interpretable example (a sketch follows below): accuracy, logloss, kappa... and so on, whatever you like. And, as was rightly written earlier,
there's a lot to see in the forest...
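
A minimal sketch of how such a check might look (my reading of the example above, assuming the randomForest and caret packages; the data are illustrative): the model has to learn "target = next value of x", and plain metrics make any failure immediately visible.

library(randomForest)
library(caret)

x      <- rep(c(1, 0), 50)        # 1 0 1 0 ...
target <- factor(c(x[-1], 1))     # the same series shifted one step ahead
rf     <- randomForest(data.frame(x = x), target)

pred <- predict(rf, data.frame(x = x))
confusionMatrix(pred, target)     # reports Accuracy and Kappa; both are 1 here

On a pattern this trivial, anything short of perfect scores would point to a problem in the pipeline rather than in the data.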


All right, if that's the case, I'll go finish the strategy now, and then we'll see what's what :)

 

Greetings, neuronists! Great minds ))

Here's a movie about a neuronist who created a super-predictive program and "helped" a bank get "rich".



 
Alexander Ivanov:

Greetings, neuronists! Great minds ))

Here's a movie about a neuronist who created a super-predictive program and "helped" a bank get "rich".




you should watch "The Texas Chainsaw Massacre", it's a new movie, it's relaxing.

 

I can't help thinking that a number of problems are common to both classification and regression models.


One such problem is multicollinearity, which is usually interpreted as a correlation between input variables, but this may not be entirely true.


Multicollinearity in common parlance leads to extremely unpleasant consequences that negate our modeling efforts:

  • the model parameters become indeterminate
  • the standard errors of the estimates become infinitely large


If multicollinearity is understood as a linear relationship between input variables (explanatory variables, predictors), then we have the following picture:

  • Although the OLS estimates are still unbiased, they have large variances and covariances, which makes precise estimation difficult
  • As a result, the confidence intervals tend to be wider, so we may fail to reject the "null hypothesis" (i.e., that the true population coefficient is zero)
  • Because of the preceding, the t-ratios of one or more coefficients tend to be statistically insignificant
  • Even though some regression coefficients are statistically insignificant, the R^2 value can be very high
  • The OLS estimators and their standard errors can be sensitive to small changes in the data


Here is an article that provides R tools to recognize the presence of multicollinearity.

Multicollinearity in R - Bidyut Ghosh, www.r-bloggers.com
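
For illustration, a minimal sketch of one standard R diagnostic, the variance inflation factor from the car package (the data below are made up; the linked article may use other tools as well):

library(car)

set.seed(1)
n  <- 200
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.05)    # almost a copy of x1: near-perfect collinearity
x3 <- rnorm(n)
y  <- 1 + 2 * x1 - x3 + rnorm(n)

fit <- lm(y ~ x1 + x2 + x3)
vif(fit)                          # VIF far above 10 for x1 and x2 flags the problem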
 
SanSanych Fomenko:

I can't help thinking that a number of problems are common to both classification and regression models.


One such problem is multicollinearity, which is usually interpreted as a correlation between input variables, but this may not be entirely true.


Multicollinearity in common parlance leads to extremely unpleasant consequences that negate our modeling efforts:

  • the model parameters become indeterminate
  • the standard errors of the estimates become infinitely large


If multicollinearity is understood as a linear relationship between input variables (explanatory variables, predictors), then we have the following picture:

  • Although the OLS estimates are still unbiased, they have large variances and covariances, which makes precise estimation difficult
  • As a result, the confidence intervals tend to be wider, so we may fail to reject the "null hypothesis" (i.e., that the true population coefficient is zero)
  • Because of the preceding, the t-ratios of one or more coefficients tend to be statistically insignificant
  • Even though some regression coefficients are statistically insignificant, the R^2 value can be very high
  • The OLS estimators and their standard errors can be sensitive to small changes in the data


Here is an article that provides R tools to recognize the presence of multicollinearity.


thanks for the new word, that's already a couple of new terms for today :)

what other problems are there?

 

Today I decided to check my perceptron-based network. Optimized on data up to May/early June 2016, EURUSD, spread 15 pips.

Everything after that is the out-of-sample tail.

I am still confused by the result.

 
forexman77:

Today I decided to check my perceptron-based network. Optimized on data up to May/early June 2016, EURUSD, spread 15 pips.

Everything after that is the out-of-sample tail.

I am still confused by the result.

I'm stunned too, even in a kind of shock. I tried it on random samples and the results are amazing. I haven't made a trading system out of it yet.

Maxim says it takes a long time to train. Mine takes about 23 hours. But even if I only do it once every 3 months, that's no big deal).

The difference being that it would have to last me 3 months.
