Machine learning in trading: theory, models, practice and algo-trading - page 2379
Install correctly
Thank you, I will try it that way. It's just not clear why the packages aren't installed automatically on demand, since R knows which package each function comes from - a mystery.
The same function (or rather, the same name) can exist in many packages. Try loading the dplyr package, for example - you will see a lot of conflicts in function names.
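The same masking problem exists outside R as well. As a rough illustration (in Python rather than R, purely as an analogy), here is why loading a package automatically by function name alone would be ambiguous:

```python
# Sketch: the same function name can live in several modules, so a loader
# cannot know which one you meant. Analogous to R's dplyr masking
# stats::filter when the package is attached.
import math
import numpy as np

# Both modules define a function called "sqrt":
print(math.sqrt(2))   # scalar version from the standard library
print(np.sqrt(2))     # vectorized version from NumPy

# With `from module import *`, whichever import comes last would silently
# shadow the other - which is why explicit, qualified names
# (math.sqrt / np.sqrt, or dplyr::filter in R) are the safe habit.
```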
Take the mlpack package. It has almost everything you need. It's a very good library.
Good luck.
I tried your method and it does not work:
Are you training a regression on ones and zeros?
As far as I understand, there is an attempt to transfer the idea of lasso regression to the classification problem in the most thoughtless way possible.)
Ideally, you need to learn how to add different penalties (and figure out which kind) to the objective function already used in the classification problem, and see how the results change. Otherwise we get something strange: we train one model, but select features for it with a completely different one, just because a ready-made package already exists in R.)
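The point about putting the penalty into the classifier's own objective can be sketched like this (in Python with scikit-learn rather than R, on synthetic data, purely as an illustration):

```python
# Sketch: instead of selecting features with a separate lasso *regression*
# and then training a different classifier on them, put the L1 penalty
# directly into the classifier's own loss. Assumes scikit-learn;
# the data below is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                # 20 candidate features
y = (X[:, 0] - 2 * X[:, 1] > 0).astype(int)   # only features 0 and 1 matter

# L1-penalized logistic regression: selection and classification happen
# in the same model, under the same objective. C controls penalty strength.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
clf.fit(X, y)

selected = np.flatnonzero(clf.coef_[0])       # non-zero coefficients survive
print("features kept:", selected)
```

The L1 term zeroes out most of the noise coefficients, so the features the model keeps are selected by the same loss it is trained on - exactly the consistency the post above is asking for.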
Well, or I've got it all wrong)
It's a paradoxical situation: even if you accidentally get it right, no one will appreciate it,
because there are no evaluation criteria.)

What does this line mean?
Create a vector with indexes from 1 to 1300 to train the model
Oh, I see, you fed in the first 200 rows, right?
But they seem to have been involved in the training.
Not the first 200, but the last "tail" -
it's the test data.
I take indexes from 1 to 1300.
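The index split being discussed can be written out like this (a sketch on synthetic data; 1300 is the number from the thread, and Python indexes from 0 where R indexes from 1):

```python
# Sketch of the split above: the first 1300 rows train the model,
# the remaining "tail" is held out as the test set. Data is a stand-in.
import numpy as np

data = np.arange(1500 * 3).reshape(1500, 3)   # stand-in for the real sample

train_idx = np.arange(1300)                   # rows 0..1299 (1..1300 in R)
test_idx = np.arange(1300, len(data))         # the last "tail"

train, test = data[train_idx], data[test_idx]
print(train.shape, test.shape)                # (1300, 3) (200, 3)

# The alternative suggested below - take everything and hold out the last
# n rows - avoids hard-coding 1300 when sample sizes differ:
n = 200
train2, test2 = data[:-n], data[-n:]
```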
Can't you just take everything and hold out the last n pieces? It's more convenient, because the number of columns here differs a lot between samples.
What do you mean?
There is a train set and there is a test set.
If all the data is defined as train, how can we test on it?
I mistakenly thought we were talking about columns.
Still, can't you do all the training on one sample file, and the test on another file?