Machine learning in trading: theory, models, practice and algo-trading - page 1830

 
Evgeny Dyuka:

The sad thing is that you keep saying "regularities, regularities", but ask you what exactly drives the market and you can't say... because your understanding of a regularity ends with the network's answer in the form of a probability.

Your whole solution is to build hundreds of networks on different data and analyze their outputs as probabilities...

And what if you tried to understand the market, to understand its laws?

Then one logical rule could describe what you now describe with hundreds of networks...


For example, here's a rule with only three elements, and it trades just as well as a hundred of your nets.



And what if I find 1000 such rules and build a sort of "Random Forest" out of them, an ensemble of rules? Will the quality improve?

And what if you tried to create an ensemble of 1000 networks with 100 neurons each? You get 100k neurons; first, you won't find a unique dataset for 100k neurons, and second, you'll wait forever for an answer....

Bottom line: my approach is "cleaner", faster and more scalable, plus it can be explained; your approach is impossible to develop (

Unfortunately.

 
mytarmailS:

Bottom line: my approach is "cleaner", faster and more scalable; your approach is impossible to develop (

To be honest, your approach is not very clear either, though maybe that's just me....

 
Evgeny Dyuka:

Here is a live example of how information can be compressed. This is not exactly what I do, but it's the concept I'm striving for.

The data is Fisher's irises.

iris[sample(100,100),] 
    Sepal.Length Sepal.Width Petal.Length Petal.Width    Species
51           7.0         3.2          4.7         1.4 versicolor
31           4.8         3.1          1.6         0.2     setosa
79           6.0         2.9          4.5         1.5 versicolor
33           5.2         4.1          1.5         0.1     setosa
73           6.3         2.5          4.9         1.5 versicolor
80           5.7         2.6          3.5         1.0 versicolor
16           5.7         4.4          1.5         0.4     setosa
74           6.1         2.8          4.7         1.2 versicolor
30           4.7         3.2          1.6         0.2     setosa
17           5.4         3.9          1.3         0.4     setosa
25           4.8         3.4          1.9         0.2     setosa
75           6.4         2.9          4.3         1.3 versicolor
57           6.3         3.3          4.7         1.6 versicolor
65           5.6         2.9          3.6         1.3 versicolor
96           5.7         3.0          4.2         1.2 versicolor
.........
......
...
..
..


We train a Random Forest for clarity, but let's pretend it's an ensemble of your neural networks.

Trained....

We get predictive rules; there are about 700 of them.

condition                                              pred        
  [1,] "X[,3]<=2.45"                                          "setosa"    
  [2,] "X[,3]>2.45 & X[,3]<=4.85"                             "versicolor"
  [3,] "X[,3]>2.45 & X[,3]<=4.85 & X[,4]<=1.6"                "versicolor"
  [4,] "X[,4]>1.6"                                            "virginica" 
  [5,] "X[,3]>2.45 & X[,3]<=4.95 & X[,4]<=1.75"               "versicolor"
  [6,] "X[,3]>4.95 & X[,4]<=1.55"                             "virginica" 
  [7,] "X[,3]>4.85 & X[,3]<=5.15 & X[,4]<=1.75 & X[,4]>1.55"  "versicolor"
  [8,] "X[,4]>1.75"                                           "virginica" 
  [9,] "X[,3]>5.15"                                           "virginica" 
 [10,] "X[,3]<=2.45"                                          "setosa"    
 [11,] "X[,3]<=4.95 & X[,3]>2.45 & X[,4]<=1.65"               "versicolor"
 [12,] "X[,4]>1.65"                                           "virginica" 
 [13,] "X[,3]>4.85 & X[,4]>1.65"                              "virginica" 
 [14,] "X[,4]>1.9"                                            "virginica" 
 [15,] "X[,3]>4.95 & X[,4]<=1.65"                             "virginica" 
 [16,] "X[,3]>4.95 & X[,4]<=1.75 & X[,4]>1.65"                "versicolor"
 [17,] "X[,3]>4.95"                                           "virginica" 
 [18,] "X[,4]<=0.8"                                           "setosa"    
 [19,] "X[,3]<=4.75 & X[,4]>0.8"                              "versicolor"
 [20,] "X[,3]>4.75 & X[,3]<=5 & X[,4]<=1.7"                   "versicolor"
 [21,] "X[,3]>5 & X[,4]<=1.55"                                "virginica" 
 [22,] "X[,3]>4.75 & X[,3]<=5.45 & X[,4]<=1.7 & X[,4]>1.55"   "versicolor"
 [23,] "X[,3]>5.45"                                           "virginica" 
 [24,] "X[,4]>1.7"                                            "virginica" 
 [25,] "X[,3]<=5.05 & X[,4]>0.8 & X[,4]<=1.75"                "versicolor"
 [26,] "X[,3]>4.95"                                           "virginica" 
 [27,] "X[,2]>2.6 & X[,3]<=5.05 & X[,3]>4.95"                 "versicolor"
 [28,] "X[,4]>1.75"                                           "virginica" 
 [29,] "X[,2]>3.1 & X[,3]<=5.05 & X[,4]>0.8"                  "versicolor"
 [30,] "X[,3]>5.05 & X[,4]<=1.55"                             "virginica" 
 [31,] "X[,2]<=2.85 & X[,3]>5.05 & X[,4]<=1.7 & X[,4]>1.55"   "versicolor"
 [32,] "X[,3]>5.05"                                           "virginica" 
 [33,] "X[,3]>5.05"                                           "virginica" 
 [34,] "X[,4]<=0.75"                                          "setosa"    
 [35,] "X[,3]<=4.95 & X[,4]>0.75 & X[,4]<=1.7"                "versicolor"
 [36,] "X[,4]>1.7"                                            "virginica" 
 [37,] "X[,2]>3.1 & X[,3]<=4.95 & X[,4]>0.75"                 "versicolor"
 [38,] "X[,3]>4.95"                                           "virginica" 
 [39,] "X[,3]<=4.95 & X[,4]>0.8 & X[,4]<=1.7"                 "versicolor"
 [40,] "X[,4]>1.7"                                            "virginica" 
 [41,] "X[,3]>4.95"                                           "virginica" 
 [42,] "X[,4]<=0.7"                                           "setosa"    
 [43,] "X[,2]<=2.25 & X[,4]<=1.25"                            "versicolor"
 [44,] "X[,2]<=2.25"                                          "versicolor"
 [45,] "X[,2]>2.25 & X[,4]>0.7 & X[,4]<=1.75"                 "versicolor"
 [46,] "X[,3]>5.3"                                            "virginica" 
 [47,] "X[,4]>1.75"                                           "virginica" 
 [48,] "X[,3]>2.45 & X[,3]<=4.95 & X[,4]<=1.75"               "versicolor"
 [49,] "X[,3]>4.95 & X[,4]<=1.55"                             "virginica" 
 [50,] "X[,3]>4.95 & X[,3]<=5.45 & X[,4]<=1.75 & X[,4]>1.55"  "versicolor"
 [51,] "X[,3]>4.95"                                           "virginica" 
 [52,] "X[,4]>1.75"                                           "virginica" 
 [53,] "X[,2]>3 & X[,3]>2.45 & X[,3]<=4.85"                   "versicolor"
 [54,] "X[,4]>1.75"                                           "virginica" 
 [55,] "X[,3]<=4.85 & X[,4]>0.8 & X[,4]<=1.65"                "versicolor"
 [56,] "X[,3]<=4.65 & X[,4]>1.65"                             "virginica" 
 [57,] "X[,4]>1.65"                                           "virginica" 
 [58,] "X[,3]<=5.3 & X[,4]<=1.75"                             "versicolor"
 [59,] "X[,2]>2.6 & X[,3]>4.85 & X[,3]<=5.3 & X[,4]<=1.75"    "versicolor"
 [60,] "X[,3]>5.3"                                            "virginica" 
 [61,] "X[,4]>1.75"                                           "virginica" 
 [62,] "X[,3]<=2.5"                                           "setosa"    
 [63,] "X[,3]>2.5 & X[,3]<=4.95 & X[,4]<=1.75"                "versicolor"
 [64,] "X[,3]>4.95 & X[,3]<=5.05 & X[,4]<=1.65"               "virginica" 
 [65,] "X[,4]<=1.75"                                          "versicolor"
 [66,] "X[,3]<=4.75 & X[,4]>1.65"                             "virginica" 
 [67,] "X[,3]>4.75 & X[,4]<=1.75 & X[,4]>1.65"                "versicolor"
 [68,] "X[,3]>5.35"                                           "virginica" 
 [69,] "X[,4]>1.75"                                           "virginica" 
 [70,] "X[,3]<=4.75 & X[,4]>0.7"                              "versicolor"
 [71,] "X[,4]>1.65"                                           "virginica" 
 [72,] "X[,3]>4.75 & X[,3]<=4.95 & X[,4]<=1.7"                "versicolor"
 [73,] "X[,2]<=2.65 & X[,3]>4.95"                             "virginica" 
 [74,] "X[,2]<=2.75 & X[,2]>2.65 & X[,4]<=1.7"                "versicolor"
 [75,] "X[,3]>4.75"                                           "virginica" 
 [76,] "X[,4]>1.7"                                            "virginica" 
 [77,] "X[,2]>3.1 & X[,3]>4.75 & X[,3]<=4.85"                 "versicolor"
 [78,] "X[,4]>1.7"                                            "virginica" 
 [79,] "X[,3]>2.45 & X[,3]<=5 & X[,4]<=1.65"                  "versicolor"
 [80,] "X[,4]<=1.65"                                          "versicolor"
 [81,] "X[,3]>5"                                              "virginica" 
 [82,] "X[,4]>1.65"                                           "virginica" 
 [83,] "X[,3]>2.45 & X[,3]<=5.05 & X[,4]<=1.75"               "versicolor"
 [84,] "X[,4]>1.75"                                           "virginica" 
 [85,] "X[,2]>3.1 & X[,3]>2.45 & X[,3]<=5.05"                 "versicolor"
 [86,] "X[,3]>5.05"                                           "virginica" 
 [87,] "X[,3]<=4.95 & X[,4]>0.8 & X[,4]<=1.65"                "versicolor"
 [88,] "X[,3]>4.95 & X[,4]<=1.55"                             "virginica" 
 [89,] "X[,3]<=5.45 & X[,4]<=1.65 & X[,4]>1.55"               "versicolor"
 [90,] "X[,3]>4.95"                                           "virginica" 
 [91,] "X[,4]>1.65"                                           "virginica" 
 [92,] "X[,4]>0.75 & X[,4]<=1.65"                             "versicolor"
 [93,] "X[,4]>1.65"                                           "virginica" 
 [94,] "X[,2]>3.1 & X[,3]<=4.85 & X[,4]>0.75"                 "versicolor"
 [95,] "X[,4]>1.65"                                           "virginica" 
 [96,] "X[,3]<=4.95 & X[,4]>0.8 & X[,4]<=1.75"                "versicolor"
 [97,] "X[,3]<=4.95 & X[,4]<=1.75 & X[,4]>1.65"               "virginica" 
 [98,] "X[,3]>4.95 & X[,4]<=1.55"                             "virginica" 
 [99,] "X[,3]>4.95 & X[,3]<=5.45 & X[,4]<=1.75 & X[,4]>1.55"  "versicolor"
..........................
..............
.......
....
..
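For reference, here's a minimal sketch of how output in this condition/pred format can be produced. The tooling is an assumption on my part: the table layout above matches the R packages randomForest and inTrees, and the seed is illustrative:

library(randomForest)  # the tree ensemble
library(inTrees)       # rule extraction and simplification

X      <- iris[, 1:4]
target <- iris[, 5]

set.seed(1)                                 # assumed seed, any will do
rf <- randomForest(X, target, ntree = 100)  # 100 trees, as discussed below

treeList   <- RF2List(rf)                        # forest -> list of trees
ruleExec   <- extractRules(treeList, X)          # one rule per root-to-leaf path
ruleMetric <- getRuleMetric(ruleExec, X, target) # adds frequency, error, predicted class

nrow(ruleMetric)                     # several hundred rules for 100 trees
ruleMetric[, c("condition", "pred")] # the condition/pred table shown above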

Now the magic: there is an algorithm that can turn these 700 rules into 7, with minimal loss of quality.

learner[,-c(1:3)]
     condition                                pred        
[1,] "X[,3]<=2.45"                            "setosa"    
[2,] "X[,3]<=4.95 & X[,3]>2.45 & X[,4]<=1.65" "versicolor"
[3,] "X[,3]>4.95 & X[,4]>1.7"                 "virginica" 
[4,] "X[,2]<=3.1 & X[,3]<=4.95 & X[,4]>1.65"  "virginica" 
[5,] "X[,3]>4.95 & X[,4]<=1.55"               "virginica" 
[6,] "X[,3]<=5.3 & X[,4]<=1.75"               "versicolor"
[7,] "X[,1]==X[,1]"                           "versicolor"

That's our entire 700-rule random forest.

Not bad, eh? :)

 
mytarmailS:

The sad thing is that you keep saying "regularities, regularities", but ask you what exactly drives the market and you can't say... because your understanding of a regularity ends with the network's answer in the form of a probability.

Your whole solution is to build hundreds of networks on different data and analyze their outputs as probabilities...

Yes, that's right - for the sake of a real practical result.

And what if you tried to understand the market, to understand its laws?

Why? Maybe it's not even possible.

Then one logical rule could describe what you now describe with hundreds of networks...

That's fantasy, head in the clouds.

Replied above, in the text...

 
Evgeny Dyuka:

Replied above, in the text...

You write:

Evgeny Dyuka:

2. In practice, using my method, you can get an answer with acceptable accuracy to only about 1% of questions. Simply put, if at every minute candle you ask the net "where will the price be in 5 minutes, up or down?", the net will answer only 1 time out of 100.

Doesn't it seem to you that the reason is that your data contains 99% garbage and 1% useful information? And don't you think that this 1% can be described by 1-3 logical rules?

 
mytarmailS:

You write:

Evgeny Dyuka:

2. In practice, using my method, you can get an answer with acceptable accuracy to only about 1% of questions. Simply put, if at every minute candle you ask the net "where will the price be in 5 minutes, up or down?", the net will answer only 1 time out of 100.

Doesn't it seem to you that the reason is that your data contains 99% garbage and 1% useful information? And don't you think that this 1% can be described by 1-3 logical rules?

Unfortunately, this is not the case.

The 99% of garbage is an intrinsic property of the object you're studying. That's how it's built; that's its nature.
We would all like to hope that there is harmony inside it, simple understandable rules that we just haven't found yet but will. There are no such rules. Philosophically, of course, they may exist (everything has a cause), but they are beyond our current and foreseeable capabilities.

This 1% is not described by simple rules either; to reach it, the network has to be trained on 500,000 examples - those are clearly not simple rules.

 
mytarmailS:

Here is a live example of how information can be compressed. This is not exactly what I do, but it's the concept I'm striving for.

The data is Fisher's irises.


We train a Random Forest for clarity, but let's pretend it's an ensemble of your neural networks.

Trained....

We get predictive rules; there are about 700 of them.

Now the magic: there is an algorithm that can turn these 700 rules into 7, with minimal loss of quality.

That's our entire 700-rule random forest.

Not bad, eh? :)

To describe 150 rows of data (the iris dataset has 150 rows), you need at most 150 rules (if every row is unique).
Where did you get 700 from?

 

To describe 150 rows of data (the iris dataset has 150 rows), you need at most 150 rules (if every row is unique).
Where did you get 700 from?

I don't know exactly how the forest package works, but with 100 trees it generates 400-700 rules; most likely every branch is counted as a rule.
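That guess is easy to check: rule extraction walks root-to-leaf paths, so the rule count is roughly the total number of leaves across the forest. A sketch, assuming the randomForest object rf from the earlier snippet:

# status == -1 marks a terminal (leaf) node in getTree() output
leaves_per_tree <- sapply(1:rf$ntree, function(k) sum(getTree(rf, k)[, "status"] == -1))
sum(leaves_per_tree)  # total leaves ~ number of extracted rules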

Evgeny Dyuka:
Unfortunately, this is not the case.

The 99% of garbage is an intrinsic property of the object you're studying. That's how it's built; that's its nature.
We would all like to hope that there is harmony inside it, simple understandable rules that we just haven't found yet but will. There are no such rules. Philosophically, of course, they may exist (everything has a cause), but they are beyond our current and foreseeable capabilities.

This 1% is not described by simple rules either; to reach it, the network has to be trained on 500,000 examples - those are clearly not simple rules.

dunghill.

 
mytarmailS:

So that's our entire 700-rule (tree) random forest.

Not bad, eh? :)

What is the principle of the leaf reduction? Grouping by similarity and then picking the best option from each group?

 
mytarmailS:

I don't know exactly how the forest package works, but with 100 trees it generates 400-700 rules; most likely every branch is counted as a rule.

I don't know...

Apparently 700 is the total for 100 trees.

If you build a single tree, you get those same 7 rules that you consider magic))

Here's what one tree gave me for the irises (accuracy 96%, i.e. 6 errors out of 150 examples):


// Generated code for a single decision tree on the iris data.
// x[0]=Sepal.Length, x[1]=Sepal.Width, x[2]=Petal.Length, x[3]=Petal.Width;
// v[0..2] = class shares (setosa, versicolor, virginica) in the leaf, s = samples in the leaf.
 if(x[3]<1.800000){
  if(x[3]<1.000000){v[0]=1.000000;v[1]=0.000000;v[2]=0.000000;s=50;}
  else{
   if(x[2]<4.700000){
    if(x[2]<4.500000){v[0]=0.000000;v[1]=1.000000;v[2]=0.000000;s=29;}
    else{v[0]=0.000000;v[1]=0.909091;v[2]=0.090909;s=11;}}
   else{v[0]=0.000000;v[1]=0.714286;v[2]=0.285714;s=14;}}}
 else{
  if(x[0]<6.300000){v[0]=0.000000;v[1]=0.090909;v[2]=0.909091;s=11;}
  else{v[0]=0.000000;v[1]=0.000000;v[2]=1.000000;s=35;}}
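For comparison, a hedged sketch of growing a comparable single tree in R (assuming the rpart package; the splits and error count may differ slightly from the generated code above):

library(rpart)

single_tree <- rpart(Species ~ ., data = iris)  # one CART tree on all 150 rows
print(single_tree)                              # a few petal-based splits

# in-sample errors (about 6 of 150 with default settings)
sum(predict(single_tree, iris, type = "class") != iris$Species)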
 
