Discussion of article "Third Generation Neural Networks: Deep Networks" - page 6

 
Lorentzos Roussos:
1, 2, 3 and 4: I believe whatever indicators and settings are passed in inherently adjust to the underlying asset.
...
Our averages will essentially be the adjustment of the RSI, regardless of settings, to that asset.
The question, in my humble opinion, is not whether indicators should be optimized, but whether the indicator is fundamentally usable.

In the above example you can grasp my point by comparing the averages for an RSI(3) versus an RSI(16).
The RSI(3) will constantly trigger our optimized levels, whereas the RSI(16) will not.

As your example - if I understand it correctly - tells me, RSI(3) is of no help, as it will not distinguish between 'good' (potential profit > ??) and 'bad' (potential profit < ??), but RSI(16) does.

But if so, there has been an optimization, as afterwards we know 16 is better than 3 - or how else do you know that?

Now, do you train the NN with RSI(3)? It will probably be deleted. Or are you trying RSI(3) (NN input 1) and RSI(16) (NN input 2), and if RSI(3) is deleted (NN input 1 set to 0, e.g.), then RSI(x) has been optimized to 16 - even if in a very simple way. Do we need an NN for this when we have the MT optimizer?

Or am I missing something in your example?

 
Carl Schreiber:

As your example - if I understand it correctly - tells me, RSI(3) is of no help, as it will not distinguish between 'good' (potential profit > ??) and 'bad' (potential profit < ??), but RSI(16) does.

But if so, there has been an optimization, as afterwards we know 16 is better than 3 - or how else do you know that?

Now, do you train the NN with RSI(3)? It will probably be deleted. Or are you trying RSI(3) (NN input 1) and RSI(16) (NN input 2), and if RSI(3) is deleted (NN input 1 set to 0, e.g.), then RSI(x) has been optimized to 16 - even if in a very simple way. Do we need an NN for this when we have the MT optimizer?

Or am I missing something in your example?

Referring to RSI(3) and RSI(16) as an example of possible fundamental utilization gaps in real time.
The ideal, in this example, would be a variable-period RSI.
 
Lorentzos Roussos:
Referring to RSI(3) and RSI(16) as an example of possible fundamental utilization gaps in real time.
The ideal, in this example, would be a variable-period RSI.

ok - so what is sent to the NN?

RSI(..) with a fixed value (how did you get it?) or with a variable value - can one optimize the calculation or not?

This all influences the danger of over-adapting - so sorry for being so nasty.
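The alternative raised earlier in the thread, supplying several RSI periods as separate network inputs and letting training decide which one survives, can be sketched in base R. The `rsi()` below is a deliberately simplified moving-average RSI (not Wilder's smoothing), run on synthetic prices, purely for illustration:

```r
# Sketch: give the network both RSI(3) and RSI(16) as inputs and let
# training decide which carries information. rsi() is a simplified
# SMA-based RSI, not Wilder's smoothing.
rsi <- function(close, n) {
  d  <- diff(close)
  up <- pmax(d, 0)                          # upward moves
  dn <- pmax(-d, 0)                         # downward moves
  ma <- function(x) stats::filter(x, rep(1 / n, n), sides = 1)
  rs <- ma(up) / ma(dn)
  c(NA, 100 - 100 / (1 + rs))               # re-align with close
}

set.seed(42)
close <- cumsum(rnorm(200)) + 100           # synthetic price series
X <- cbind(rsi3 = rsi(close, 3), rsi16 = rsi(close, 16))
X <- X[complete.cases(X), ]                 # candidate NN input matrix
```

Whether the trained net actually zeroes out the RSI(3) column is, of course, exactly the open question above; the sketch only shows how both would be presented.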

 
Is there an English version of the resources you attached?
 

Hi Vladimir, I am very impressed with your article.

I managed to get it installed and tried the various steps in R. I have some doubts about the spatialSign transformation; hopefully you can help me understand.

To learn the effect of preProcess with spatialSign, I tried the following code:

try <- cbind(X = c(1, 2, 3, 4), Y = c(10, 20, 30, 40))
predict(preProcess(try, method = "spatialSign"), try)

 I get the following results: 

              X          Y
[1,] -0.7071068 -0.7071068
[2,] -0.7071068 -0.7071068
[3,]  0.7071068  0.7071068
[4,]  0.7071068  0.7071068

I was very surprised by this result; intuitively, I would expect rows 1 and 2 not to be the same after spatialSign. I know it first centers and scales and then applies spatialSign - is the result correct?
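The result is indeed correct: because preProcess centers and scales each column first, X and Y here become identical z-score columns, so every row lies on the line y = x and collapses to (±1/√2, ±1/√2) when divided by its norm. A base-R re-implementation of the same steps (my own sketch, not caret's code):

```r
# Reproduce preProcess(method = "spatialSign") by hand:
# 1) center and scale each column, 2) divide each row by its norm.
try <- cbind(X = c(1, 2, 3, 4), Y = c(10, 20, 30, 40))

scaled <- scale(try)                  # z-scores; X and Y columns are now identical
norms  <- sqrt(rowSums(scaled^2))     # Euclidean norm of each row
ss     <- scaled / norms              # spatial sign: rows on the unit circle

round(ss, 7)                          # rows 1 and 2 both give -0.7071068
```

Rows 1 and 2 both sit on the same side of the centered data, so after projection onto the unit circle they land on the same point; the distinction between them is deliberately discarded by the transformation.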

 

Fantastic article indeed, even these days.


But I have a question: why does my Kzz equal -Inf?


sig.zz<-ifelse(tail(dt[  , ncol(dt)], 500) == 0, 1, -1)

bal.zz<-cumsum(tail(price[  , 'CO'], 500) * sig.zz)

Kzz<-mean(bal.zz / bal)

> Kzz
[1] -Inf
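Without seeing `bal` here this is a guess, but the usual cause is that `bal.zz / bal` is an element-wise division and a cumulative balance curve can pass through zero; in R a nonzero value divided by zero gives Inf or -Inf, and a single infinite element pulls mean() to infinity. A toy reproduction with made-up numbers:

```r
# Hypothetical balance curves, chosen so that bal touches zero.
bal    <- c(0.5, 0.0, -0.5)
bal.zz <- c(1.0, -1.0, 1.0)

ratio <- bal.zz / bal        # element-wise: c(2, -Inf, -2); -1 / 0 is -Inf
mean(ratio)                  # -Inf: one infinite element dominates the mean

# One guarded alternative: average only the finite ratios.
Kzz <- mean(ratio[is.finite(ratio)])
Kzz                          # 0 here
```

Checking `any(bal == 0)` (or `any(!is.finite(bal.zz / bal))`) on the real data would confirm or rule this out.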

 

Using RSI could be the wrong approach.

Maybe it is better to let it trade directly, as if it needs to learn to walk, play chess, or any other game.

 

  pr.sae<-nn.predict(SAE, x.ts)   

sig<-ifelse(pr.sae>mean(pr.sae), -1, 1)   

sig.zz<-ifelse(y.ts == 0, 1,-1 )   

bal<-cumsum(tail(price[  ,'CO'], bar) * sig)   

bal.zz<-cumsum(tail(price[  ,'CO'], bar) * sig.zz)

Sir, in the above code, when calculating bal you haven't shifted the signal backward like you did in the article DEEP NEURAL NETWORK WITH STACKED RBM.

Is there something I am missing?
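For reference, a minimal sketch of lagging a signal vector by one bar before computing the balance, so that the prediction made on bar t is only traded on bar t+1. The `sig` and `co` vectors are hypothetical, and whether this exact lag is what the Stacked RBM article applies is my reading, not a certainty:

```r
sig <- c(1, -1, -1, 1, 1)              # hypothetical predicted signals
co  <- c(0.2, -0.1, 0.3, -0.2, 0.1)    # hypothetical Close - Open returns

sig.lag <- c(NA, head(sig, -1))        # each signal acts one bar later
bal     <- cumsum(na.omit(co * sig.lag))
bal                                    # -0.1 -0.4 -0.2 -0.1
```

Without the lag, the balance curve uses the bar's own return to score a signal computed from that same bar, which overstates performance.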


 
Vladimir Perervenko:

Hi Fabio,

I'm sorry.

I do not write in MQL5.

Best regards

Vladimir


Vladimir Perervenko:

Hi Fabio,

What's not clear?

Hi Vladimir,

I don't have the vectors (o, h, l, c) in RStudio. How do I download data in R and work with it in RStudio? When I run "head(price)", this message is shown:

> head(price)

Error in head(price) : object 'price' not found

> # MT4 numbers bars from the most recent to the oldest. R, on the contrary, goes

>     # from old to new, newest bar last, so we use the "rev" function to reverse this.

>     price <- cbind(Open = rev(o), High = rev(h), Low = rev(l), Close = rev(c))

Error in rev(o) : object 'o' not found
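The o, h, l, c vectors appear to be created on the MetaTrader side and handed to R by the article's terminal-to-R bridge, so they do not exist in a standalone RStudio session. To experiment with the rest of the code, stand-in vectors can be simulated (the quotes below are synthetic placeholders, not real data):

```r
# Synthetic stand-ins for the o, h, l, c vectors normally supplied by MT4.
set.seed(1)
o <- cumsum(rnorm(10, sd = 0.001)) + 1.10   # opens of 10 bars, newest first
h <- o + 0.002                              # highs
l <- o - 0.002                              # lows
c <- o + rnorm(10, sd = 0.001)              # closes

# MT4 numbers bars newest-to-oldest; R wants oldest-to-newest, hence rev().
price <- cbind(Open = rev(o), High = rev(h), Low = rev(l), Close = rev(c))
head(price)
```

Reading an OHLC history exported from the terminal into these four vectors would serve the same purpose with real data.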

 

Hi,

Thanks for sharing the information

On this:

save(SAE, prepr, file="SAE.model")


How can we save prepr?

I'm afraid the one I'm saving is not the same as the one in your model.

Could you please elaborate a little bit more.

Thanks.
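For context: save() writes any number of named objects into one file, and load() restores them all under their original names, so SAE and prepr travel together in SAE.model. A self-contained demonstration with dummy stand-ins (the real objects come from training and from preProcess()):

```r
SAE   <- list(weights = runif(4))        # dummy stand-in for the trained net
prepr <- list(method = "spatialSign")    # dummy stand-in for the preProcess object

f <- tempfile(fileext = ".model")
save(SAE, prepr, file = f)               # both objects go into the one file

rm(SAE, prepr)                           # simulate a fresh session
load(f)                                  # restores BOTH SAE and prepr by name
exists("prepr")                          # TRUE
```

If the restored prepr differs from the author's, the likely cause is that it was rebuilt from different training data rather than lost in saving: save()/load() round-trip objects unchanged.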
