Machine learning in trading: theory, models, practice and algo-trading - page 2299

 

Funny thing: in the Statistica package, the neural network module with MLP (multilayer feedforward perceptron) networks cannot find a dependence of the form y = x1/x2.

During the search the package tried 600 networks; the best one showed less than 80% accuracy.

The number of hidden neurons ranged from 2 to 8.
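For reference, a minimal sketch of the same kind of experiment outside Statistica (using scikit-learn's MLPRegressor as a stand-in for the Statistica MLP block; the data ranges, tanh activation, and iteration budget are my own assumptions, not the package's settings):

```python
# Minimal sketch (not the Statistica setup): a single-hidden-layer MLP
# fitted to y = x1 / x2 with the hidden-layer sizes quoted in the post.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.uniform(-1.0, 1.0, n)
x2 = rng.uniform(0.5, 2.0, n)      # keep x2 away from zero so y stays bounded
X = np.column_stack([x1, x2])
y = x1 / x2

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 2..8 hidden neurons, mirroring the range quoted above
for h in (2, 4, 8):
    net = MLPRegressor(hidden_layer_sizes=(h,), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(X_train, y_train)
    print(h, "hidden neurons, test R^2 =", round(net.score(X_test, y_test), 3))
```

Note that keeping x2 bounded away from zero makes the target well behaved; if x2 is allowed to approach zero, y = x1/x2 becomes unbounded and a small single-hidden-layer net has a much harder time approximating it.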

 
denis.eremin:

Funny thing: in the Statistica package, the neural network module with MLP (multilayer feedforward perceptron) networks cannot find a dependence of the form y = x1/x2.

During the search the package tried 600 networks; the best one showed less than 80% accuracy.

The number of hidden neurons ranged from 2 to 8.

Come on, don't be absurd... how many layers?

 
mytarmailS:

Come on, don't be absurd... how many layers?

3


 
denis.eremin:

Funny thing: in the Statistica package, the neural network module with MLP (multilayer feedforward perceptron) networks cannot find a dependence of the form y = x1/x2.

During the search the package tried 600 networks; the best one showed less than 80% accuracy.

The number of hidden neurons ranged from 2 to 8.

https://www.mql5.com/ru/forum/8265/page3#comment_5706960

https://www.mql5.com/ru/forum/8265/page2#comment_333746

 
I haven't tried RBF nets - they take too long to compute.
 
denis.eremin:

3

how many hidden layers? )

 
mytarmailS:

how many hidden layers? )

There are a total of 3 layers in each network.

 
denis.eremin:

There are a total of 3 layers in each network.

So there's actually one hidden layer?

 
denis.eremin:

There are a total of 3 layers in each network.

The input, the output, and the hidden one?

 
mytarmailS:

So there's actually one hidden layer?

Yes
