Hybrid neural networks. - page 5

 
IlyaA >> :


OK. I'll wait and see :) This is very reminiscent of beam search...

No, it's not beam search. And, strictly speaking, not a genetic algorithm in the classical sense. I described only the main, so to speak, backbone of my algorithm. But it is very similar to a GA.
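For context, the classical GA "backbone" that such a search is usually compared against can be sketched in a few lines. This is illustrative only; none of the names here come from the poster's actual algorithm:

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=50):
    """Classical GA skeleton: selection, crossover, mutation, elitism."""
    # random initial population of bit strings
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(genome_len)       # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children                 # elitism: best half kept
    return max(pop, key=fitness)

best = evolve(sum)  # toy fitness: maximise the number of 1-bits
```

Any GA variant keeps this loop and swaps out the selection, crossover, or mutation operators.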

 
IlyaA wrote >>

Nice. Two input bars?

10

 
gumgum >> :

10


Close prices only? Wow, and it's quite good as it is, but imagine if you fed a swing into it? Again, have you checked for overtraining?
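The overtraining check asked about here is usually done with a held-out validation set: stop training when validation error stops improving. A generic early-stopping sketch, where `train_step` and `val_error` are hypothetical stand-ins for whatever network is being trained:

```python
def fit_with_early_stopping(train_step, val_error, max_epochs=100, patience=5):
    """Train until the validation error has not improved for `patience` epochs."""
    best_err, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()                      # one pass over the training set
        err = val_error()                 # error on data never trained on
        if err < best_err:
            best_err, best_epoch = err, epoch
        elif epoch - best_epoch >= patience:
            break                         # validation error rising: overtraining
    return best_epoch, best_err

# simulated validation errors: fall, then rise -- the classic overtraining curve
errors = iter([5, 4, 3, 2, 3, 4, 5, 6, 7, 8, 9, 10])
stop = fit_with_early_stopping(lambda: None, lambda: next(errors))  # (3, 2)
```

If training error keeps falling while this validation error rises, the network is memorizing rather than generalizing.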
 
IlyaA >> :
I suggest that the best filter for one-off events is to integrate them. The network will then have a better chance of isolating a recurring phenomenon.

Then describe in more detail what you wrote: what is noise integration?

If you meant literally suppressing noise by averaging, that's not a good idea. Extrapolating the time series that way is not the best use of a NN.
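For concreteness, "suppressing noise by averaging" usually means something like a simple moving average: it damps noise, but it also lags and blurs exactly the features a network might need, which is the objection here. A minimal sketch:

```python
def sma(series, period):
    """Simple moving average: one output value per full window."""
    return [sum(series[i - period + 1:i + 1]) / period
            for i in range(period - 1, len(series))]

noisy = [1.0, 3.0, 2.0, 4.0, 3.0, 5.0]
smooth = sma(noisy, 3)  # [2.0, 3.0, 3.0, 4.0] -- smoother, shorter, lagged
```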

To keep the network from chasing fuzzy patterns all over the data at once (call it the abstractionism effect), we reduce the number of neurons: we gain in generalizability and make it impossible for the perceptron to memorize too much material. We point the network at finding just one thing, but the most profitable thing, in the data.

You don't have to neuter a brain to make it smarter. It needs to be trained properly, without averaging filters in the process. And what would "neutering" even mean here? I have no idea what you're feeding into the input: maybe 20 neurons is a lot, or maybe 10,000 is not enough. In fact, you shouldn't try to force the NN to memorize one "thing" or another. A properly trained network is capable of inferring what it doesn't know from the scarce information available to it.
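Whether "20 neurons is a lot or 10,000 is not enough" can be made concrete by counting free parameters against the number of training examples. A rough sketch for a one-hidden-layer perceptron; the layer sizes below are illustrative, not taken from the thread:

```python
def mlp_params(n_inputs, n_hidden, n_outputs=1):
    """Free parameters (weights + biases) of a one-hidden-layer perceptron."""
    return (n_inputs + 1) * n_hidden + (n_hidden + 1) * n_outputs

small = mlp_params(10, 20)    # 10 close prices, 20 neurons -> 241 parameters
large = mlp_params(1000, 20)  # same 20 neurons, 1000-bar window -> 20041
```

A common rule of thumb is to keep the parameter count well below the number of training examples, otherwise memorization is almost guaranteed.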

"Don't read too many books "C - can't remember who said....

 
joo >> :

No, it's not beam search. And, strictly speaking, not a genetic algorithm in the classical sense. I described only the main, so to speak, backbone of my algorithm. But it is very similar to a GA.


If you invented it yourself, then you're quite the inventor :) Have you actually tested it?
 
IlyaA wrote >>

Close prices only? Wow, and it's quite good as it is, but imagine if you fed a swing into it? Again, have you checked for overtraining?

No.

 
IlyaA >> :


If you invented it yourself, then you're quite the inventor :) Have you actually tested it?

Actually, I came up with it back in the 3rd or 4th grade. You know when you learn root extraction? Well, I was extracting square roots, cube roots... on a sheet of squared notebook paper.

Tested it. The results are really impressive.
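Paper-and-pencil root extraction is essentially an iterative guess-and-refine procedure; Newton's method is one systematic version of it. This is an illustration of that arithmetic only, not the GA-like algorithm described above:

```python
def kth_root(x, k, tol=1e-12):
    """Newton's iteration for the k-th root of a positive x."""
    guess = max(x, 1.0)
    while abs(guess ** k - x) > tol * max(1.0, x):
        # standard Newton step for f(g) = g**k - x
        guess -= (guess ** k - x) / (k * guess ** (k - 1))
    return guess

kth_root(2.0, 2)   # square root of 2, ~1.41421356
kth_root(27.0, 3)  # cube root of 27, ~3.0
```

Each step refines the current guess exactly the way a by-hand calculation refines digits: propose, check the error, correct.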

 

I train it at the indicator's initialisation stage. And then it thinks for itself...

 
gumgum >> :

I train it at the indicator's initialisation stage. And then it thinks for itself...

Try writing an Expert Advisor based on this indicator. I think the result will surprise you. Unpleasantly...

 
joo >> :

Actually, I came up with it back in the 3rd or 4th grade. You know when you learn root extraction? Well, I was extracting square roots, cube roots... on a sheet of squared notebook paper.

Tested it. The results are really impressive.


So we'll take it into development. Report back on the noise.