Hybrid neural networks.

 

I don't want this thread to become just another statistic among the neural network topics.

I propose we share experience and problems in building and training non-standard neural network architectures.

Here's a first link on the theory:

http://cgm.computergraphics.ru/content/view/62

or the PDF file:

Files:
num6neiro.zip  636 kb
 
sergeev >> :

I don't want this thread to become just another statistic among the neural network topics.

I propose we share experience and problems in building and training non-standard neural network architectures.

Here's a first link on the theory:

http://cgm.computergraphics.ru/content/view/62

or the PDF file


In neural networks I grew up on Haykin. I'll say right off the bat that the book is not easy: you need to be very good at maths and know it inside out.

I'd also add that even this book, from a reputable publisher, contains quite a few errors. Whether they got there by accident or not is another question.

Neural networks are best read about in the original sources. And that means English...

 
Something about the title doesn't match the content. The link describes several canonical (standard) network metaphors, with only a single paragraph about hybrid ones saying that everything described can be combined. In other words, there is nothing hybrid or non-standard there. So what is the point of the new topic?
 

It's strange: either almost nobody is interested, or everyone is stewing in what they already have and nobody wants to share their work.

more likely the second)

 
There are attempts to use hybrid networks based on fuzzy logic. They are still being refined.
 
dentraf wrote >>
There are attempts to use hybrid networks based on fuzzy logic. They are still being refined.

Are you trying to improve on ANFIS from Matlab or fuzzyTech? :)

 
SergNF wrote >>

Are you trying to improve on ANFIS from Matlab or fuzzyTech? :)

No, in MQL.
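For readers who haven't seen what a fuzzy-logic hybrid actually computes, here is a minimal sketch of the ANFIS-style building block: Gaussian membership functions combined into first-order Sugeno rules, whose consequent coefficients play the role of trainable weights. This is a generic C++ illustration under my own assumptions, not anyone's MQL code; all names and constants are invented for the example.

#include <cmath>
#include <cstdio>
#include <vector>

// Gaussian membership: how strongly x belongs to a fuzzy set
// centred at c with width sigma.
static double gauss_mf(double x, double c, double sigma) {
    double d = (x - c) / sigma;
    return std::exp(-0.5 * d * d);
}

// One first-order Sugeno rule for two inputs:
// IF x1 is A AND x2 is B THEN y = p*x1 + q*x2 + r
struct SugenoRule {
    double c1, s1;   // membership of x1
    double c2, s2;   // membership of x2
    double p, q, r;  // consequent coefficients (the "weights")

    double firing(double x1, double x2) const {
        return gauss_mf(x1, c1, s1) * gauss_mf(x2, c2, s2);  // AND = product
    }
    double consequent(double x1, double x2) const {
        return p * x1 + q * x2 + r;
    }
};

// Network output = firing-strength-weighted average of rule consequents,
// i.e. the forward pass of an ANFIS-like hybrid.
static double infer(const std::vector<SugenoRule>& rules, double x1, double x2) {
    double num = 0.0, den = 1e-12;
    for (const auto& rl : rules) {
        double w = rl.firing(x1, x2);
        num += w * rl.consequent(x1, x2);
        den += w;
    }
    return num / den;
}

int main() {
    // Four hand-set rules that roughly reproduce XOR on {0,1}^2.
    std::vector<SugenoRule> rules = {
        {0.0, 0.3, 1.0, 0.3, 0.0, 0.0, 1.0},  // x1 low,  x2 high -> 1
        {1.0, 0.3, 0.0, 0.3, 0.0, 0.0, 1.0},  // x1 high, x2 low  -> 1
        {0.0, 0.3, 0.0, 0.3, 0.0, 0.0, 0.0},  // both low         -> 0
        {1.0, 0.3, 1.0, 0.3, 0.0, 0.0, 0.0},  // both high        -> 0
    };
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::printf("%d xor %d ~ %.3f\n", a, b, infer(rules, a, b));
    return 0;
}

Training such a hybrid means tuning the membership centres/widths and the consequent coefficients, for example by gradient descent or a genetic algorithm, just like ordinary network weights.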

 
sergeev >> :

I propose we share experience and problems in building and training non-standard neural network architectures...

Isn't it time to start sharing experiences?

 
Gentlemen, I am training my perceptrons with the backpropagation algorithm. It works, but the probability of finding the global extremum is 50-70% (with 100 neurons). Recently I finished writing a genetic algorithm for XOR - and was happy. But when a medium-sized perceptron started multiplying and mating, I realised that without parallel computation I'd be waiting a month! Who has overcome this limitation?
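For concreteness, here is a compact sketch of the kind of setup described above: a genetic algorithm evolving the nine weights of a 2-2-1 sigmoid network on XOR. It is a generic illustration, not the poster's code; the population size, mutation width and stopping threshold are arbitrary choices.

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

// 2-2-1 sigmoid network; the 9 weights are packed into one genome:
// [w11, w12, w21, w22, b1, b2, v1, v2, b_out]
static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

static double forward(const std::vector<double>& g, double x1, double x2) {
    double h1 = sigmoid(g[0] * x1 + g[1] * x2 + g[4]);
    double h2 = sigmoid(g[2] * x1 + g[3] * x2 + g[5]);
    return sigmoid(g[6] * h1 + g[7] * h2 + g[8]);
}

// Fitness = -SSE over the four XOR patterns (higher is better).
static double fitness(const std::vector<double>& g) {
    static const double X[4][3] = {{0,0,0},{0,1,1},{1,0,1},{1,1,0}};
    double sse = 0.0;
    for (const auto& p : X) {
        double e = forward(g, p[0], p[1]) - p[2];
        sse += e * e;
    }
    return -sse;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> init(-2.0, 2.0);
    std::normal_distribution<double> mut(0.0, 0.3);
    std::uniform_int_distribution<int> pick(0, 24);

    // Random initial population of 50 genomes.
    std::vector<std::vector<double>> pop(50, std::vector<double>(9));
    for (auto& g : pop) for (auto& w : g) w = init(rng);

    for (int gen = 0; gen < 2000; ++gen) {
        // Sort best-first, keep the top half, refill by crossover + mutation.
        std::sort(pop.begin(), pop.end(),
                  [](const std::vector<double>& a, const std::vector<double>& b) {
                      return fitness(a) > fitness(b);
                  });
        if (-fitness(pop[0]) < 0.01) break;  // close enough for the demo
        for (size_t i = 25; i < pop.size(); ++i) {
            const std::vector<double>& pa = pop[pick(rng)];
            const std::vector<double>& pb = pop[pick(rng)];
            for (size_t k = 0; k < 9; ++k)
                pop[i][k] = ((rng() & 1u) ? pa[k] : pb[k]) + mut(rng);
        }
    }
    std::printf("best SSE = %.5f\n", -fitness(pop[0]));
    for (int a = 0; a <= 1; ++a)
        for (int b = 0; b <= 1; ++b)
            std::printf("%d xor %d -> %.3f\n", a, b, forward(pop[0], a, b));
    return 0;
}

The expensive part is the repeated fitness evaluation, which is exactly what blows up once the network and the data set grow.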
 
IlyaA >> :
Gentlemen, I am training my perceptrons with the backpropagation algorithm. It works, but the probability of finding the global extremum is 50-70% (with 100 neurons). Recently I finished writing a genetic algorithm for XOR - and was happy. But when a medium-sized perceptron started multiplying and mating, I realised that without parallel computation I'd be waiting a month! Who has overcome this limitation?

Which limitation exactly are you talking about?

 
joo >> :

Which limitation exactly are you talking about?

Time limit.
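Since the individuals of a generation are evaluated independently, the usual way around the time limit is to run the fitness evaluations in parallel. A minimal sketch in standard C++ with std::async (nothing MQL-specific; evaluate() here is a dummy stand-in for whatever fitness function the GA actually uses):

#include <cstdio>
#include <future>
#include <vector>

// Dummy stand-in for an expensive fitness function (e.g. running a
// network over a training set or a price history).
static double evaluate(const std::vector<double>& genome) {
    double s = 0.0;
    for (double w : genome) s += w * w;  // placeholder workload
    return -s;
}

// Evaluate a whole population concurrently: every genome is independent,
// so the work parallelises trivially with std::async.
static std::vector<double> evaluate_population(
        const std::vector<std::vector<double>>& pop) {
    std::vector<std::future<double>> jobs;
    jobs.reserve(pop.size());
    for (const auto& g : pop)
        jobs.push_back(std::async(std::launch::async, evaluate, std::cref(g)));
    std::vector<double> fit;
    fit.reserve(pop.size());
    for (auto& j : jobs) fit.push_back(j.get());
    return fit;
}

int main() {
    std::vector<std::vector<double>> pop(8, std::vector<double>(100, 0.5));
    std::vector<double> fit = evaluate_population(pop);
    std::printf("first fitness: %.2f\n", fit[0]);
    return 0;
}

For large populations one would hand each thread a batch of genomes or use a thread pool rather than one task per genome, but the structure stays the same.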