Discussion of article "Neural Networks: From Theory to Practice" - page 4

 
gpwr:

Network learning = fitting

Self-learning = self-fitting

In principle that is so, but self-fitting is fully automatic, while ordinary fitting requires running the strategy tester in the terminal from time to time.
 
Life is self-fitting...
 
joo:
Life is self-fitting...

And it always ends in death.
 
Reshetov:
And it always ends in death.

Genetic search is conducted by a population, not an individual, and death is only an instrument of genetic selection (population renewal).

So far, the human population is thriving, although many animal populations are dying out under the suppressive pressure of humanity.
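As a rough illustration of the population-based search being described, here is a minimal Python sketch of a genetic algorithm in which the weaker half of the population "dies" each generation and is replaced by offspring of the survivors; the bit-counting fitness function and all the constants are purely illustrative choices, not anything from the article.

import random

POP_SIZE, GENES, GENERATIONS = 20, 16, 50

def fitness(genome):                      # toy objective: count the 1-bits
    return sum(genome)

def crossover(a, b):                      # one-point crossover
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.05):            # occasional bit flips
    return [g ^ 1 if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENES)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    # "Death" here is just population renewal: the weaker half is
    # discarded each generation and replaced by offspring of survivors.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors),
                                 random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

print("best fitness:", max(fitness(g) for g in population))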

 
You're on a roll :)
 
Urain:

Genetic search is conducted by a population, not an individual, and death is only an instrument of genetic selection (population renewal).

So far, the human population is thriving, although many animal populations are dying out under the suppressive pressure of humanity.

Maybe nature is thus getting rid of what it doesn't need at this stage. It played around, realised it was useless, and is freeing up the resources. And so it will be with us if we don't fulfil its purpose. And perhaps even if we do, it will get rid of us anyway, since we were needed only to produce more perfect variants. We are like tools, functions: receivers of commands from outside. So far, the flight is normal. )))
 
tol64:
Maybe nature is thus getting rid of what it doesn't need at this stage.

Yeah. And the genetic algorithm doesn't seem to be among the things it's getting rid of. Neither do neural nets.

;)

 
Reshetov:
Yedelkin: If so, then there is no self-training of neuro-advisors to speak of, and what is called training is just ordinary parameter fitting.
Do you naively believe that self-training is some unusual kind of fitting?

Yedelkin:
I "naively believe" that among native Russian speakers it is not customary to call the process of self-learning "parameter fitting", just as it is not customary to call the selection of a system's parameters by external processes "learning".

Perhaps I also "naively believe" that if an article is written for dummies, it should explain new concepts using words in their commonly accepted meaning. And if an article for dummies draws an analogy between neural networks and "the ability of the nervous system to learn and correct errors", then that same article should spell out how neural networks are able to learn and correct errors on their own. In fact, it turned out that the article does not say a single word about neural networks acquiring and correcting information independently (i.e. without an external optimiser program), and the term "learning" was given a new, narrowly specialised meaning: the usual search for (fitting of) parameters was called learning. By the same logic, the vast majority of advisors at the championship could be classed as "trained", since it is unlikely that anyone got through the tests without using the optimiser. [This paragraph is not a stone thrown at the author of the article; it is just a clarification of my answer to Reshetov.]
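For what it's worth, the narrowly specialised sense of "learning" under discussion is easy to make concrete: in the classic perceptron rule, the network adjusts its own weights from its own prediction errors, so "learning" and "parameter fitting" are literally the same operation. A minimal Python sketch, with an illustrative AND-gate data set (the learning rate and epoch count are arbitrary choices):

# A perceptron "learning" the AND function: the weight updates are driven
# entirely by the network's own errors -- no external optimiser involved.

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                          # training epochs
    for (x1, x2), target in data:
        out = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
        err = target - out                   # the error the network corrects
        w[0] += lr * err * x1                # fitting = nudging the weights
        w[1] += lr * err * x2
        b    += lr * err

print("fitted weights:", w, "bias:", b)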

 

yu-sha:

Any mathematical model of a real process is, in one sense or another, a fitting.

A stone thrown from the Leaning Tower of Pisa will not fly strictly in a parabola.

And the trajectory of a spaceship is not ideal from the point of view of mathematical calculations.

And yet they fly!
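The "not strictly a parabola" point can be checked numerically: integrating the same throw with and without air drag gives noticeably different ranges. A rough Python sketch; the drag coefficient is an assumed illustrative value, and 56 m is the approximate height of the tower.

# Same stone, integrated with and without air drag (simple Euler steps).
# k is an assumed drag-per-unit-mass coefficient, chosen for illustration.

g, k, dt = 9.81, 0.05, 0.001
vx, vy, x, y = 10.0, 0.0, 0.0, 56.0          # thrown horizontally from ~56 m
x_ideal = vx * (2 * y / g) ** 0.5            # range if the path were a parabola

while y > 0:
    v = (vx * vx + vy * vy) ** 0.5
    vx -= k * v * vx * dt                    # drag slows the horizontal motion
    vy -= (g + k * v * vy) * dt              # gravity plus drag vertically
    x, y = x + vx * dt, y + vy * dt

print(f"parabola range: {x_ideal:.2f} m, with drag: {x:.2f} m")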

 
yu-sha: And yet they fly!

I don't mind: "Let them fly!" (c) The questions would not have arisen if the article had said from the very beginning what particular terms mean in their narrowly specialised sense and which specific aspect of neural networks (external parameter fitting) it is devoted to. It seems that, with everyone's help, I have managed to figure it out.