"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 83

 
ivandurak:
And also add the ability for neurons to be born and die. Just like between the ears.
Not like between the ears. An analogue of what is between the ears would feel fear and greed, and we can manage that ourselves. :)
 
joo:
Not like between the ears. An analogue of what is between the ears would feel fear and greed, and we can manage that ourselves. :)
It depends on what you teach it; apparently humans have innate, ineradicable neurons and connections, such as the self-preservation instinct. )
 
With artificial neural networks it is more or less clear: there is a CPU or GPU that adjusts the weights of the neurons. But in biological neural networks, who or what performs this weight tuning? I have not found anything, and the biologists are silent too. I interrogated the cat; it just stared ........
 
ivandurak:
With artificial neural networks it is more or less clear: there is a CPU or GPU that adjusts the weights of the neurons. But in biological neural networks, who or what performs this weight tuning? I have not found anything, and the biologists are silent too. I interrogated the cat; it just stared ........
Neuron weights are tuned by the fitness function - life.
 
ivandurak:
With artificial neural networks it is more or less clear: there is a CPU or GPU that adjusts the weights of the neurons. But in biological neural networks, who or what performs this weight tuning? I have not found anything, and the biologists are silent too. I interrogated the cat; it just stared ........
Check out the work of Sebastian Seung, Henry Markram and Kwabena Boahen. They are among the leading modern scientists in this field, and their results are very interesting.
 
joo:
Neuron weights are tuned by the fitness function - life.

No, the weights are tuned by a learning algorithm; there are many of them, and which one the head uses is not known for sure.

But it is assumed that some areas use forward propagation and others the reverse.

 
ivandurak:
With artificial neural networks it is more or less clear: there is a CPU or GPU that adjusts the weights of the neurons. But in biological neural networks, who or what performs this weight tuning? I have not found anything, and the biologists are silent too. I interrogated the cat; it just stared ........

I will try to explain in order.

  1. Biological neurons send information in the form of electrical impulses (spikes).
  2. According to some scientists, information is encoded in the relative timing of impulses at the different inputs of a neuron (a kind of binary code). According to others, information is encoded in the number of pulses per time interval on a given connection; this pulse count can be treated as an analog signal. All the classical neural networks you read about here and in books are based on this analog (rate) coding principle. The theory of temporal coding is quite new (since the mid-90s) and promises to unlock the "secrets" of our intelligence :)
  3. Electrical impulses propagate only within the body of the neuron, as a potential difference between the external and internal environment (the neuron's membrane plays the role of a capacitor); a minimal integrate-and-fire sketch of this is given below.
  4. When the electrical impulse reaches the axon (the neuron's "tail"), it causes the release of a special chemical substance (a neurotransmitter, or mediator), which migrates across the contact gap (often at a dendritic spine) to a dendrite of another neuron; this contact is called a synapse. Receptors on the dendrite absorb the neurotransmitter and, if the activation threshold is exceeded, an electrical impulse is excited in the receiving neuron. And so on.
  5. The weights of the synapses (connections) between neurons depend on the amount of neurotransmitter in the axon of the sending neuron and the number of receptors on the receiving neuron's dendrite. If the receiving neuron is activated and generates a pulse, that pulse spreads both to the axon ("tail") of this neuron and back into its dendrites, where chemical reactions regulate the number of receptors and transmitters - that is, the weights. Because of the clustered structure of receptors and transmitters, a weight can be regulated from 0 (no contact) to 64. There is still a lot of ambiguity in this theory. According to another theory, the connection weight depends on the number of contacts (spines) between two neurons: a neuron's dendrites form a branching tree, the axon of another neuron can touch this tree in several places, each contact is binary (it either exists or it does not), and the number of contacts determines the weight.
  6. Regardless of the theory, all scientists agree that weights change as a result of spike-timing-dependent plasticity (STDP). Under this mechanism, the connection weight increases if the receiving neuron generates a pulse after the input pulse arrives on that connection, and decreases if the output pulse occurred before the input pulse (see the STDP sketch below). STDP was measured experimentally in 1996, which gave impetus to the theory of temporal coding.
  7. There is also global feedback. If we are satisfied with some outcome of our actions, the brain releases dopamine, reinforcing the connections between all the neurons whose activation led to that outcome (a simplified reward-modulated sketch is given below). Incidentally, drugs substitute for the dopamine produced by the brain, reducing its own production and making addicts dependent on the drug.
  8. Neurons can also form new contacts where there were none before (so-called structural plasticity). Somewhere on the internet I saw a video in which scientists grew two neurons in a test tube, activated them with impulses, and watched the axon of one neuron grow toward the dendrite of the other, making contacts in the places where the electric field was strongest.

https://www.youtube.com/watch?v=xMCQPHb3iSw&feature=related

https://www.youtube.com/watch?v=_JgQtjhfnPE&feature=related

As you can see from this short excursion into the brain, classical neural networks are pretty far from the biological ones.
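To make points 3 and 4 a little more concrete, here is a minimal leaky integrate-and-fire sketch in C++. All constants (membrane time constant, threshold) are assumed, illustrative values rather than biological measurements; the only point is that the membrane charges like a leaky capacitor and emits a spike when a threshold is crossed, so a stronger input turns into a higher spike rate (the "analog" reading from point 2).

```cpp
// Minimal leaky integrate-and-fire sketch (illustrative constants, not biology).
#include <cstdio>

struct LIFNeuron
{
   double v       = 0.0;   // membrane potential relative to rest (mV)
   double tau     = 20.0;  // membrane time constant (ms), assumed
   double v_th    = 15.0;  // firing threshold (mV), assumed
   double v_reset = 0.0;   // potential right after a spike (mV)

   // Advance the membrane by dt milliseconds with input drive i.
   // Returns true if the neuron emitted a spike on this step.
   bool step(double i, double dt)
   {
      v += dt * (-v / tau + i);            // leak toward rest + charge from input
      if(v >= v_th) { v = v_reset; return true; }
      return false;
   }
};

int main()
{
   LIFNeuron n;
   int spikes = 0;
   for(int t = 0; t < 1000; ++t)           // 1000 steps of 1 ms = 1 second
      if(n.step(1.0, 1.0)) ++spikes;       // constant input -> regular firing
   std::printf("spikes in 1 s: %d\n", spikes);
   return 0;
}
```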
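Point 6 (STDP) can be written as an asymmetric function of the time difference dt = t_post - t_pre between the output and input spikes: potentiation when the input spike precedes the output spike, depression otherwise. The amplitudes and time constants below are assumed, illustrative values; only the shape of the window is the point.

```cpp
// Minimal STDP window sketch: dw as a function of dt = t_post - t_pre (ms).
#include <cmath>
#include <cstdio>

double stdp_dw(double dt_ms)
{
   const double A_plus    = 0.10;  // max potentiation step (assumed)
   const double A_minus   = 0.12;  // max depression step (assumed)
   const double tau_plus  = 20.0;  // potentiation window decay, ms (assumed)
   const double tau_minus = 20.0;  // depression window decay, ms (assumed)

   if(dt_ms > 0.0)                                   // input before output: strengthen
      return  A_plus  * std::exp(-dt_ms / tau_plus);
   else                                              // output before input: weaken
      return -A_minus * std::exp( dt_ms / tau_minus);
}

int main()
{
   // Large changes for near-coincident spikes, vanishing changes far apart.
   for(double dt = -50.0; dt <= 50.0; dt += 10.0)
      std::printf("dt = %+5.1f ms  ->  dw = %+7.4f\n", dt, stdp_dw(dt));
   return 0;
}
```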
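Point 7 (the global dopamine feedback) is often modelled as a "three-factor" rule: local spike-timing changes are not applied immediately but stored in a decaying eligibility trace, and a later global reward signal decides how much of the stored change actually becomes a weight change. Below is a heavily simplified sketch; the structure, names and constants are assumptions for illustration, not a claim about the actual biology.

```cpp
// Simplified reward-modulated plasticity sketch: local changes are buffered in an
// eligibility trace and only committed when a global reward (dopamine) arrives.
#include <cstdio>

struct Synapse
{
   double w     = 0.5;   // connection weight
   double trace = 0.0;   // eligibility trace (candidate change)

   // Called on each time step: decay the trace and add any local STDP change.
   void accumulate(double local_dw, double dt, double tau_trace = 200.0)
   {
      trace += dt * (-trace / tau_trace) + local_dw;
   }

   // Called when a global reward signal arrives (reward > 0 reinforces,
   // reward < 0 weakens the connections that recently contributed).
   void apply_reward(double reward, double learning_rate = 0.1)
   {
      w += learning_rate * reward * trace;
      trace = 0.0;       // the stored candidate change has been consumed
   }
};

int main()
{
   Synapse s;
   // A burst of "input before output" events builds up a positive trace...
   for(int t = 0; t < 10; ++t)
      s.accumulate(+0.02, 1.0);
   // ...and only the later reward decides whether it sticks.
   s.apply_reward(+1.0);
   std::printf("weight after reward: %.4f\n", s.w);
   return 0;
}
```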

 
gpwr:
Since you have visited us, do you have any comments on the model I presented above?
 
Urain:
Since you have visited us, do you have any comments on the model I presented above?
I didn't really get into this model. There are no comments, other than the ones I've already made.
 
They have all gone off rambling, like sissies: so many words and so little sense.