Machine learning in trading: theory, models, practice and algo-trading

 
Peter Konow:
So the work of the NS (neural network) is somehow tied to optimization?

Yes, on the optimization of functions (neurons). There are just a lot of them, and they can be connected in different ways.

 
Maxim Dmitrievsky:

ahahaha ))))

Of course it's rough, even wrong in places, but only in places. And clearer. Any extremum-search optimization takes full enumeration as its longest case))))) The GSF is a superposition of an unbound low-frequency process on a much higher-frequency one; search optimization is a logical thinning of the full search. This makes everything else easier for me to understand.

 
Peter Konow:
To some extent it is mandatory. You can only truly understand something you have created yourself. I am trying to reproduce the original idea behind the concept of the NS.

The concept is quite simple - any multidimensional function can be approximated by a composition of one-dimensional ones. I hope you have already invented the concept of "function")
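
To make that claim concrete, here is a minimal Python sketch (numpy assumed; the target function and all sizes are my own illustrative choices, not anything from the thread): each "neuron" applies the one-dimensional tanh to a weighted sum of the inputs, and a linear combination of many such neurons approximates a two-dimensional function.

```python
# Sketch: approximating a multidimensional function by composing
# one-dimensional functions. Each neuron is 1-D tanh applied to a
# weighted sum; a linear mix of 200 such neurons fits a 2-D target.
import numpy as np

rng = np.random.default_rng(0)

def target(x, y):                       # the 2-D function to approximate
    return np.sin(x) * np.cos(y)

X = rng.uniform(-3, 3, size=(2000, 2))  # points in the 2-D domain
t = target(X[:, 0], X[:, 1])

n_hidden = 200
W = rng.normal(size=(2, n_hidden))      # random input->hidden weights
b = rng.normal(size=n_hidden)           # hidden biases
H = np.tanh(X @ W + b)                  # each column: a 1-D function of a projection
beta, *_ = np.linalg.lstsq(H, t, rcond=None)  # fit the output layer only

Xtest = rng.uniform(-3, 3, size=(500, 2))
pred = np.tanh(Xtest @ W + b) @ beta
err = np.max(np.abs(pred - target(Xtest[:, 0], Xtest[:, 1])))
print(f"max abs error on test points: {err:.3f}")
```

Only the output layer is fitted here (plain least squares), which is enough to show the compositional idea without writing backpropagation.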

 
Aleksey Nikolayev:

The concept is quite simple - any multidimensional function can be approximated by a composition of one-dimensional ones. I hope you have already invented the concept of "function")

I did not find a standalone definition of the notion of a "multidimensional function". There is the "distribution function" of probability theory, and within it multivariate distribution functions are considered, but there is no mention of ML (machine learning) technology there.

Obviously, multidimensional functions, if they relate to the NS at all, are far from its essence. Probably they have something to do with the implementation of some technical nuances. I, on the other hand, am trying to understand the essence.
 
Aleksey Nikolayev:

The concept is quite simple - any multidimensional function can be approximated by a composition of one-dimensional ones. I hope you have already invented the concept of "function")

Decomposition, one argument at a time, down to one-dimensional functions is generally understandable; what is hard to explain is how, within this composition of one-dimensional functions, one can find extrema faster than by full enumeration.
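
One hedged way to make this concrete in Python (the function and grid are illustrative): full enumeration over an n x n grid costs n*n evaluations, while coordinate descent, optimizing one argument at a time with the others held fixed, needs only a few sweeps of n evaluations each. The catch, which is exactly the difficulty raised above, is that on a multimodal surface the shortcut can stall in a local extremum.

```python
# Sketch: coordinate-wise search vs. full enumeration on a smooth bowl.
import itertools

def f(x, y):
    return (x - 0.3) ** 2 + (y + 0.7) ** 2   # unique minimum at (0.3, -0.7)

grid = [i / 100.0 - 1.0 for i in range(201)] # 201 points in [-1, 1]

# Full enumeration: 201 * 201 = 40401 evaluations.
best_full = min(itertools.product(grid, grid), key=lambda p: f(*p))

# Coordinate descent: sweep one coordinate at a time.
x, y = -1.0, -1.0
evals = 0
for _ in range(5):                           # a few sweeps suffice here
    x = min(grid, key=lambda g: f(g, y)); evals += len(grid)
    y = min(grid, key=lambda g: f(x, g)); evals += len(grid)

print("full search:", best_full, "in", len(grid) ** 2, "evaluations")
print("coordinate descent:", (x, y), "in", evals, "evaluations")
```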

 
Here is another thought to add to the piggy bank of comprehending the essence:

The transformation of a fragment of data inside the functions (neurons) into a "weight" is meant to unify them and make the network universally applicable.
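
A tiny Python sketch of that unification (everything here is illustrative): every neuron is literally the same code, the same one-dimensional function applied to a weighted sum, and what it has extracted from the data lives entirely in its weights, so identical building blocks can be wired together in any configuration.

```python
# Sketch: one function serves as every neuron; only the weights differ.
import math

def neuron(inputs, weights, bias):
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return math.tanh(s)                       # the same 1-D nonlinearity everywhere

x = [0.5, -1.2, 2.0]                          # one fragment of input data
h1 = neuron(x, [0.1, 0.4, -0.3], 0.2)         # two different neurons...
h2 = neuron(x, [-0.7, 0.2, 0.9], -0.1)        # ...distinguished only by weights
out = neuron([h1, h2], [1.5, -0.8], 0.0)      # and they compose freely
print(out)
```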
 
Guys, I think you are getting hung up on the mechanisms of finding optimal values when training the network, but training is not the essence of the device; it is only part of the process of its operation.
 
Peter Konow:
Here is another thought to add to the piggy bank of comprehending the essence:

The transformation of a fragment of data inside the functions (neurons) into a "weight" is meant to unify them and make the network universally applicable.

record a video with explanations, it is all so unclear

 
Maxim Dmitrievsky:

record a video with explanations, it is all so unclear

It's too early, I don't understand it yet.
 
Peter Konow:
I did not find a standalone definition of the notion of a "multidimensional function". There is the "distribution function" of probability theory, and within it multivariate distribution functions are considered, but there is no mention of ML (machine learning) technology there.

Obviously, multidimensional functions, if they relate to the NS at all, are far from its essence. Probably they have something to do with the implementation of some technical nuances. I, on the other hand, am trying to understand the essence.

A multidimensional function is an ordinary mathematical function whose domain is a multidimensional space. In the case of the NS, that space is the feature space.
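
Put in a few lines of Python (the feature names are hypothetical), the definition is as plain as it sounds: the domain is the feature space, and a trained network as a whole is one such function of the feature vector.

```python
# Sketch: a multidimensional function is just f: R^n -> R.
def f(ret, volume, spread):                   # domain: 3-D feature space
    return 0.8 * ret - 0.1 * volume + 0.05 * spread   # toy linear case

print(f(0.002, 1.3, 0.0001))
```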

I want to ask you, as one mathematician to another: did you study mathematics at school at all?)
