Machine learning in trading: theory, models, practice and algo-trading - page 604

 
Vizard_:

That was a joke. No teasing meant: tweak the steepness yourself and see how it affects the result.
If the inputs are adequate for the task, you can do it with "1 neuron".
In the context of ML, toxic has the ideologically correct take.
--------------
A professor on deep networks - youtu.be/qx3iM2aa2yU
At 31 min: "There's still not much science, but a lot of voodoo magic."



Stages of development characterized by a high rate of change have a special name: a jump.

The activation function (sigmoid, tanh and the others) is a jump, modified by introducing a limit on the rate of change.

How much more time will it take for the "seekers" here to realize the meaning of this fact...
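To make the "limit on the rate of change" concrete, here is a minimal sketch (not from the post above; the steepness k=5 is an arbitrary example): the slope of sigmoid(k*x) never exceeds k/4, while a hard jump has no such limit.

// Sketch only: a sigmoid with steepness k is a "smoothed jump" whose
// rate of change is capped at k/4; a hard step has no such cap.
double Sigmoid(const double x,const double k)
  {
   return 1.0/(1.0+MathExp(-k*x));
  }

void OnStart()
  {
   double k=5.0;            // hypothetical steepness
   double h=0.0005;         // half-step for the numeric derivative
   double max_slope=0.0;
   for(double x=-2.0; x<=2.0; x+=0.001)
     {
      double slope=(Sigmoid(x+h,k)-Sigmoid(x-h,k))/(2.0*h); // central difference
      if(slope>max_slope) max_slope=slope;
     }
   Print("max slope ~ ",max_slope,"  theoretical limit k/4 = ",k/4.0);
  }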

 
Maxim Dmitrievsky:

Well it does not work on the forex)

It doesn't work on Forex. Or rather, I haven't actually tried it yet.
 
Oleg avtomat:

Stages of development characterized by a high rate of change have a special name: a jump.

The activation function (sigmoid, tanh and the others) is a jump, modified by introducing a limit on the rate of change.

How much more time will it take for the "seekers" here to realize the meaning of this fact...


What is the point of making sense of anything without actual evidence of robustness?

I prefer statements like: here's the deposit growth curve (at least on a test)... and the rest of you are all m...rons... then yes, no questions.

 
Maxim Dmitrievsky:

What is the point of making sense of anything without actual evidence of robustness?


Do you realize what you just said?

 
Oleg avtomat:

Do you realize what you just said?


I do.

 
Maxim Dmitrievsky:

And you can include steepness optimization in the learning process; I actually did that, but only for fuzzy logic. The steepness can have a big effect, yes.

You gave a link to the article https://habrahabr.ru/post/322438/

If the neural network's error-function graph is really built like that (the one posted here is made with tanh):


Obviously something similar can be built with sigmoids, but the steepness of the individual sections will be lower.

If the sigmoid is less steep, you can probably do the same thing with tanh as well, you just need to take 3-5 times as many of them, i.e. increase the number of neurons.

Probably the sigmoid gave me a smaller error because the tanh network did not have enough neurons.
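For reference, a minimal numerical check (my own sketch, not from the article or the post above): tanh and the sigmoid differ only by a linear rescaling, tanh(x) = 2*sigmoid(2x) - 1, so the difference in steepness can in principle be absorbed by the weights; in practice the difference may come down to how many units and what weight scales training ends up needing.

// Sketch only: numerically verify tanh(x) = 2*sigmoid(2x) - 1
double Sigmoid(const double x) { return 1.0/(1.0+MathExp(-x)); }
double Tanh(const double x)    { return (MathExp(x)-MathExp(-x))/(MathExp(x)+MathExp(-x)); }

void OnStart()
  {
   for(double x=-3.0; x<=3.0; x+=1.0)
      Print("x=",x,"  tanh=",Tanh(x),"  2*sigmoid(2x)-1=",2.0*Sigmoid(2.0*x)-1.0);
  }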

Neural networks in pictures: from one neuron to deep architectures
  • 2022.02.17
  • habrahabr.ru
Many materials on neural networks begin right away with demonstrations of rather complex architectures, while the most basic things - activation functions, weight initialization, choosing the number of layers in the network and so on - are covered only in passing, if at all. As a result, a beginning neural-network practitioner has to take standard configurations and...
 

Does anyone have an opinion: is it better to learn trading through paid courses or for free? And another question: is it worth spending money on a paid course at all?

 

I was thinking about the article https://www.mql5.com/ru/articles/497, where the steepness of the activation function is changed, and I came to the conclusion that the network will find the right steepness by itself:

Let's look at the formula:

for(int n=0; n<10; n++)
  {
   NET+=X[n]*W[n];
  }
NET*=0.4; // multiplying changes the steepness of the activation function

During training, the network has to find the weights W[n]. If it is more advantageous for the network to have the total multiplied by 0.4, it will simply learn weights W[n] that are each already scaled by 0.4. In other words, we are just factoring a common multiplier out of the brackets, and that multiplier will be found by the network itself through error minimization.

If anyone disagrees, correct me.
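As a sanity check, a minimal sketch of my own (the inputs and weights below are made-up numbers, not code from the article): the explicit *0.4 after the sum and weights that are already scaled by 0.4 give the same NET, so the network can learn the steepness through the weights.

// Sketch only: the post-sum multiplier can be absorbed into the weights
void OnStart()
  {
   double X[10]={0.1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9,1.0};      // hypothetical inputs
   double W[10]={0.5,-0.3,0.8,0.1,-0.6,0.2,0.9,-0.4,0.7,0.05};  // hypothetical weights

   double net1=0.0;                       // variant 1: sum, then explicit steepness multiplier
   for(int n=0; n<10; n++)
      net1+=X[n]*W[n];
   net1*=0.4;

   double net2=0.0;                       // variant 2: the multiplier folded into the weights
   for(int n=0; n<10; n++)
      net2+=X[n]*(W[n]*0.4);

   Print("net1=",net1,"  net2=",net2);    // the same value (up to floating-point rounding)
  }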

Neural networks - from theory to practice
  • 2012.10.06
  • Dmitriy Parfenovich
  • www.mql5.com
These days probably every trader has heard of neural networks and knows how cool they are. Most people imagine that those who understand them are almost supermen. In this article I will try to explain how a neural network is built, what can be done with it, and show practical examples of its use. The concept of neural networks...
 
elibrarius:

I was thinking about the article... and came to the conclusion that the network will find the right steepness by itself:

Exactly. The NN will proportionally increase or decrease all the weights by the required amount (which effectively is the steepness), and it will even pick up the right offset.

Anyway, for most tasks it doesn't matter.
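On the offset point, a minimal sketch of my own (the numbers a, b, the inputs and the weights are hypothetical): an extra scale and shift before the activation, sigmoid(a*NET + b), is the same as scaling the weights by a and treating b as the weight of a constant bias input of 1.

// Sketch only: a pre-activation scale a and shift b can be absorbed into
// scaled weights plus a bias weight on a constant input of 1.
double Sigmoid(const double x) { return 1.0/(1.0+MathExp(-x)); }

void OnStart()
  {
   double X[3]={0.2,-0.7,1.3};     // hypothetical inputs
   double W[3]={0.4,0.9,-0.5};     // hypothetical weights
   double a=0.4, b=-0.1;           // hypothetical steepness and offset

   double net=0.0;
   for(int n=0; n<3; n++)
      net+=X[n]*W[n];
   double out1=Sigmoid(a*net+b);   // explicit scale and shift

   double net2=b*1.0;              // bias weight b on a constant input of 1
   for(int n=0; n<3; n++)
      net2+=X[n]*(a*W[n]);         // scale folded into the weights
   double out2=Sigmoid(net2);

   Print("out1=",out1,"  out2=",out2); // the same value (up to floating-point rounding)
  }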

 
I would like to determine the number of neurons in a network automatically. What formulas are there for calculating it?