"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 96

 
Andrey Dik:

Let's put it this way - "Almost doesn't work"... but what's left of the whole "Doesn't work" and is left to be scraped out in crumbs.

it's even scraped out sometimes, but not to the full depth of the sample

 
Dmitry Fedoseev:

It's no different. It is an ordinary function. Input one parameter, output one value.

I see. Thank you.
I will continue my logical exploration. So, a "layer of neurons" is a kind of complex of same-type functions, each of which takes one value at the input and outputs one result? Or is the result somehow prepared by all the "neurons" of the complex together?
 
Peter Konow:
I see.
I will continue my logical exploration. So, a "layer of neurons" is a kind of complex of same-type functions, each of which takes one value at the input and outputs one result? Or is the result somehow prepared by all the "neurons" of the complex together?

Yes. Except that this single value, fed to the input of one neuron, is summed from the outputs of all neurons of the previous layer (they are added together, each multiplied by its own coefficient).
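A minimal sketch of that arithmetic in Python (the names are illustrative, not from any particular engine): each neuron of the next layer takes the weighted sum of all previous-layer outputs plus a bias, and passes it through an activation function.

```python
import math

def neuron_output(prev_outputs, weights, bias):
    """One neuron: weighted sum of all previous-layer outputs, then activation."""
    # each previous output is multiplied by its coefficient and the products are summed
    s = sum(x * w for x, w in zip(prev_outputs, weights)) + bias
    # sigmoid activation squashes the sum into (0, 1)
    return 1.0 / (1.0 + math.exp(-s))

# three outputs from the previous layer feed one neuron of the next layer
prev = [0.5, -1.2, 0.8]
print(neuron_output(prev, weights=[0.4, 0.1, -0.7], bias=0.05))
```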

 
Maxim Dmitrievsky:

Exactly, but I haven't formulated a general approach to this from an ML (machine learning) perspective yet :)

By the way, could it work as a replacement for Hurst? https://en.wikipedia.org/wiki/Sample_entropy

Or is it a lagging one too?

Possibly. From my point of view, process entropy is a perfect indicator of disorder. It should be. But it needs research, and I'm too lazy to do it; let someone else try.

As for time... In the market there is a periodicity of processes as a nested structure. Except that it's not easy to calculate these periods. Gann had his; for some reason I got mine. I don't know... We'll see in practice... But until I started working with specific time periods, my TS (trading system) ran at +0% profit, as on a random walk (SB).

 
Alexander_K:

Possibly. From my point of view, process entropy is a perfect indicator of disorder. It should be. But it needs research, and I'm too lazy to do it; let someone else try.

As for time... In the market there is a periodicity of processes as a nested structure. Except that it's not easy to calculate these periods. Gann had his; for some reason I got mine. I don't know... We'll see in practice... But until I started working with specific time periods, my TS ran at +0% profit, as on a random walk.

I'll do some research )) the code is simple

Volatility clustering is what differentiates an efficient (stressed) market from a random walk; yeah, that's the only periodicity, I guess. And it's exactly what's tied to time cycles.

At least that is the general opinion (or misconception) among econometricians.
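One quick way to see that difference (a sketch, not a rigorous econometric test; the series and lags are illustrative): compare the autocorrelation of squared returns. For a random walk with independent increments it is near zero at every lag, while volatility clustering shows up as autocorrelation that stays positive for many lags.

```python
import random

def autocorr(x, lag):
    """Sample autocorrelation of the series x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

# a pure random walk has i.i.d. increments: squared returns show ~zero autocorrelation
returns = [random.gauss(0.0, 1.0) for _ in range(10000)]
squared = [r * r for r in returns]
print([round(autocorr(squared, k), 3) for k in (1, 5, 10)])  # all near 0
# on real market returns these values typically stay positive over many lags
```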
 
Dmitry Fedoseev:

Yes. Except that this single value, fed to the input of one neuron, is summed from the outputs of all neurons of the previous layer (they are added together, each multiplied by its own coefficient).

Okay. We need to find a practical analogy. The diagram shows that the layers have different numbers of neurons. If you turn the diagram upside down, you get a pyramid. So the result goes through several processing steps. The more neurons in a layer, the more data that layer receives and processes. If the next layer outputs less data than the previous one, does that mean the data is generalised from layer to layer?
 
Peter Konow:
Okay. We need to find a practical analogy. The diagram shows that the layers have different numbers of neurons. If you turn the diagram upside down, you get a pyramid. So the result goes through several processing steps. The more neurons in a layer, the more data that layer receives and processes. If the next layer outputs less data than the previous one, does that mean the data is generalised from layer to layer?

Come to think of it... and the pyramids were built by the ancients... look for analogies there.

 
Maxim Dmitrievsky:

I'll do some research )) the code is simple

The code is simple, but our input data doesn't quite fit:

Wiki entropy: ".... measures the deviation of a real process from an ideal one. ... Mathematically, entropy is defined as a function of the state of the system, defined to an arbitrary constant."

And?

What, in a financial time series, could be an ideal market? Who the hell knows. OK, let that be the first assumption: ideal market = sine wave!

As inputs we have at least three prices: high, low, close. Which one should we use? OK, let that be the second assumption: the median price rules!

From what point, and to what point, do we measure? The beginning of the day? The week? Expiry day? The trading session? OK, start of the day; let that be the third assumption...

In total, three questions, and three times we assume we're right. Here the problem comes down to combinatorics: how often do we pick the correct initial hypothesis, and how often does our further exploration lead to a correct market valuation... on history ))))


Entropy sounds nice, but I dug into this subject some years ago from the perspective of information entropy. The only conclusion: whether a pattern starts to form, or you look for the nearest repetition of a candlestick combination in history, it will not work, because simple patterns and correlations do not work in the market. The same thing happens to them once they become obvious: they stop appearing )))) In such cases I usually tell myself: you are not the smartest one; people just as smart make up half the world behind their monitors )))
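For reference, a minimal sample entropy sketch in the spirit of "the code is simple", following the definition from the Wikipedia link above. This is a simplified O(n²) variant; the parameters m and r and the sine-versus-noise comparison (the "first assumption", ideal market = sine wave) are illustrative, not anyone's working tool.

```python
import math, random

def sample_entropy(series, m=2, r_factor=0.2):
    """SampEn(m, r): lower = more regular/predictable, higher = closer to noise.
    The tolerance r is r_factor times the standard deviation of the series."""
    n = len(series)
    mean = sum(series) / n
    tol = r_factor * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def matches(dim):
        # count pairs of length-`dim` templates within Chebyshev distance `tol`
        count = 0
        for i in range(n - dim):
            for j in range(i + 1, n - dim):
                if max(abs(series[i + k] - series[j + k]) for k in range(dim)) <= tol:
                    count += 1
        return count

    a, b = matches(m + 1), matches(m)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# illustrative check: an "ideal" sine market versus pure noise
sine = [math.sin(i / 10.0) for i in range(300)]
noise = [random.gauss(0.0, 1.0) for _ in range(300)]
print(sample_entropy(sine), sample_entropy(noise))  # sine is far lower than noise
```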

 
Peter Konow:
Okay. We need to find a practical analogy. The diagram shows that the layers have different numbers of neurons. If you turn the diagram upside down, you get a pyramid. So the result goes through several processing steps. The more neurons in a layer, the more data that layer receives and processes. If the next layer outputs less data than the previous one, does that mean the data is generalised from layer to layer?

If there are fewer neurons in a layer than in the previous one, there is information compression; if there are more neurons than in the previous one, there is "unpacking".

 
Peter Konow:
Okay. We need to find a practical analogy. The diagram shows that the layers have different numbers of neurons. If you turn the diagram upside down, you get a pyramid. So the result goes through several processing steps. The more neurons in a layer, the more data that layer receives and processes. If the next layer outputs less data than the previous one, does that mean the data is generalised from layer to layer?

Yes, they are generalised. If the input is, say, 100 bars, the output should be two commands: buy or sell.

The task is not to make the neural network hold a lot of data, but to match its size to the amount of data it is trained on. If the network is too big and there is not enough data, it will learn easily but will not be able to generalise to other data. So the number of neurons should be as small as possible. More than three layers is generally unnecessary. In the first layer the number of neurons corresponds to the size of the input data pattern, and in the last one to the number of resulting variants. In the intermediate layer it is as small as possible, but not smaller than in the output layer.
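That sizing rule, sketched in Python with illustrative numbers: a 100-bar input pattern, two output variants (buy/sell), and an assumed hidden layer of 8 neurons. The weights here are untrained and random, so only the shapes matter, not the outputs.

```python
import math, random

def layer(inputs, weights, biases):
    """Fully connected layer: each neuron sums all inputs times its coefficients."""
    return [1.0 / (1.0 + math.exp(-(sum(x * w for x, w in zip(inputs, ws)) + b)))
            for ws, b in zip(weights, biases)]

def random_layer(n_in, n_out):
    """Untrained (random) weights and biases, just to show the shapes."""
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [random.uniform(-1, 1) for _ in range(n_out)])

# sizing per the rule above: 100-bar input pattern -> small hidden layer -> 2 outputs
w1, b1 = random_layer(100, 8)   # hidden: as small as possible, not smaller than output
w2, b2 = random_layer(8, 2)     # output: two resulting variants, buy / sell
pattern = [random.uniform(-1, 1) for _ in range(100)]
buy, sell = layer(layer(pattern, w1, b1), w2, b2)
print("buy:", buy, "sell:", sell)
```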
