Market etiquette or good manners in a minefield

 
to gpwr Unfortunately, your assumptions are wrong. I feed in the first difference of hourly bars, which looks like Open[i] - Open[i+1] at best, while Neutron also feeds in first differences, but not of bars. Therefore it is impossible to build the function you are asking for, because there is no delay... none whatsoever.
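A minimal sketch of what "first difference of hourly bars" means here, assuming an array of hourly Open prices stored newest-first, as in an MQL4 series (the names and numbers are illustrative, not anyone's actual code):

```python
import numpy as np

# Hypothetical hourly Open prices, newest first (like MQL4's Open[] series).
open_prices = np.array([1.2740, 1.2733, 1.2751, 1.2748, 1.2760])

# The first difference described above: Open[i] - Open[i+1],
# i.e. each bar's open minus the previous (older) bar's open.
first_diff = open_prices[:-1] - open_prices[1:]
print(first_diff)  # these increments, not raw prices, are fed to the network
```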
 
paralocus wrote >>

OK! Tell me, is it possible to train the net not for a fixed number of epochs, but until it reaches a certain minimum error? Finding the optimal number of epochs, inputs and temperatures is quite a cumbersome and time-consuming task.

Oh, and I almost forgot:

This already concerns valid inputs: how do you calculate the MO (expectation) of an input vector?

Of course you can. You've derived the dependence of the normalized error-vector length on the number of epochs, so you can see what to expect. You can, of course, set a target of 10^-6 and wait until the next century :-), or set something reasonable (say 0.5-1.8), but then the question arises whether that step makes sense at all. After all, sometimes the NS falls into a local minimum and will sit there forever. What will you do in that situation: wait, or set a bail-out condition? Well, then you are back to a condition on a certain number of epochs... In short, experiment and think.
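A minimal sketch of the stopping rule being discussed: stop either when the normalized error drops below a target, or after a hard cap on epochs so a local minimum cannot hold the loop forever (train_one_epoch and both thresholds are assumptions, not Neutron's actual code):

```python
def train(net, data, target_error=1.0, max_epochs=200):
    """Run training epochs until the normalized error reaches
    target_error, but bail out after max_epochs so a local
    minimum cannot trap the loop forever."""
    error = float("inf")
    for epoch in range(1, max_epochs + 1):
        error = train_one_epoch(net, data)  # hypothetical one-epoch pass
        if error <= target_error:
            break                           # reached the requested error
    return epoch, error                     # either converged or hit the cap
```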

MO = SUM(x[i])/n, where the index i runs from 1 to n.

P.S. You can download a proper Mathcad here.

 

Thank you!

to gpwr, here you go.

Maybe you want the weights themselves?

Then here they are:

These are the weights of a well-trained single-layer network

 
gpwr wrote >>

The first difference of hourly bars will do; it doesn't matter whether it's Open or Close. And why can't you show the weights by which the input data are multiplied, as a graph W[n] for some trained state of the network? Don't you know them?

These are the values of the input-layer and output-neuron weights, as a function of the training epoch, on EURUSD hourly bars.

 

to Neutron,TheXpert


Colleagues, you are, to put it mildly, deluded. There is NO phase delay in (H+L)/2 compared to any other number obtained within the same bar (Open, Close, ...). NONE!!!!! There cannot be one by definition, for the simple reason that there is no data slippage: each such value is computed within its own unique bar's data. Claiming any kind of phase delay is nonsense. What the bar formation does can rather be described as a "discretisation" of the signal along the x-axis.
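A sketch of grasn's point with made-up bar data: the median price (H+L)/2 is computed strictly inside each bar, with no lookback, while a moving average mixes in older bars, and that lookback is exactly where a phase lag comes from:

```python
import numpy as np

high  = np.array([1.30, 1.31, 1.29, 1.32])
low   = np.array([1.28, 1.29, 1.27, 1.30])
close = np.array([1.29, 1.30, 1.28, 1.31])

# Median price: each value uses only its own bar's data, so it has
# no phase delay relative to Open/Close/High/Low of the same bar.
median = (high + low) / 2

# A moving average, by contrast, averages the last n bars, which is
# what introduces a group delay of roughly (n - 1) / 2 bars.
n = 3
ma = np.convolve(close, np.ones(n) / n, mode="valid")
```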


This is what you, Sergei, should actually have shown; your picture is completely wrong. You simply plotted an "MA", substituting a close estimate for it, whereas you were talking about (H+L)/2 on bars, i.e. exactly the thing you were predicting.

  • Black: (H+L)/2
  • The rest: Open, Close, High, Low


And if you finally open your eyes, you'll see there's no delay.


PS: You don't seem to read or think at all.

 
grasn >> :

And if you finally open your eyes, you will see that there is no delay.

I'm not gonna change your mind.

 
gpwr >> :

Thank you. Could you please show these weights as a function not of the epoch but of their index after the last epoch? That is, if w0[i,j] is a weight as a function of epoch i and index j, where j = 0..15, then I am interested in a graph of w0[1000,j] with abscissa x = j (0..15). I'll explain later why I need it.

No, dear! First you explain, and then we'll see whether we can build such monsters or not. It would take me half a day to compute weights out to the 1000th epoch.
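For reference, what gpwr is asking for, assuming the weights were logged during training into a 2-D array w0[epoch, j] (the names and the random stand-in data are illustrative), is just the last row plotted against the weight index:

```python
import numpy as np
import matplotlib.pyplot as plt

epochs, n_weights = 1000, 16
w0 = np.random.randn(epochs + 1, n_weights)  # stand-in for logged weights

# The requested graph: weights after the last epoch vs. their index j.
plt.plot(range(n_weights), w0[-1, :], marker="o")
plt.xlabel("j (weight index, 0..15)")
plt.ylabel("w0[last epoch, j]")
plt.show()
```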

 
TheXpert >> :

I'm not gonna change your mind.

A-a-a-a-a-a!!!!! We need to see where the moon is, maybe that's the reason. Now I understand the expression "words fail me". Is that where the delay is???


 
gpwr >> :

Thank you. That's exactly what I needed. Here's a chart of your weights.

>> I'm waiting for the weights from Neutron.

Compare it with the weights graph of your AR model. Colleagues have simply forgotten that, historically, NS grew out of filter theory. A perceptron is effectively the same technology as an adaptive filter (an AR model is also a filter). And they don't understand what they are actually doing. Reshetov once proposed an NS on fuzzy sets and even wrote an article about it. That is a really good idea.
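A sketch of the parallel being drawn: a single linear neuron trained by gradient descent on squared error is, line for line, the LMS adaptive filter, and its weight vector plays the same role as the coefficients of an adaptively fitted AR model (the data and step size here are made up):

```python
import numpy as np

def lms(x, d, n_taps=16, mu=0.01):
    """LMS adaptive filter: identical in form to training one
    linear neuron by gradient descent on squared error."""
    w = np.zeros(n_taps)
    for i in range(n_taps, len(x)):
        u = x[i - n_taps:i][::-1]  # last n_taps inputs (the AR regressors)
        y = w @ u                  # neuron/filter output
        e = d[i] - y               # prediction error
        w += mu * e * u            # gradient-descent weight update
    return w

# One-step-ahead prediction of first differences from their own past,
# i.e. an adaptively fitted AR(16) model of the increments.
price = np.cumsum(np.random.randn(500))
x = np.diff(price)  # first differences, as fed to the nets in this thread
w = lms(x, x)
```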

 
grasn >> :

A-a-a-a-a-a!!!!! We need to see where the moon is, maybe that's the reason. Now I understand the expression "words fail me". Is that where the delay is???


Here comes the general relativity of the market again. The thing is that a delay, of course, exists, since the middle of a bar arrives well after its opening; but on the other hand it is as if there were no delay, because once the bar has formed, nobody cares anymore exactly when that golden mean inside the bar occurred. I've seen bars that both open and close at their middle. The point of your argument with Neutron is that, for Neutron, bars are not bars, because his bar is not quantized by time, while for you a bar is a BAR.

You have different "frames of reference" - that's all.
