neural network and inputs

 

Hi all. I got interested in the topic of NNs, read a couple of books, and in general I understand what's what. Normalize the inputs; there's not much point agonizing over the network type, since an MLP (or RBF) can cope with most tasks... but what about the inputs? Here the majority splits into two camps: those who say that indicators at the inputs are necessary (and insist on it), and those who believe they aren't needed at all: a NN can reproduce most indicators, so if necessary the NN will build the required "indicator" inside itself.

Maybe by some selection of useful indicators we narrow the search space (more precisely, we impose a certain direction on the search), but ideally the network itself should choose the methods of data analysis. Is this layman's reasoning correct? Perhaps we need to create special conditions in the NN for this?

 
You're right that the type of network is not so important; the main thing is the inputs, and of course the output. Find the right inputs for the network and it will work wonders. But how do you find them? Simple normalization won't do it...
 
nikelodeon:
You're right that the type of network is not so important; the main thing is the inputs, and of course the output. Find the right inputs for the network and it will work wonders. But how do you find them? Simple normalization won't do it...


Is your question a hint, or a cry from the heart? :)

 

1. If you take the first position, the main thing is to find the right set of indicators.

2. If you take the second position, there are two main components: 1) normalization of the data; 2) feeding the history data through a filter (removing unnecessary information).

On the second point, I'd like to add something. Originally I wanted to feed filtered ticks to the inputs (essentially Renko), because I'm skeptical of time-interval charts. The idea was to divide the price history into filtered zones (e.g. 5): the closer to the current price, the smaller the Renko brick; the further away, the larger. That is, the further the data lies from the present, the less it affects the current state, so we increase the filtering.

But I gave up on ticks, because it is hard to build a system that receives current price data and stitches it to tick history, so I decided to play with time intervals instead, though the basic philosophy has not changed.
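The zoned-filtering idea above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual system: the function names (`renko_filter`, `zoned_filter`) and the zone layout are my own assumptions, and a real Renko implementation would also handle open bricks and timestamps.

```python
def renko_filter(prices, brick):
    """Compress a price series into Renko-style moves of at least `brick`."""
    if not prices:
        return []
    out = [prices[0]]
    for p in prices[1:]:
        # Emit one brick per full `brick`-sized move, keeping direction.
        while abs(p - out[-1]) >= brick:
            step = brick if p > out[-1] else -brick
            out.append(out[-1] + step)
    return out

def zoned_filter(prices, zones):
    """Filter older data more coarsely: `zones` is a list of
    (fraction_of_history, brick_size) pairs, oldest zone first."""
    n = len(prices)
    out, start = [], 0
    for frac, brick in zones:
        end = start + int(n * frac)
        out.extend(renko_filter(prices[start:end], brick))
        start = end
    # Any leftover (rounding) tail uses the newest zone's brick size.
    out.extend(renko_filter(prices[start:], zones[-1][1]))
    return out
```

With, say, `zones=[(0.5, 2), (0.5, 1)]` the older half of the history is compressed with a brick twice as large as the recent half, which matches the "more filtering further from the present" idea.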

 
It's simple: the inputs and outputs should give the neural network the most adequate (correct) information about the patterns that exist on the traded instrument, so as to get a growing equity curve.
 
Let me interject, so as not to multiply threads. How does everyone normalize the input signals? And what kind of output signals are most convenient to work with? I've had enough of primitive perceptrons; I built a network and normalized the signals in all layers, but the output is still a puzzle, and there are plenty of small uncertainties.
 
grell:
Let me interject, so as not to multiply threads. How does everyone normalize the input signals? And what kind of output signals are most convenient to work with? I've had enough of primitive perceptrons; I built a network and normalized the signals in all layers, but the output is still a puzzle, and there are plenty of small uncertainties.

You're asking strange questions) There are two main tasks NS usually solve in our application domain: classification and regression. The network is built around that: its type and architecture are chosen, and its output interpreted, accordingly - whether the output is the membership of the input set in some class, or (say) the value of tomorrow's price. What does your network do? What are you teaching it?

Normalizing the inputs is simple, though there can be nuances depending on the input and its characteristics (the input of a NS may be composite, for example). For a "homogeneous" input set, the simplest and usually sufficient option is a linear transformation into a given range [a;b]. Depending on the characteristics of the input set, additional transformations are possible, for example to improve distinguishability...
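The linear transformation into a range [a;b] mentioned above is just min-max scaling; a minimal sketch (the function name `scale_to_range` is my own):

```python
import numpy as np

def scale_to_range(x, a=-1.0, b=1.0):
    """Linearly map the values of x from [min(x), max(x)] into [a, b]."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    if hi == lo:
        # Degenerate (constant) input: map everything to the midpoint.
        return np.full_like(x, (a + b) / 2.0)
    return a + (x - lo) * (b - a) / (hi - lo)
```

Note that `min` and `max` should be taken from the training set only and then reused on new data, otherwise each window gets its own scale and the information the network learned is "blurred".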

Read some articles; I learned a few things from them in my time (all of them are freely available, so there's no point posting them here):

Presentation of input data in neural network prediction tasks. Krisilov V.A., Chumichkin K.V., Kondratyuk A.V.

Transformation of neural network input data to improve their distinguishability. Krisilov V.A., Kondratyuk A.V.

Acceleration of neural network learning by adaptive simplification of the training set. Krisilov V.A., Chumichkin K.V.

Preliminary estimation of training-set quality for neural networks in time-series forecasting tasks. Tarasenko R.A., Krisilov V.A.

Choosing the size of the situation description when forming a training set for neural networks in time-series forecasting tasks. Tarasenko R.A., Krisilov V.A.

Improving the quality and speed of neural network training in time-series forecasting tasks. Oleshko D.N., Krisilov V.A.

 
Figar0:

You're asking strange questions) There are two main tasks NS usually solve in our application domain: classification and regression. [...] What does your network do? What are you teaching it?


It's prediction, rather. The output is two values in the range [-1;1]. I feed 8 values to the network input, normalized to [-1;1] without shifting zero; then I normalize across weights and layers in the same way. The output is a forecast of the two nearest fractals and their position relative to bar 0; there is no quantitative binding. That is, if the outputs are -1 and 0.5, the nearest fractal is below Open[0] (and twice as far from it as the second), while the next fractal is above Open[0]. By analogy, if the values are 0.3 and 1, both fractals are above Open[0]. Thanks for the reading list. And the questions only seem strange to you. In my head everything is clear and understandable: the schemes, the training methods, the interpretation. But when it comes to describing it to the machine - stupor.
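One way to read that target encoding (sign = above/below Open[0], magnitude = distance scaled by the larger of the two) can be sketched as follows. This is my interpretation of the post, not the poster's code; `encode_fractals` and its scaling rule are assumptions.

```python
def encode_fractals(open0, f_near, f_next):
    """Encode two upcoming fractal prices relative to Open[0] into [-1, 1].
    Sign encodes position (above/below Open[0]); magnitude is each
    fractal's distance divided by the larger of the two distances."""
    d1, d2 = f_near - open0, f_next - open0
    m = max(abs(d1), abs(d2)) or 1.0  # guard against both distances being zero
    return d1 / m, d2 / m
```

With `Open[0] = 100`, a nearest fractal at 99 and the next at 100.5, this yields (-1, 0.5), matching the example in the post.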
 

If the signals fed to the network's input and output don't carry information useful to the network, then normalization is useless.

And if these signals do carry useful information, then in principle it doesn't matter how or with what you normalize; the main thing is not to blur the information they contain ))))

 
LeoV:
If the signals fed to the network's input and output don't carry information useful to the network, then normalization is useless )))

Which input signals do you think carry useful information?) Frankly, I don't care what goes on in the hidden layer - let it be pie recipes of the peoples of the world for all I care; the main thing is that the output is useful information, and I'll make sure the input is informative.
 
LeoV:
If the signals fed to the network's input and output don't carry information useful to the network, then normalization is useless )))

Oh! Leonid! Merry Christmas! Success in business and good health!

How's the project going?

http://www.neuroproject.ru/demo.php?

I wanted to get closer to neuro-experiments myself.

What do you think is relevant at the moment? Can you share your thoughts on neuro?

Thank you.
