Taking Neural Networks to the next level

 

Is it possible to train a neural network to identify range and trend? And is it possible to select samples of data yourself (ones you can see are a range or a trend), give those to the program to learn from, and then have it apply this when making its predictions?


So basically, instead of a program training itself on raw data, train it on selected data to recognize patterns?


It could then maybe be used as a range/trend indicator for manual trading, or be used to make trading decisions for an EA, e.g. buy low and sell high in a range, or follow the trend in trending conditions.

 

It is possible, but not what you want. What you want is to know when it starts or ends. This, I think, is notoriously difficult if not impossible to achieve.

 

This seems like an easy classification task. If the network has been trained on thousands of examples that have been labeled as uptrend/range/downtrend, it's easy for a network to find out which scenario has the highest probability, i.e. the greatest similarity with earlier encountered patterns, for the given current chart. A very simple multilayer perceptron with softmax activation and categorical cross-entropy loss could do this. Is it useful? [--> like Enrique says]

But all this of course has nothing to do with predictions/forecasting. It's just about identifying a present state and seems very similar to image classification. I remember an article where people did exactly that: they fed pictures(!) of candlestick charts, so not even actual prices, into a convolutional network (but after that they did in fact make predictions based on those images, with pretty convincing trading results; sorry, I don't remember the link to the article though).
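For illustration, here is a minimal sketch of the kind of classifier described above, assuming hand-labeled windows of the last 64 bar returns as features; the window length, layer sizes and the stand-in data are placeholders, not a recommendation:

```python
# Minimal sketch: 3-class MLP (downtrend / range / uptrend) with softmax output.
# Assumes each training sample is a window of N_BARS normalized returns that
# was labeled by hand. All sizes and the random stand-in data are placeholders.
import numpy as np
import tensorflow as tf

N_BARS = 64      # lookback window (assumption)
N_CLASSES = 3    # 0 = downtrend, 1 = range, 2 = uptrend

model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_BARS,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),   # class probabilities
])
# sparse variant of categorical cross-entropy, since the labels are integers
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# X: (n_samples, N_BARS) normalized returns, y: (n_samples,) manual labels
X = np.random.randn(1000, N_BARS).astype("float32")   # stand-in data
y = np.random.randint(0, N_CLASSES, size=1000)
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2)

# probability of each regime for the most recent window
print(model.predict(X[:1]))
```

The hard part is of course the labeling itself, as discussed above; the network only learns to reproduce whatever labeling it was given.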

 
Brian Rumbles:

So basically, instead of a program training itself on raw data, train it on selected data to recognize patterns?

Data preselection is rarely a good idea (except for removal of inputs that have been proven irrelevant / uncorrelated with the desired result). If you remove information (this is what preselection does) you usually can't expect more valuable output information (less in --> less out).

As long as it's not a necessity due to limited computation power, and unless reduced information is the goal (= noise-reduction applications), it's better to keep the raw data. We don't need to worry about adding additional preselection decisions, because that selection already takes place inside the "black box": some neurons will always be more active than others and therefore amplify (or weaken) certain parts within the following layers, just as needed for the best answer.
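As a toy illustration of that point (my own sketch, not from the original post): feed all raw inputs, train, then look at the learned first-layer weights; the relevant input usually ends up with the largest weights without any manual preselection. The shapes and the synthetic data are made up:

```python
# Toy sketch: 32 raw inputs, but only input 0 actually carries the signal.
# After training, the first layer's learned weights typically amplify input 0
# and attenuate the rest - i.e. the "preselection" happens inside the network.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 32)).astype("float32")               # 32 raw inputs
y = (X[:, 0] + 0.1 * rng.standard_normal(2000) > 0).astype("int32") # signal in input 0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(X, y, epochs=20, verbose=0)

W = model.layers[0].get_weights()[0]     # first Dense layer kernel, shape (32, 16)
print(np.abs(W).mean(axis=1).round(2))   # input 0 typically dominates
```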

 
Enrique Dangeroux:

It is possible, but not what you want. What you want is to know when it starts or ends. This, I think, is notoriously difficult if not impossible to achieve.

Fair enough, but I still think it may be useful if we can identify these conditions in a reasonable amount of time (Perhaps).


Chris70:

This seems like an easy classification task. If the network has been trained on thousands of examples that have been labeled as uptrend/range/downtrend, it's easy for a network to find out which scenario has the highest probability, i.e. the greatest similarity with earlier encountered patterns, for the given current chart. A very simple multilayer perceptron with softmax activation and categorical cross-entropy loss could do this. Is it useful? [--> like Enrique says]

But all this of course has nothing to do with predictions/forecasting. It's just about identifying a present state and seems very similar to image classification. I remember an article where people did exactly that: they fed pictures(!) of candlestick charts, so not even actual prices, into a convolutional network (but after that they did in fact make predictions based on those images, with pretty convincing trading results; sorry, I don't remember the link to the article though).

I think it may be useful.

About the second paragraph you wrote: yes, exactly, it is not forecasting the price position but the market conditions, so it could even use pictures; it can learn like a 3-year-old child.


I think it could be useful to know where the 'value' is hiding, or what is low and what is high in the context of the current conditions. If the conditions change, hopefully the downside is smaller than the upside potential (because we bought low, for example), so over time these losses would be more than compensated.

So yeah, I'm just thinking we don't need to know where the price will go, only what the context of the current conditions is.

 
Chris70:

Data preselection is rarely a good idea (except for removal of inputs that have been proven irrelevant / uncorrelated with the desired result). If you remove information (this is what preselection does) you usually can't expect more valuable output information (less in --> less out).

As long as it's not a necessity due to limited computation power, and unless reduced information is the goal (= noise-reduction applications), it's better to keep the raw data. We don't need to worry about adding additional preselection decisions, because that selection already takes place inside the "black box": some neurons will always be more active than others and therefore amplify (or weaken) certain parts within the following layers, just as needed for the best answer.

I see, so it can still analyze the whole population and find ranges or trends. In my original post I was just thinking that maybe the network needs to be told by a human what to look for first, and then it can learn the rest better.
 
Brian Rumbles:
I see, so it can still analyze the whole population and find ranges or trends. In my original post I was just thinking that maybe the network needs to be told by a human what to look for first, and then it can learn the rest better.

I've heard this many times. People often have a hard time "trusting the black box" and want to add more elements of control because they THINK it helps.

In fact, the network IS told what to look for the moment we define the labels. But it's not told how to get there. And it's better to leave it alone to do its job.

This is again similar to our brain. We trust our brain "black box" without needing to know the "data flow pathways" or what any particular brain cell is doing in this moment. And this machinery magically works although there is no external control (no alien remote control, I hope ;-) )

 

I don't disagree with you; if it can find the conditions better by itself, then cool. But additionally, we as humans, who know what we want, could tell it what to look for, for our purposes.


An additional alternative thought: if the NN could be trained to tell us which direction the trend is, this could be very useful, because it can at least be proven how accurate the network was; then we can follow the trend, or buy low outliers in an uptrend, for example.
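A quick sketch of how that accuracy claim could be checked out of sample (the helper names and the three-class labeling are my own assumptions):

```python
# Sketch: measure how often the network's predicted regime matched the
# manually assigned label on data it never saw during training.
import numpy as np

def directional_accuracy(y_true, y_pred):
    """Overall hit rate across all classes."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float((y_true == y_pred).mean())

def per_class_accuracy(y_true, y_pred, n_classes=3):
    """Hit rate per class (0 = down, 1 = range, 2 = up), so a bias shows up."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return [float((y_pred[y_true == c] == c).mean()) for c in range(n_classes)]

# example with made-up labels and predictions
y_true = [2, 2, 1, 0, 2, 1]
y_pred = [2, 1, 1, 0, 2, 2]
print(directional_accuracy(y_true, y_pred))   # 0.67
print(per_class_accuracy(y_true, y_pred))     # [1.0, 0.5, 0.67]
```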

 
NELODI:
Can I turn my strategy into an EA? No, I can't. Because I can't copy my Brain into a computer.

*Elon Musk has left the chat*

 

After reading a few pages and philosophizing about the idea of feeding a trend-filtering NN into a main NN, I wonder how long... or should I rather say how many NNs with "useful input" (e.g. one for trend-direction analysis, one for momentum analysis, ...) it would make sense to feed into a main NN until it begins to overfit (I know, I know -> "absolutely case related").

How long (and with what kinds of "sub-NN purposes") could we keep generating something better and better?

One metalabeling classifier proved its usefulness, but what if we turn the tables and make the metalabeling NN the main actor in the role play (maybe in this case as a regressor)?


Btw, any fundamental/logical relation between assets being fed together into the NN could prove useful, since oil is macroeconomically mainly traded in dollars (or was it gold? no no, that was a hundred years ago).


But let's grab this net-of-nets idea again: since I don't have the time or computational power to figure out the maximum number of sub-nets before overfitting sets in, consider it worth a try ;)
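To make the idea concrete, here is a rough functional-API sketch of such a net of nets; the two sub-net purposes, all sizes and the regression target are assumptions on my part, not a tested setup:

```python
# Rough sketch of the "net of nets" idea: the outputs of small sub-networks
# (one for trend direction, one for momentum state) are concatenated and fed
# into a main network that acts as a regressor. All names, sizes and the
# choice of exactly two sub-nets are illustrative placeholders.
import tensorflow as tf

def make_sub_net(n_in, n_out, name):
    inp = tf.keras.layers.Input(shape=(n_in,))
    h = tf.keras.layers.Dense(16, activation="relu")(inp)
    out = tf.keras.layers.Dense(n_out, activation="softmax")(h)
    return tf.keras.Model(inp, out, name=name)

WINDOW = 64                                              # lookback length (assumption)
trend_net = make_sub_net(WINDOW, 3, "trend_direction")   # down / range / up
momentum_net = make_sub_net(WINDOW, 3, "momentum_state") # falling / flat / rising

raw = tf.keras.layers.Input(shape=(WINDOW,), name="raw_window")
features = tf.keras.layers.Concatenate()([trend_net(raw), momentum_net(raw)])
h = tf.keras.layers.Dense(32, activation="relu")(features)
main_out = tf.keras.layers.Dense(1, name="forecast")(h)  # e.g. next-bar return

main_net = tf.keras.Model(raw, main_out, name="main_net")
main_net.compile(optimizer="adam", loss="mse")
main_net.summary()
```

Whether two, five or ten such sub-nets is where overfitting starts would only show up in out-of-sample testing, as said above.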

Inb4 Chris reminds me that price information is lost, because we couldn't convert it back into raw price. (But in this case, is that really necessary? Why not find out ;) )



edit: Okay, a short thing to add: if we have ECN brokers, why don't we use higher-quality data from the internet for the timespans with horrible data? (whether it be Tickstory, Dukascopy or another broker's data...)

Reason: