"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 23

 
Urain:
Come on, let's see what you got.

Also strongly interested in Sparse Nets.

And fuzzy-logic-related models -- hooray!!! I remembered the name of one of the models; here's a link to the description: the Tsukamoto model.

____

These could be quite a bombshell if implemented successfully.

 
I don't understand how an SVM (Support Vector Machine) is fundamentally different from an MLP?
 

TheXpert:

...


For that, the product would have to be at the level of neuro-packages like NSDT -- we won't manage that.

...

Too bad )) At the moment this program is the most suitable for users of any level. It has ranked first in the Stocks & Commodities magazine poll of the best analytical programs among traders for 9 years running. But it is too cool for open source. ))

Renat 2011.10.18 00:45 #

I have an idea to develop a neural network engine of several types, so that any trader can use it with minimal effort.

The code will be provided as MQL5 source and distributed as part of the terminal.

I think that if it is code, then not just any trader will be able to use it -- any trader-programmer, yes. For any trader, it would be fine to include the neural net in a trading system through the MQL5 Wizard, but I suppose the Wizard would have to be significantly modified for that.

I'm reading this article, and at least now I have an idea of how complicated everything is. )

MQL5 Wizard: Creating an Expert Advisor Without Programming
  • 2010.12.15
  • MetaQuotes Software Corp.
  • www.mql5.com
Do you want to test a trading idea quickly without spending time on programming? Select the required type of trading signals in the "MQL5 Wizard", attach the position-management and money-management modules -- and the work is done. Create your own implementations of the modules or order them through the "Jobs" service, and combine the new modules with the existing ones.
 

I propose, though, that we ask for a separate section of the forum; keeping everything in one topic is tedious. As a repository of piled-up thoughts it's okay, but once the discussion begins ...

Let it be public for whoever wants to participate. Make threads about specific aspects of the system, and have the thread admins put the final decisions in the first post as they are worked out. If necessary, arrange a public vote in individual threads, etc.

It would also be nice, by the way, to have a pinned post feature (set by the thread admin) shown on every page of the thread, so that some kind of summary of the discussion is always in front of your eyes.

 

For those who are interested, I will try to explain in a few short lecture-style posts why I consider neural networks based on biological methods of information transformation and using the principle of sparsity to be very promising.

Lecture 1: Biological basis of sparsity in neural networks.

The developing brain of a child goes through a stage in which a large number of synapses (connections between neurons) are created, followed by a stage in which almost half of these connections are removed by adolescence. Many scientists speculate that this removal of synapses is needed to reduce the energy used by the brain, in line with a slowing metabolism and hormonal changes. With more synapses, children's brains are able to remember a great deal of information, which is one explanation for why foreign languages are easier to learn before adolescence. Removing half of the connections by adolescence helps the brain generalize information better.

The mechanism that removes half of the connections in the developing adolescent brain is not yet known. Many believe that metabolic changes reduce the amount of nutrients available to maintain synapses. The limited supply of these substances makes a neuron's input connections compete for their existence. This competition is modeled by competitive learning methods, in which either the sum of the absolute values of a neuron's input weights or the sum of their squares is held constant. These methods are used in self-organizing (unsupervised) networks.

In supervised learning networks (e.g., feed-forward networks), the competition between a neuron's input weights is usually not taken into account. In such networks, connections between neurons are removed after the weights have been trained: either the smallest weights are removed, or the weights with the least effect on the average training error.
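
To make these two mechanisms a bit more concrete, here is a minimal MQL5 sketch (the function names and the per-neuron "budget" parameter are only illustrative assumptions, not part of any agreed engine design): the first function keeps the sum of the absolute values of a neuron's input weights constant, as in competitive learning; the second zeroes out a given fraction of the smallest weights after training.

//--- sketch only: keep the sum of |w[i]| of a neuron's inputs constant
//--- (competitive learning with a limited "nutrient budget")
void NormalizeWeights(double &w[],const double budget)
  {
   double sum=0.0;
   for(int i=0;i<ArraySize(w);i++)
      sum+=MathAbs(w[i]);
   if(sum==0.0)
      return;
   for(int i=0;i<ArraySize(w);i++)
      w[i]*=budget/sum;
  }

//--- sketch only: zero out the given fraction of the smallest weights
//--- after training (magnitude-based pruning)
void PruneSmallestWeights(double &w[],const double fraction)
  {
   int    n=ArraySize(w);
   double a[];
   ArrayResize(a,n);
   for(int i=0;i<n;i++)
      a[i]=MathAbs(w[i]);
   ArraySort(a);                                    // ascending order
   double threshold=a[(int)MathFloor(fraction*(n-1))];
   for(int i=0;i<n;i++)
      if(MathAbs(w[i])<=threshold)
         w[i]=0.0;                                  // "pruned" synapse
  }

With fraction = 0.5 this mimics the removal of roughly half of the connections described above.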

References:

https://en.wikipedia.org/wiki/Synaptic_pruning

Huttenlocher, P. R. (1979).
Synaptic density in the human frontal cortex - developmental changes and effects of age.
Brain Res., 163, 195--205.

Braitenberg, V., Schuz, A. (1998).
Cortex: Statistics and geometry of neuronal connectivity.
Berlin: Springer.

LeCun, Y., Denker, J. S., Solla, S. A., Howard, R. E., Jackel, L. D. (1990).
Optimal brain damage.
In Touretzky, D. S. (Eds), Advances in Neural Information Processing Systems 2, NIPS*89, Morgan Kaufmann, Denver, CO, 598--605.

Hassibi, B., Stork, D. G., Wolff, G. J. (1993).
Optimal brain surgeon and general network pruning.
Proc. IEEE Int. Conf. Neural. Networks, 1, 293--299.

Miller, K. D., & MacKay, D. J. C. (1994).
The role of constraints in Hebbian learning.
Neural Computat., 6, 100--126.

Miller, K. D. (1996).
Synaptic economics: Competition and cooperation in synaptic plasticity.
Neuron, 17, 371--374.

Synaptic pruning - Wikipedia, the free encyclopedia
  • en.wikipedia.org
In neuroscience, synaptic pruning, neuronal pruning or axon pruning refer to neurological regulatory processes, which facilitate changes in neural structure by reducing the overall number of neurons and synapses, leaving more efficient synaptic configurations. Pruning is a process that is a general feature of mammalian neurological development...
 
Mischek:
More imho: you're unlikely to find an outside specialist consultant who meets your requirements. If there is a budget, whatever its size, it is more efficient to divide it among yourselves at the end of the project, evenly or unevenly, based on a subjective assessment by MetaQuotes.

Regarding an external expert, I'll back that up: first, I think you would need not one specialist but at least two (ideally even more), and second, it is not at all certain that his qualifications would be higher than those of at least a couple of local forum members.

sergeev:

It can be done even more simply.

In this situation we go from the particular to the general, trying to abstract towards universal models.

1. Sketch out (on paper, plus a verbal algorithm of the mathematical model) the networks we may implement (topologies and the training methods for them).
2. Find common docking points in the sketched models in order to create abstract engine classes.


This is probably the most correct approach.
 

tol64:

I think that if it is code, then not just any trader will be able to use it -- any trader-programmer, yes. For any trader, the option of including the neural net in a trading system through the MQL5 Wizard would be fine, but then, I suppose, the Wizard would have to be significantly modified.

I'm reading this thread, and now at least I have a good idea of how complicated everything is. )

1. In my opinion, it must be, first of all, a powerful and versatile library with which a trader-programmer (or simply an MQL programmer) can create a neural network of the required complexity and functionality. At this stage I think a core library consisting of a small number of objects has to be created (maximum abstraction and universality are important here); a rough sketch of such core objects is given at the end of this post.

2. At the second step, the library functionality needs to be worked out in more detail and depth (determine the types of networks, the training methods, the topology variants, etc.).

3. At the third step, I think, it has to be decided what exactly will be fed to the inputs and how the training will be done.

4. Finally, the most interesting thing, in my opinion. Together with MQ it would be nice to develop some kind of "Neural Network Wizard" with which a neural network template could be created by specifying all of its characteristics step by step.

From a layman's point of view it should work something like this: start the wizard and specify a network of such and such a type, with so many layers and such and such neurons, processing the parameters of such and such an indicator (or simply analyzing a certain flow of information); at the output we get a certain signal.

The result of the wizard's work should (at least it seems like a good idea to me) be a template that can be used either as a separate module or for building an EA template in the existing MQL5 Wizard.

5. If we are talking about using a neural network when creating an Expert Advisor template with the Wizard, a form will have to be provided in which a neural network (possibly more than one) can be added.

In this case neural network templates will have to be in a certain place (like signals, etc.) and meet certain requirements.

PS

With this approach it would also be good to be able to assess the "importance" of the whole neural network, of a separate layer (or part of a layer), and of an individual neuron.
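
As mentioned in point 1, here is a very rough MQL5 sketch of what "a small number of maximally abstract objects" could look like; the class and method names (CNeuron, CLayer, CNet) are purely illustrative assumptions, not a proposed final interface:

//--- illustrative core objects only, not a final design
class CNeuron
  {
public:
   double            m_weights[];                       // input weights
   virtual double    Activate(const double x) { return(MathTanh(x)); }
   virtual double    Calculate(const double &in[])
     {
      double s=0.0;
      for(int i=0;i<ArraySize(in) && i<ArraySize(m_weights);i++)
         s+=m_weights[i]*in[i];
      return(Activate(s));
     }
  };

class CLayer
  {
public:
   CNeuron          *m_neurons[];                       // neurons of the layer
   virtual void      Calculate(const double &in[],double &out[])
     {
      ArrayResize(out,ArraySize(m_neurons));
      for(int i=0;i<ArraySize(m_neurons);i++)
         out[i]=m_neurons[i].Calculate(in);
     }
  };

class CNet
  {
public:
   CLayer           *m_layers[];                        // layers from input to output
   virtual void      Teach(void) { }                    // training method lives in derived classes
   virtual void      Calculate(const double &in[],double &out[])
     {
      double buf[];
      ArrayCopy(buf,in);
      for(int i=0;i<ArraySize(m_layers);i++)
        {
         m_layers[i].Calculate(buf,out);
         ArrayCopy(buf,out);
        }
     }
  };

A concrete MLP, an echo state network, etc. would then be derived classes that override Teach() and, if needed, Calculate().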

 
gpwr:
That's what I was trying to get you to do :) . Keep up the good work.
 
Vigor:

I suggest, all the same, asking for a separate section of the forum; keeping everything in one topic is tiresome. As a repository of dumped thoughts it's okay, but once the discussion begins...

For storage, here's a login.
 

All these fancy networks are doable -- the most important thing is the trading orientation ))) And that means integration into working EAs. That is, for most typical trading systems the user should not have to do a bunch of auxiliary things such as preprocessing the input data or preparing a training sample; this should be systematized and automated. I.e., if someone decided to feed a waveform to the input, he would not need to first generate a series of its values, then a series of target values (for example, the price change over some number of bars), normalize it all, etc., and only then train the network.

From a purely technical point of view it could look like this: for the input neurons there is a virtual function EnterData that returns a double. If you want to feed in some indicators or whatever, you just write it all in this function.

Similarly, for the output neuron there is an ExitData function that calculates the value being predicted.
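
If I understand the idea correctly, in MQL5 this could be a base class with virtual methods that the user simply overrides in a derived class. The class name CNetTask and the parameterless signatures below are only a sketch following the examples in this post; how the engine tells these functions which bar a training sample refers to is deliberately left out here.

//--- sketch: the engine calls these virtual methods while building the training sample;
//--- the user only overrides them in a derived class
class CNetTask
  {
public:
   virtual double EnterData(void) { return(0.0); }   // value fed to an input neuron
   virtual double ExitData(void)  { return(0.0); }   // target value to be predicted
  };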

For example, I want to predict the price change over 5 bars, so I override the function

double ExitData()
  {
   // open price 5 bars ahead of the sample bar minus the open of the sample bar
   return(Open[-5]-Open[0]);
  }

Or I want to predict volatility

double ExitData()
  {
   // range (highest high minus lowest low) of the 5 bars ahead of the sample bar
   return(High[iHighest(...,5,-5)]-Low[iLowest(...,5,-5)]);
  }

etc.

Also, the training period and the out-of-sample period could be set as properties of the network object. And after training it would be possible to get the characteristics of the equity curve on the out-of-sample data (for example, the profit factor).

I.e.

Net.StartTime          = 2005;   // start of the training period
Net.FinishTime         = 2008;   // end of the training period
Net.StartOutOfSamples  = 2009;   // start of the out-of-sample period
Net.FinishOutOfSamples = 2011;   // end of the out-of-sample period

Net.Teach();          // train on the in-sample period
Net.OutOfSamples();   // run on the out-of-sample period

if(Net.PFOutOfSamples > 3) Print("Good");

Or, if the network does not trade by itself but, for example, predicts volatility, the user overrides the function that assesses the network's quality on the out-of-sample data.

Then, using the standard means of the tester and the optimizer, it is possible to implement a search for the best topology, the choice of network type, and many other things.
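
A sketch of that last point, assuming the hypothetical Net object from the pseudocode above (Create(), Teach(), OutOfSamples() and PFOutOfSamples are not an existing API): the topology parameters become ordinary input variables, and OnTester() returns the out-of-sample profit factor as the custom optimization criterion, so the strategy tester's optimizer does the topology search.

input int HiddenNeurons = 10;   // candidate topology parameter, optimized by the tester
input int NetworkType   = 0;    // 0 = MLP, 1 = RBF, ... (illustrative)

CNet Net;                       // hypothetical network object from the pseudocode above

int OnInit()
  {
   Net.Create(NetworkType,HiddenNeurons);  // hypothetical call: build the network
   Net.Teach();                            // train on the in-sample period
   Net.OutOfSamples();                     // evaluate on the out-of-sample period
   return(INIT_SUCCEEDED);
  }

double OnTester()
  {
   // the optimizer then searches for the topology that maximizes
   // the out-of-sample profit factor
   return(Net.PFOutOfSamples);
  }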
