Machine learning in trading: theory, models, practice and algo-trading - page 335

 
The ABCs of AI: "Recurrent neural networks"
  • 2016.11.04
  • Taras Molotilin
  • nplus1.ru
N+1, together with MIPT, continues to introduce readers to the most striking aspects of modern research in artificial intelligence. Last time we wrote about the general principles of machine learning, and specifically about the backpropagation method for training neural networks. Today our interlocutor is Valentin Malykh, junior research...
 

Did things get better after using the trailing stop?

And maybe the condition for closing the deal should be adjusted, because a trailing stop is too easy a way out, without trying to sort things out?

For an automated system, closing a deal is tied to some condition.

On a real account, the stops will wear your nerves down...

 
Renat Akhtyamov:

Did things get better after using the trailing stop?

And maybe the condition for closing the deal should be adjusted, because a trailing stop is too easy a way out, without trying to sort things out?

For an automated system, closing a deal is tied to some condition.

On a real account, the stops will wear your nerves down...

Why? A trailing stop is a great thing. I always use an adaptive trailing stop immediately after opening all my trades, even when I trade manually, so I don't have to worry.
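The trailing-stop idea discussed above can be sketched in a few lines: track the highest price seen since entry and exit once price retreats by a fixed distance. This is a minimal illustrative sketch, not the poster's actual adaptive trail; the price series and the trail distance are made-up values.

```python
# Minimal trailing stop for a long position: the stop follows the highest
# price seen so far, and the position closes when price falls back to it.
# The 'trail' distance and the price series are illustrative assumptions.

def trailing_stop_exit(prices, entry, trail):
    """Return (exit_price, bar_index) where the trailing stop fires, or None."""
    highest = entry
    for i, p in enumerate(prices):
        highest = max(highest, p)          # ratchet the peak upward only
        stop = highest - trail             # stop level trails the peak
        if p <= stop:
            return p, i                    # stop hit: close the position
    return None                            # stop never hit in this window

prices = [100.5, 101.2, 102.0, 101.1, 100.7]
result = trailing_stop_exit(prices, entry=100.0, trail=1.0)
print(result)  # → (100.7, 4)
```

An "adaptive" version would vary `trail` with volatility (e.g. a multiple of ATR) instead of keeping it fixed.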
 
Renat Akhtyamov:

Did things get better after using the trailing stop?

And maybe the condition for closing the deal should be adjusted, because a trailing stop is too easy a way out, without trying to sort things out?

For an automated system, closing a deal is tied to some condition.

On a real account, the stops will wear your nerves down...


I don't know, maybe the parameters were wrong on the VPS... Later, the updated version will be added to the monitoring with normal risk.

There are a lot of improvements; in particular, I will add more inputs.

 

I started to train a neural network. The planned task is not working yet: it says the data is not in the right format, and I don't understand what it wants yet(.

But here is an example I made for the [3,4,1] network.

//Expected NN response
t  = 0.3    1.    0.    0.    0.5 
//Actual response of the trained NN
ans  = 0.3223616    0.9315578    0.1047166    0.0809235    0.4536240  

It seems to be fine.
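One way to back up "seems to be fine" with a number is to compare the quoted target and output vectors directly. The values below are copied verbatim from the post; the metrics themselves (MSE and maximum absolute error) are a standard check, not something from the original.

```python
# Compare the expected NN response with the trained network's actual
# response (values taken verbatim from the post above).
t   = [0.3, 1.0, 0.0, 0.0, 0.5]                                 # expected
ans = [0.3223616, 0.9315578, 0.1047166, 0.0809235, 0.4536240]   # actual

mse = sum((a - b) ** 2 for a, b in zip(t, ans)) / len(t)        # mean squared error
max_err = max(abs(a - b) for a, b in zip(t, ans))               # worst element

print(f"MSE: {mse:.6f}, max error: {max_err:.6f}")
```

The MSE comes out around 0.005 and the worst single deviation is about 0.1, which supports the impression that the network has learned the mapping reasonably well.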

 
Yuriy Asaulenko:

I started to train a neural network. The planned task is not working yet: it says the data is not in the right format, and I don't understand what it wants yet(.

But here is an example I made for the [3,4,1] network.

Seems to be fine.


Google's TensorFlow also looks good, but the installation and Python make it less convenient.
 
Maxim Dmitrievsky:

Google's TensorFlow also looks good, but the installation and Python make it less convenient.

This is the Scilab neural network toolbox. Now the main task suddenly (unexpectedly)) started to learn. Apparently I messed something up somewhere).

In general, they say there are plenty of neural network libraries around, including in C++. But I was not looking for them.

 
Yuriy Asaulenko:

This is the Scilab neural network toolbox. Now the main task suddenly (unexpectedly)) started to learn. Apparently I messed something up somewhere).

In general, they say there are plenty of neural network libraries around, including in C++. But I was not looking for them.

You can find them everywhere; these days there is a real neural network boom.)
 

The experiment of training a neural network (NN) to recognize the crossing of two MAs failed. Training was done to recognize only the upward crossing.

For the experiment, a 3-3-3-1 NN was selected and tested on training and recognition of artificially created patterns. However, after training on the MAs, not a single crossover was recognized. The reason: the NN needs higher-contrast inputs, and it ignores differences of 0.01-0.1 between them.

For the given NN structure, reliable recognition is quite achievable when the signal difference is at least 0.2-0.3.
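The contrast problem described above is usually handled by rescaling the inputs rather than changing the network: if two MAs differ by only 0.01-0.1, min-max normalizing the difference over a window stretches it to the full [0, 1] range. This is a minimal sketch of that idea; the window values are made up for illustration.

```python
# Min-max scaling of a window of MA differences: tiny absolute values
# (hard for the NN to distinguish) become a full-range, "high-contrast"
# pattern. The raw_diff values below are illustrative, not real data.

def normalize(values):
    """Min-max scale a window of values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]       # degenerate flat window
    return [(v - lo) / (hi - lo) for v in values]

# Raw fast-MA minus slow-MA over 5 bars: differences of only 0.01-0.1
raw_diff = [-0.03, -0.01, 0.02, 0.05, 0.08]

scaled = normalize(raw_diff)
print(scaled)  # spans the full [0, 1] range
```

After scaling, the upward crossover (the sign change in the middle of the window) shows up as a swing from 0 to 1 instead of a 0.05 wiggle, which is well above the 0.2-0.3 difference the post says this network needs.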

 

I'm starting to study neural networks.

I am looking at variants that can be implemented directly in MT5.

I am interested in the variant using ALGLIB (https://www.mql5.com/ru/articles/2279), but from the description it follows that it is a sequential (feedforward) network without feedback connections. Its drawback is that it can be trained by only one processor thread (the one that runs the Expert Advisor with the neural network).

I think it would not be too difficult to add 2 hidden sequential layers to the neural network from the article https://www.mql5.com/ru/articles/497 and then train it either by full brute force or by the genetic optimizer in the tester. In that case, many more computational threads can be used (your CPU cores, machines on the network, and the cloud). Do I understand this correctly?
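The "train by brute force or genetic search in the tester" idea amounts to treating the network weights as ordinary optimization parameters and searching them directly, with no backpropagation. A minimal sketch of that approach, with plain random search standing in for the tester's genetic optimizer and a made-up toy dataset:

```python
# Direct search over perceptron weights instead of gradient training.
# A plain random search stands in for the genetic optimizer here; the
# toy dataset (label 1 when x0 + x1 > 1) is an illustrative assumption.
import random

random.seed(0)

data = [((0.2, 0.3), 0), ((0.9, 0.8), 1), ((0.2, 1.0), 1), ((0.4, 0.1), 0)]

def predict(w, x):
    s = w[0] * x[0] + w[1] * x[1] + w[2]   # weighted sum plus bias
    return 1 if s > 0 else 0

def errors(w):
    """Fitness: number of misclassified samples (lower is better)."""
    return sum(predict(w, x) != y for x, y in data)

best_w, best_err = None, len(data) + 1
for _ in range(1000):                       # 1000 random candidate weight sets
    w = [random.uniform(-2, 2) for _ in range(3)]
    e = errors(w)
    if e < best_err:
        best_w, best_err = w, e

print("misclassified:", best_err)
```

This is exactly why the approach parallelizes so well: each candidate weight set is evaluated independently, so the tester can farm candidates out to local cores and cloud agents. The price is that the search space grows quickly with the number of weights, so deep networks become impractical to train this way.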

How can I add manual markup of correct answers (buy and sell locations) to the training of such a network?

Maybe there is already a library for a multilayer sequential network somewhere?

Also, I don't quite understand the usefulness of inner (hidden) layers for forex/stock trading purposes. Does it make sense to add them? Why?
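On the usefulness of hidden layers: the classic answer is that a network with no hidden layer can only draw a single straight decision boundary, so it cannot learn targets like XOR, while one hidden layer already can. A hand-wired sketch (the weights below are chosen by hand for illustration, not learned):

```python
# Why hidden layers matter: XOR is not linearly separable, so no
# single-layer perceptron computes it, but two hidden units do.
# All weights/thresholds here are hand-picked for illustration.

def step(x):
    return 1 if x > 0 else 0

def xor_mlp(a, b):
    # Hidden layer: h1 fires on (a OR b), h2 fires on (a AND b)
    h1 = step(a + b - 0.5)
    h2 = step(a + b - 1.5)
    # Output layer: (a OR b) AND NOT (a AND b)  ->  XOR
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))
```

For trading inputs the same principle applies: if the buy/sell condition depends on a *combination* of inputs (e.g. "MA slope up AND price below band"), hidden layers let the network form such combinations; without them it can only weigh inputs independently.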

Neural network: Self-optimizing Expert Advisor
  • 2016.10.03
  • Jose Miguel Soriano
  • www.mql5.com
Is it possible to create an Expert Advisor that, following code commands, would automatically optimize its criteria for opening and closing positions at certain intervals? What happens if we implement in the EA a neural network (multilayer perceptron) which, as a module, would analyze the history and evaluate the strategy? The code can be instructed to re-optimize the neural network monthly (weekly, daily, or hourly) and then continue working. This makes it possible to create a self-optimizing Expert Advisor.