Machine learning in trading: theory, models, practice and algo-trading - page 523

 
Maxim Dmitrievsky:

It's nothing special, just ordinary curve fitting; it's been discussed a hundred times already. It's elementary to do and of no practical use in forex.


What do you mean, fitting? I did point out that it's an out-of-sample (OOS) section... Where's the fit? If anything, that's a reasonably high level of generalization...

 
Mihail Marchukajtes:

What do you mean, fitting? I did point out that it's an out-of-sample (OOS) section... Where's the fit? If anything, that's a reasonably high level of generalization...


ah... out... I overlooked the OOS :) then it should work on a real account too, even if worse, but still in the plus

for stable results you need cross-validation and an adaptive TS (trading system)... but there are plenty of pitfalls there too
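A minimal sketch of what such cross-validation might look like for time-ordered data, assuming scikit-learn; the features, labels, and model here are synthetic stand-ins, not anyone's actual TS:

```python
# Walk-forward (time-series) cross-validation sketch; data is synthetic.
import numpy as np
from sklearn.model_selection import TimeSeriesSplit
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                         # stand-in for price-based features
y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)   # stand-in labels

scores = []
for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(X):
    # each fold trains strictly on the past and tests on the next segment
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], model.predict(X[test_idx])))

print("per-fold OOS accuracy:", np.round(scores, 3))
```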

The main pitfall I've found in a self-training system: random forests or neural networks train slightly differently each time on the same sample, so the final results can vary a lot. That is, if you run the same TS in the tester several times, the results will differ :)
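The variance is easy to reproduce. A sketch, assuming scikit-learn and synthetic data: the same forest, refit on the same sample with different seeds, scores differently out of sample.

```python
# Run-to-run variance of the same model on the same sample; data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 8))
y = (X[:, 0] * X[:, 1] + rng.normal(scale=0.5, size=600) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)

for seed in range(5):
    model = RandomForestClassifier(n_estimators=50, random_state=seed)
    model.fit(X_tr, y_tr)
    print(f"seed={seed}: OOS accuracy {model.score(X_te, y_te):.3f}")
# Fixing random_state makes one run reproducible; averaging over seeds
# (or raising n_estimators) reduces the variance itself.
```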

And the more I study neural networks, the more I start to like simple logit or linear regression :D

 

Does anyone know what happens to a network when the number of training examples per class differs? I've seen that this leads to a bias and the network starts outputting only one class, and that equalizing the number of examples per class fixes it.

But I want to understand the reason for this.
For example, after seeing 100 cats and 2 dogs, a person pays more attention to the dogs and examines them: "Oh, something new." To a neural network, though, those 2 dogs somehow look like cats. That is, quantity spoils quality.

Is it that too few examples of one class don't allow a single neuron/connection to be allocated for detecting them? Although with softmax an output neuron is allocated, so connections to it do exist.
Or do those 2 examples have 10 very similar ones from the other class that outweigh them?
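A quick empirical check of the effect (not from the thread, just a sketch assuming scikit-learn and synthetic 2-D data): with a 100:2 ratio a small net tends to collapse to the majority class.

```python
# Class-imbalance demo: 100 "cats" vs 2 "dogs"; synthetic data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
cats = rng.normal(loc=0.0, scale=1.0, size=(100, 2))   # majority class
dogs = rng.normal(loc=1.5, scale=1.0, size=(2, 2))     # minority class
X = np.vstack([cats, dogs])
y = np.array([0] * 100 + [1] * 2)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict(dogs))   # often [0 0]: the dogs get "seen" as cats
```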

 
Mihail Marchukajtes:

What do you mean, fitting? I did point out that it's an out-of-sample (OOS) section... Where's the fit? If anything, that's a reasonably high level of generalization...

Well then, I'll also show off my backtest versus forward by month


 
elibrarius:

Does anyone know what happens to a network when the number of training examples per class differs? I've seen that this leads to a bias and the network starts outputting only one class, and that equalizing the number of examples per class fixes it.

But I want to understand the reason for this.
For example, after seeing 100 cats and 2 dogs, a person pays more attention to the dogs and examines them: "Oh, something new." To a neural network, though, those 2 dogs somehow look like cats. That is, quantity spoils quality.

Is it that too few examples of one class don't allow a single neuron/connection to be allocated for detecting them? Although with softmax an output neuron is allocated, so connections to it do exist.
Or do those 2 examples have 10 very similar ones from the other class that outweigh them?


Well, the loss is averaged over the examples, so the second class starts contributing a smaller fraction of it
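That averaging can be countered by re-weighting the loss instead of duplicating data. A minimal sketch, assuming scikit-learn and the same synthetic cats-and-dogs setup; class_weight="balanced" scales each example by n_samples / (n_classes * class_count), so both classes pull on the fit equally.

```python
# With 100 cats and 2 dogs the dogs contribute ~2% of the mean loss;
# class weighting restores equal per-class influence. Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(1.5, 1.0, (2, 2))])
y = np.array([0] * 100 + [1] * 2)

plain = LogisticRegression().fit(X, y)
weighted = LogisticRegression(class_weight="balanced").fit(X, y)
print("plain:   ", plain.predict(X[-2:]))     # minority often misclassified
print("weighted:", weighted.predict(X[-2:]))  # minority regains influence
```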

 
Get My Forex Systems FREE!
  • admin
  • strategy.doubledoji.com
Trading is all about forecasting price. If you have been reading my Forex Strategies 2.0 Blog, you must have seen I am focusing more and more on algorithmic trading. If you can forecast currency pair price with a reasonable degree of accuracy, you can make a lot of pips. Markets have changed a lot. Today algorithms rule the market. Wall Street...
 

Look how far progress has come.


Colorizing a black-and-white photo with a neural network in 100 lines of code
  • 2015.11.17
  • habrahabr.ru
A translation of the article "Colorizing B&W Photos with Neural Networks". Not long ago Amir Avni used neural networks to troll the Reddit thread /r/Colorization, where people who enjoy hand-colorizing historical black-and-white photos in Photoshop gather. Everyone was amazed by the quality of the neural network's work. What takes up to a month of manual work...
 
Morexod:

Look how far progress has come.



It's strange that the eyebrows weren't picked up in the processing, even though a strand of hair was.

 
Maxim Dmitrievsky:

Well then, I'll also show off my backtest versus forward by month


Now that's what I'm talking about... cool. That could quite possibly go on a real account...

 

I don't know what your networks do, but Reshetov's, when the classes are skewed, i.e. when the numbers of ones and zeros in the output variable are unequal, pads the training and test samples with the class that is smaller. Take the cats-and-dogs example: if there are 100 cats and 2 dogs, the sample is topped up with 98 more copies of dogs for balance. It's not a great example, though, because the padding is done in a clever way, not just blindly. As a result we get a sample of 100 different cats and 100 copies of dogs. Like this...
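A hedged sketch of the naive version of this padding (duplicating minority-class rows until the classes are equal), assuming only NumPy; Mihail says Reshetov's tool does the duplication in a cleverer way that isn't specified here, so this shows just the blind variant:

```python
# Naive oversampling: pad the minority class with duplicates until equal.
import numpy as np

def oversample_minority(X, y):
    """Duplicate minority-class rows (sampled with replacement) until both classes match."""
    classes, counts = np.unique(y, return_counts=True)
    minority = classes[np.argmin(counts)]
    deficit = counts.max() - counts.min()
    idx = np.flatnonzero(y == minority)
    extra = np.random.default_rng(0).choice(idx, size=deficit, replace=True)
    return np.vstack([X, X[extra]]), np.concatenate([y, y[extra]])

X = np.arange(204).reshape(102, 2)           # 100 "cats" + 2 "dogs"
y = np.array([0] * 100 + [1] * 2)
Xb, yb = oversample_minority(X, y)
print(np.bincount(yb))                        # [100 100]
```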
