Machine learning in trading: theory, models, practice and algo-trading - page 2495

 
mytarmailS #:

..

Yes I do, I do, calm down already)))

Repeat Uncle Piggy :-)
 
Trash. Looks like neural networks really do rule here. At least a couple of mine are at that level...
 
eccocom #:
Trash. Looks like neural networks really do rule here. At least a couple of mine are at that level...

the whole collective farm and the factory shift laughed (yep, everyone gathered) :-)

show me a signal more than a year old that really runs on neural networks

Usually they just mention "neural networks, deep learning" and other buzzwords, and when you look closer it's martingale, locking, grids and simple MAs. In reality, simple algorithms and simple scams rule.

---

However long I watch this thread, the only result I see is beautiful (really very good) articles and the personal self-development of their authors

 
Maxim Kuznetsov #:

the whole collective farm and the factory shift laughed (yep, everyone gathered) :-)

show me a signal more than a year old that really runs on neural networks

Usually they just mention "neural networks, deep learning" and other buzzwords, and when you look closer it's martingale, locking, grids and simple MAs. In reality, simple algorithms and simple scams rule.

---

However long I watch this thread, the only result I see is beautiful (really very good) articles and the personal self-development of their authors

That's not what I meant...

As for neural networks, I wrote above that the standard result is around 55-56%, which is basically nothing.

 
eccocom #:
As for models, it's not about them, it's about the fact that AI is essentially an approximator...

That's exactly the point: it is about models built on the dependencies found with AI -- and you don't have to go from a model to the NS, but from the NS to a model that works under the specific current conditions... Of course, conditions may change...

when I considered equilibrium on the classical AS segment and disequilibrium on the Keynesian segment -- according to Keynes -- I already realized that an NS for establishing that fact is, globally, of little use to me...

(and approximation is only one of AI's skills, plus optimization, etc.),

Evgeniy Ilin #:

Everything you need is in OHLC; the rest of the data is derived from it. The main thing is a flexible algorithm that finds a way to convert OHLC into data that carries more weight; again, that is the machine's job. Besides, only OHLC is almost the same everywhere; if you look at tick volumes or anything else, that data differs, as do spreads and so on. Everything you need is on the chart in the terminal.

But if you have the computing power for deep learning, then it's probably possible... (it even reminded me, as if through a fog, how to find the curvature of a "straight line" using the 1st and 2nd derivatives, thanks to the author of the quote))
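
As a minimal sketch of that curvature idea (and of converting OHLC into a derived series), assuming plain NumPy and a Close price array; the formula is just the standard plane-curve one, nothing from this thread:

import numpy as np

def curvature(close):
    # 1st and 2nd finite-difference derivatives of the close series
    y = np.asarray(close, dtype=float)
    dy = np.gradient(y)            # slope per bar
    d2y = np.gradient(dy)          # change of slope per bar
    # curvature of a plane curve: k = |y''| / (1 + y'^2)^(3/2)
    return np.abs(d2y) / (1.0 + dy ** 2) ** 1.5

# usage: kappa = curvature(closes); large kappa marks sharp bends in the "straight line"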

And on a more modest scale, you can fit a sample for the current moment... then retrain once the yield curve starts to flatten out...

but if you start with genuinely meaningful features (logically and economically), then the machine itself will figure out what is moving the market at the moment (i.e., which driver it currently depends on most)...

Mihail Marchukajtes has a very interesting/logical approach, and thanks to him for explaining the polynomial (I'll try to remember it too)... simply a man in the know!... but if another person neither knows nor wants to know how past statistics (NOT yet a model!!!) work, so that they can be reasonably processed for transfer into the future (with 50/50 probability, of course), then he will blame the model, the neuron, the market and the conditions... and by the way, a change in the latter is precisely what produces good entries! - The scheme of functioning of any ecosystem, whatever its structure, is always the same: Conditions -> Reactions -> Consequences (including environmental ones).

A trader's main skill is knowing when NOT to enter the market... imho!

P.S.

Whether the current dependencies and their interactions have actually been captured in a model or not is the bigger question... And it does not depend on the NS, but rather on the approximating abilities of the brain of the person researching the sample, who uses the NS as a tool rather than as a basis for entries, and does not delegate responsibility for the analysis and conclusions to it.

Machine learning in trading: theory, practice, trading and more
  • 2021.10.25
  • www.mql5.com
Good afternoon everyone. I know there are machine learning and statistics enthusiasts on this forum...
 
JeeyCi #:

That's exactly the point: it is about models built on the dependencies found with AI -- and you don't have to go from a model to the NS, but from the NS to a model that works under the specific current conditions... Of course, conditions may change...

when I considered equilibrium on the classical AS segment and disequilibrium on the Keynesian segment -- according to Keynes -- I already realized that an NS for establishing that fact is, globally, of little use to me...

(and approximation is only one of AI's skills, plus optimization, etc.)

but if you have the computing power for deep learning, then it's probably possible... (it even reminded me, as if through a fog, how to find the curvature of a "straight line" using the 1st and 2nd derivatives, thanks to the author of the quote))

And on a more modest scale, you can fit a sample for the current moment... then retrain once the yield curve starts to flatten out...

but if you start with genuinely meaningful features (logically and economically), then the machine itself will figure out what is moving the market at the moment (i.e., which driver it currently depends on most)...

Mihail Marchukajtes has a very interesting/logical approach, and thanks to him for explaining the polynomial (I'll try to remember it too)... simply a man in the know!... but if another person neither knows nor wants to know how past statistics (NOT yet a model!!!) work, so that they can be reasonably processed for transfer into the future (with 50/50 probability, of course), then he will blame the model, the neuron, the market and the conditions... and by the way, a change in the latter is precisely what produces good entries! - The scheme of functioning of any ecosystem, whatever its structure, is always the same: Conditions -> Reactions -> Consequences (including environmental ones).

A trader's main skill is knowing when NOT to enter the market... imho!

P.S.

Whether the current dependencies and their interactions have actually been captured in a model or not is the bigger question... And it does not depend on the NS, but rather on the approximating abilities of the brain of the person researching the sample, who uses the NS as a tool rather than as a basis for entries, and does not delegate responsibility for the analysis and conclusions to it.

I see. So where will the euro go today?
 
eccocom #:
Read the TensorFlow documentation; practically everything there is a constructor... They really are black boxes. If you're interested, I can give you the code of a perceptron written by hand; by the way, it's all matrix calculations, that's what the whole thing is built on.
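
As a rough illustration of that "constructor" feel, a minimal tensorflow.keras sketch (the layer sizes, the 10 random features and the up/down target are placeholders, not anything from this thread):

import numpy as np
from tensorflow import keras

# toy data: 10 features per bar, binary up/down target (random, just to show the plumbing)
X = np.random.randn(1000, 10).astype("float32")
y = (np.random.rand(1000) > 0.5).astype("float32")

# the "constructor": stack ready-made blocks, compile, fit
model = keras.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))   # on random labels accuracy stays around 50%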

By the way, for tensorflow.keras (the kind Evgeny Dyuka uses):

At the moment Keras doesn't provide any functionality to extract the feature importance

SKLearn looks more interesting here - Interpretation of machine learning results (maybe the library itself is not great, but the evaluation logic is laid out)
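
A minimal sketch of that kind of evaluation with scikit-learn's permutation_importance (the RandomForest and the synthetic data are placeholders; the same call works for any fitted estimator with a score):

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

X = np.random.randn(500, 10)
y = 0.5 * X[:, 0] + 0.1 * np.random.randn(500)   # only feature 0 actually matters

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# shuffle each feature in turn and measure how much the score drops
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature {i}: importance {result.importances_mean[i]:.3f}")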

p.s.

you didn't attach...

Feature Importance Chart in neural network using Keras in Python
  • 2017.07.27
  • stackoverflow.com
I am using python(3.6) anaconda (64 bit) spyder (3.1.2). I already set a neural network model using keras (2.0.6) for a regression problem (one response, 10 variables). I was wondering how can I generate feature importance chart like so:
 
JeeyCi #:

By the way, for tensorflow.keras (the kind Evgeny Dyuka uses) -

SKLearn looks more interesting here - Interpretation of machine learning results (maybe the library itself is not great, but the evaluation logic is laid out)

p.s.

you didn't attach...

You're getting into some kind of jungle. The problems with the NS predicting (or rather, not predicting) lie at a much simpler level and have nothing to do with the NS itself.

https://neurohive.io/ru/tutorial/prostaja-nejronnaja-set-python/

It's a simple perceptron))).

My tutorial example is in Jupyter; I don't want to copy bits and pieces out of it, and I don't use GitHub.
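
For reference, a sketch in the spirit of that linked article (a single-layer perceptron in plain NumPy; the toy data where the output copies the first input column follows Milo Spencer-Harper's original post, the rest is paraphrased):

import numpy as np

# training set: the output simply copies the first input column
X = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
y = np.array([[0, 1, 1, 0]]).T

rng = np.random.default_rng(1)
weights = 2 * rng.random((3, 1)) - 1                   # random weights in [-1, 1]

for _ in range(10000):
    output = 1 / (1 + np.exp(-X.dot(weights)))         # sigmoid forward pass
    error = y - output
    weights += X.T.dot(error * output * (1 - output))  # weight update via the sigmoid derivative

# new situation [1, 0, 0] -> prediction close to 1
print(1 / (1 + np.exp(-np.array([1, 0, 0]).dot(weights))))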

A simple neural network in 9 lines of Python code
  • 2019.02.14
  • neurohive.io
The article shows how to write your own simple neural network in Python from scratch, without using any neural network libraries. If you don't have a neural network of your own yet, here it is in just 9 lines of code. This is a translation of the post How to build a simple neural network in 9 lines of Python code by Milo Spencer-Harper; the link to the original is in the footer...
 
eccocom #:

You're getting into some kind of jungle.

into the logic... an NS is used when you need to get around the absence of a formula describing the dependence of a feature on a factor... weighting is used... but before and after the NS, standard/classical statistical processing still applies... For example, having only the PDF = F'(x) = dF(x)/dx (we don't even need the CDF, because all conclusions from population analysis are drawn from the PDF) and having volatile data, first of all I need to bring the distributions to a common form so they can be analysed jointly - and weighting helps here (I'm not aiming at rigorous mathematics)... but the analysis itself has nothing to do with the NS, nor do its conclusions... such an estimate may be crude, but classical statistics is imperfect too (e.g., using logarithms of increments by itself already introduces a trend bias into the conclusions - a purely mathematical defect)... and any model has its Assumptions...

market participants do NOT wait for predictions; they assess risk and volatility and make their trading (and hedging) decisions based on that... It's just that in this analysis there are two variable factors - volatility and the time window - and the NS helps bring the samples to a common form (though you can use GARCH as well) so they can be analysed jointly within one statistical model, and helps determine the horizon... at those moments when there is no mathematical formula - and you don't need one (everything in this world changes)... but weighting, weighting and weighting again (for the sake of compressing things down to a regression) - to make joint analysis within one statistical model possible, preferably without noise, or at least with noise minimised...
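
A minimal sketch of that GARCH route, assuming the third-party arch package (the GARCH(1,1) order, the constant mean and the toy return series are arbitrary choices, not anything from this thread):

import numpy as np
from arch import arch_model

returns = np.random.randn(2000) * 0.5          # toy return series, in percent

# fit GARCH(1,1) and use the conditional volatility to standardise the returns
am = arch_model(returns, mean="Constant", vol="Garch", p=1, q=1)
res = am.fit(disp="off")
standardised = (returns - res.params["mu"]) / res.conditional_volatility

# returns from calm and volatile stretches now sit on one common scale
print(standardised.std())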

The logic of Bayesian inference for the Gaussian case is worth keeping in mind...

The main thing, I suppose, is to build an NS architecture such that variance does not grow as the signal passes through the layers on its way to the output... imho (why accumulate it when it is already there as it is - a rhetorical question)... and then the classical logic of statistics takes over... and even very deep history does not contain enough samples to analyse robust moments properly (all sorts of things happen in life)... I guess outliers can occur in Mihail Marchukajtes's classification model as well... (I need to think about how the sequent should handle them?)

so far that's my imho... I'll also take a look at import scipy.stats as stats
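
A minimal scipy.stats sketch of that "bring the distributions to a common form" idea (the Student-t fit and the synthetic fat-tailed returns are assumptions, not anything from this thread):

import numpy as np
import scipy.stats as stats

returns = 0.003 * np.random.standard_t(df=4, size=2000)   # toy fat-tailed returns

# fit a distribution and evaluate its PDF (the derivative of the CDF mentioned above)
df, loc, scale = stats.t.fit(returns)
x = np.linspace(returns.min(), returns.max(), 200)
pdf = stats.t.pdf(x, df, loc=loc, scale=scale)

# probability integral transform: the fitted CDF maps the sample to ~Uniform(0, 1),
# so data from different regimes land on one common scale
u = stats.t.cdf(returns, df, loc=loc, scale=scale)
print(stats.kstest(u, "uniform"))   # sanity check against uniformity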

p.s.

thanks for the link

 
JeeyCi #:

into the logic... an NS is used when you need to get around the absence of a formula describing the dependence of a feature on a factor... weighting is used... but before and after the NS, standard/classical statistical processing still applies... For example, having only the PDF = F'(x) = dF(x)/dx (we don't even need the CDF, because all conclusions from population analysis are drawn from the PDF) and having volatile data, first of all I need to bring the distributions to a common form so they can be analysed jointly - and weighting helps here (I'm not aiming at rigorous mathematics)... but the analysis itself has nothing to do with the NS, nor do its conclusions... such an estimate may be crude, but classical statistics is imperfect too (e.g., using logarithms of increments by itself already introduces a trend bias into the conclusions - a purely mathematical defect)... and any model has its Assumptions...

market participants do NOT wait for predictions; they assess risk and volatility and make their trading (and hedging) decisions based on that... It's just that in this analysis there are two variable factors - volatility and the time window - and the NS helps bring the samples to a common form (though you can use GARCH as well) so they can be analysed jointly within one statistical model, and helps determine the horizon... at those moments when there is no mathematical formula - and you don't need one (everything in this world changes)... but weighting, weighting and weighting again - to make joint analysis within one statistical model possible, preferably without noise, or at least with noise minimised...

The main thing, I suppose, is to build an NS architecture such that variance does not grow as the signal passes through the layers on its way to the output... imho (why accumulate it when it is already there as it is - a rhetorical question)... and then the classical logic of statistics takes over... and even very deep history does not contain enough samples to analyse robust moments properly (all sorts of things happen in life)

so far that's my imho... I'll also take a look at import scipy.stats as stats

p.s.

thanks for the link

When will the practical application begin?