Machine learning in trading: theory, models, practice and algo-trading - page 2559

 
mytarmailS #:

looking for the road to the nuthouse? :)

The problem is that I'm not the one who comes up with all these definitions, so I can only refer you to Google for more precise information. I can find a link to that article later. But it is about entropy analysis of series, stationary and not-so-stationary.
 
Maxim Dmitrievsky #:
The problem is that I'm not the one who comes up with all these definitions, so I can only refer you to Google for more precise information. I can find a link to that article later

Well, that's the funny thing...

You said "regularity," I don't know what that is, so I didn't ask you and I went and googled it and it turns out that's not what you meant. If I didn't know that, we'd be using the same concept (regularity) implying different things, so we'd never get to the point...

And all because of one pseudoscientific idiot...

 
mytarmailS #:

I want to train an HMM, but in an unusual way, through a fitness function, with genetics or something else...

I want to form the state transition matrices myself... There is a package, and it has these matrices, but I don't really understand what to change and where; can you help with this?

In an HMM, the fitness function is the log-likelihood. If you come up with a custom fitness function, that is already some other method.
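For concreteness, here is a minimal sketch (plain base R, no packages; all names are illustrative, not RHmm's API) of the log-likelihood that Baum-Welch maximizes, computed with the scaled forward algorithm for a 1-d Gaussian HMM:

# Log-likelihood of observations under a fixed 1-d Gaussian HMM,
# computed with the scaled forward algorithm (illustrative helper)
hmm_loglik <- function(obs, initProb, transMat, means, sds) {
  alpha <- initProb * dnorm(obs[1], means, sds)        # forward probabilities at t = 1
  ll <- log(sum(alpha)); alpha <- alpha / sum(alpha)   # rescale to avoid underflow
  for (t in 2:length(obs)) {
    alpha <- as.vector(alpha %*% transMat) * dnorm(obs[t], means, sds)
    ll <- ll + log(sum(alpha)); alpha <- alpha / sum(alpha)
  }
  ll
}

# toy call: two states with made-up parameters
hmm_loglik(rnorm(100), c(0.5, 0.5),
           matrix(c(0.9, 0.1, 0.2, 0.8), 2, byrow = TRUE),
           means = c(-1, 1), sds = c(1, 1))

Baum-Welch just climbs this quantity; a "custom fitness" would have to replace it.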

 
Aleksey Nikolayev #:

In an HMM, the fitness function is the log-likelihood. If you come up with a custom fitness function, that is already some other method.

So what do we need to optimize?

fit <- HMMFit(x, nStates = 3)
> fit

Call:
----
HMMFit(obs = x, nStates = 3)

Model:
------
3 states HMM with 5-d gaussian distribution

Baum-Welch algorithm status:
----------------------------
Number of iterations : 60
Last relative variation of LLH function: 0.000001

Estimation:
-----------

Initial probabilities:
          Pi 1          Pi 2  Pi 3
 2.636352e-255  2.770966e-50     1

Transition matrix:
          State 1    State 2    State 3
State 1 0.1864987 0.76046799 0.05303333
State 2 0.2539474 0.60377350 0.14227910
State 3 0.6191488 0.07157308 0.30927815

Conditionnal distribution parameters:

Distribution parameters:
  State 1
           mean  cov matrix                                               
      0.4752939  0.97587370  0.02993559 -0.21805741  0.25639651  0.1567241
     -0.5686039  0.02993559  0.85342747  0.43374921  0.18220534 -0.2149688
      0.3739333 -0.21805741  0.43374921  0.58127533 -0.01600787 -0.2097350
     -0.3833589  0.25639651  0.18220534 -0.01600787  1.13979299 -0.3723484
     -0.5871168  0.15672407 -0.21496881 -0.20973503 -0.37234835  1.0462750

  State 2
            mean  cov matrix                                               
      0.07949112  1.14644170  0.21413163 -0.05544488 -0.02902406 0.04179052
      0.15306029  0.21413163  0.84865045 -0.19661403 -0.12397740 0.01617397
     -0.03560680 -0.05544488 -0.19661403  1.25872915  0.15638695 0.03917204
      0.07304988 -0.02902406 -0.12397740  0.15638695  0.70073838 0.02934227
      0.35500064  0.04179052  0.01617397  0.03917204  0.02934227 0.65031019

  State 3
           mean  cov matrix                                              
     -0.5093426  0.60603137 -0.21462708  0.06322606  0.27231407 0.1076386
      0.1526545 -0.21462708  0.56847783 -0.06347737 -0.15941211 0.2161427
     -1.0672876  0.06322606 -0.06347737  0.17662599  0.08658292 0.1981628
      0.7778853  0.27231407 -0.15941211  0.08658292  1.17497274 0.4802186
     -0.2541008  0.10763858  0.21614270  0.19816276  0.48021858 0.7488420


Log-likelihood: -1379.07
BIC criterium: 3118.43
AIC criterium: 2894.14

Here is a model for three states

 
mytarmailS #:

Well, that's the funny thing...

You said "regularity"; I didn't know what that was, so instead of asking you I went and googled it, and it turned out that's not what you meant. If I hadn't realized that, we would be using the same term (regularity) while meaning different things, so we would never reach common ground...

And all because of one pseudoscientific idiot...

The moral of this pseudo-science is that stationarity does not imply predictability, and vice versa :D Markets are unpredictable because they are non-stationary, not non-stationary because they are unpredictable. That's it, I'm tired.
 
Maxim Dmitrievsky #:
That's it, I'm tired

Me too)

 
mytarmailS #:

So what do we need to optimize?

Here is a model for three states

So everything has already been optimized by the Baum-Welch algorithm. The optimal value of the log-likelihood is printed at the bottom of the output, and the parameters (the transition matrix and the rest) have been estimated.
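If I remember the RHmm interface correctly (treat the component names as an assumption), the estimates can be pulled straight out of the returned object:

fit <- HMMFit(x, nStates = 3)
fit$LLH            # the maximized log-likelihood
fit$HMM$transMat   # estimated transition matrix
fit$HMM$initProb   # estimated initial probabilities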

 
By the way, the term "regularization" is also used when describing ridge and lasso regressions) There it means shrinking the coefficients toward zero in order to reduce the variance of the model.
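For example, with the glmnet package, where alpha selects the penalty (a quick sketch; the data here is random, just to show the calls):

library(glmnet)
x <- matrix(rnorm(100 * 10), 100, 10)  # 100 observations, 10 predictors
y <- rnorm(100)
ridge <- glmnet(x, y, alpha = 0)  # L2 penalty: shrinks coefficients toward zero
lasso <- glmnet(x, y, alpha = 1)  # L1 penalty: can set coefficients exactly to zero
coef(lasso, s = 0.1)              # coefficients at penalty strength lambda = 0.1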
 
Aleksey Nikolayev #:

So everything has already been optimized by the Baum-Welch algorithm. The optimal value of the log-likelihood is printed at the bottom of the output, and the parameters (the transition matrix and the rest) have been estimated.

This is just a model fitted with three states, and I want a model that is trained so that my fitness function is satisfied.

Imagine that I train a neural network, changing its weights with a genetic algorithm and watching its fitness.

I want to do the same with an HMM, but I will be changing its transition matrix.


But with the network's weights it is clear what to change, whereas here it is not so clear.

 
mytarmailS #:

This is just a model fitted with three states, and I want a model that is trained so that my fitness function is satisfied.

Imagine that I train a neural network, changing its weights with a genetic algorithm and watching its fitness.

I want to do the same with an HMM, but I will be changing its transition matrix.


But with the network's weights it is clear what to change, whereas here it is not so clear.

What you need is the ability to set a custom fitness function. But HMMFit() doesn't provide that possibility, because it implements Baum-Welch with the LLH hard-wired into it. You can only set some of the Baum-Welch parameters.

You need another package where you can set a user-defined fitness function.
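As a sketch of what that could look like with the GA package: fix the Gaussian emission parameters, encode only the transition matrix as the genome, and let the genetic algorithm maximize a custom criterion instead of the LLH. Everything below (the data, the emission values, the toy fitness) is an illustrative assumption, not a ready recipe:

library(GA)  # genetic algorithm package

# toy data and fixed 1-d Gaussian emissions (made-up values)
x <- rnorm(500)
means <- c(-1, 0, 1); sds <- c(1, 1, 1); nStates <- 3

# genome of nStates^2 positive numbers -> row-stochastic transition matrix
make_transMat <- function(g) {
  m <- matrix(g, nStates, nStates)
  m / rowSums(m)
}

# greedy state decoding via the scaled forward filter
# (Viterbi would be the cleaner choice; this keeps the sketch short)
decode_states <- function(obs, tm) {
  alpha <- rep(1 / nStates, nStates)
  s <- integer(length(obs))
  for (t in seq_along(obs)) {
    alpha <- as.vector(alpha %*% tm) * dnorm(obs[t], means, sds)
    alpha <- alpha / sum(alpha)
    s[t] <- which.max(alpha)
  }
  s
}

# custom fitness instead of LLH: a placeholder that rewards persistent
# states -- substitute your own criterion (P&L, Sharpe, whatever)
fitness_fn <- function(g) {
  s <- decode_states(x, make_transMat(g))
  mean(head(s, -1) == tail(s, -1))  # fraction of steps without a state switch
}

res <- ga(type = "real-valued", fitness = fitness_fn,
          lower = rep(1e-6, nStates^2), upper = rep(1, nStates^2),
          maxiter = 100)
make_transMat(res@solution[1, ])  # the evolved transition matrix

Whether the result still deserves to be called an HMM is exactly the point above: with a custom fitness it becomes some other method that merely reuses the HMM machinery.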
