Machine learning in trading: theory, models, practice and algo-trading - page 633

 
Maxim Dmitrievsky:

of a smart man.

He's talking unscientific nonsense. He brings up water memory, then sneers at the electron microscope while showing results that are far from modern ones. I haven't watched further; I'm afraid he'll move on to flat earth :)

 
Dr. Trader:

He's talking unscientific nonsense. He brings up water memory, then sneers at the electron microscope while showing results that are far from modern ones.

What is different about the modern ones? Do they use different electrons?

The water part was ironic, in case you didn't get it.

 

He shows some blurry picture at the molecular level. And here is a video from 5 years ago where they had already achieved recognition of individual atoms (not molecules) and built a 3D model of their positions: https://www.youtube.com/watch?v=yqLlgIaz1L0

It's unclear what magnetic waves he expects to see on a CD, where bits are represented by physical pits in the plastic. And why, in his picture of a CD under a microscope, do the bits of information run in a small circle, when at that scale they would visually lie in a straight line? What a show of false facts and pictures taken from the Internet.

Maxim Dmitrievsky:

The water part was ironic, in case you didn't get it.

He has irony everywhere. What exactly did he mean by the water experiment? He mentioned it and moved on without drawing any conclusions; whether he was serious or not is unclear. The bit about the molecules was apparently ironic too; it seems he wants to say they don't exist and that only he knows what is really there.

The first 20 minutes of the video are enough to start doubting his arguments and losing confidence in him.

 
Dr. Trader:

He shows some blurry picture at the molecular level. And here is a video from 5 years ago where they had already achieved recognition of individual atoms (not molecules) and built a 3D model of their positions.

It's unclear what magnetic waves he expects to see on a CD, where bits are represented by physical pits in the plastic. And why, in his picture of a CD under a microscope, do the bits of information run in a small circle, when at that scale they would visually lie in a straight line? What a show of false facts and pictures taken from the Internet.

He has irony everywhere. What exactly did he mean by the water experiment? He mentioned it and moved on without drawing any conclusions; whether he was serious or not is unclear. The bit about the molecules was apparently ironic too; it seems he wants to say they don't exist and that only he knows what is really there.

The first 20 minutes of the video are enough to start doubting his arguments and losing confidence in him.

The video is about randomness )), about everything and nothing, but essentially about the idea that "random" is whatever we cannot yet calculate. The molecular-level and atomic-lattice footage is only shown as an illustration. I didn't pay much attention to the disk.

He has another video in which he explains that light is a particle and not a wave, using interference as an example ), and that waves (and fields) were invented because we cannot calculate all the possibilities.

By the way, the tracks there run in straight lines; the image is just shot at an angle.


 

Okay, I'll come at it from the other side. Suppose I have a set of 100 inputs. I calculate the entropy of each input and get values from -10 to 10. Question: which inputs is it preferable to take?

Let's say 10 of the inputs are below zero and the rest are above, BUT all values lie between -10 and 10...
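(Not from the original post, just a minimal sketch of how such a per-input entropy ranking might be computed, assuming the 100 inputs sit in the columns of a numeric array; the random data, the bin count and the ascending ranking are all illustrative. Note that a histogram/probability based Shannon entropy like this is never negative, so values running from -10 to 10 point to some other estimator, e.g. differential entropy.)

import numpy as np

def column_entropy(x, bins=20):
    """Shannon entropy (in bits) of one input, estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                        # drop empty bins; 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

X = np.random.rand(500, 100)            # placeholder for the real 100 inputs
H = np.array([column_entropy(X[:, i]) for i in range(X.shape[1])])
ranking = np.argsort(H)                 # input indices from lowest to highest entropy
print(ranking[:10], H[ranking[:10]])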

 

And another thing... I can't manage to calculate the mutual information... or rather the conditional probability, needed afterwards to compute the entropy and the mutual information (MI).

Can someone explain it in simple terms, or better yet with an example?

First column: 40 rows, the input variable;

second column: 40 rows, the output...

I did a lot of work overnight to pin down a hypothesis, but I'm stuck on these things and can't get past them. Please help, and I'll share my thoughts on the hypothesis...
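(Since an example was asked for, here is a minimal sketch, mine rather than from the thread, of estimating the mutual information between a 40-row input column and a 40-row output column through a joint histogram; the bin count and the toy data are assumptions, and the conditional probabilities p(y|x) come out of the same joint table.)

import numpy as np

def mutual_information(x, y, bins=8):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                    # joint probabilities p(x, y)
    px = pxy.sum(axis=1, keepdims=True)          # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)          # marginal p(y)
    # the conditional probability p(y|x) would be pxy / px (where px > 0);
    # MI itself only needs the joint and the marginals:
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))   # in bits

x = np.random.rand(40)                  # first column: the input variable
y = x + 0.3 * np.random.rand(40)        # second column: the output (made dependent here)
print(mutual_information(x, y))

With only 40 rows the estimate is quite noisy, so the choice of bin count matters a lot.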

 
Mihail Marchukajtes:

And another thing... I can't manage to calculate the mutual information... or rather the conditional probability, needed afterwards to compute the entropy and the mutual information (MI).

Can someone explain it in simple terms, or better yet with an example?

First column: 40 rows, the input variable;

second column: 40 rows, the output...

I did a lot of work overnight to pin down a hypothesis, but I'm stuck on these things and can't get past them. Please help, and I'll share my thoughts on the hypothesis...

Were you drinking last night?

 
Maxim Dmitrievsky:

Were you drinking last night?

No... not drunk..... why?

I just couldn't sleep, and as a rule at times like that only one thought comes to mind: it's not worth wasting the time when there's so much work to do...

 

Well, entropy and mutual information seem to have little to do with econometrics at all.

Correlation, covariance and variance are what's usually used there :) and not just a little of them.

Cross-entropy does seem to be a concept used in training neural networks... but why do you need it?
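(For reference, my wording rather than the thread's: in neural network training, cross-entropy is simply the loss between the true labels and the predicted probabilities. A tiny sketch of the binary case, with made-up numbers:)

import numpy as np

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    p = np.clip(p_pred, eps, 1 - eps)    # clip to avoid log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

print(binary_cross_entropy(np.array([1, 0, 1]), np.array([0.9, 0.2, 0.7])))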

 
Maxim Dmitrievsky:

Well, entropy and mutual information seem to have little to do with econometrics at all.

Correlation, covariance and variance are what's usually used there :) and not just a little of them.

Cross-entropy does seem to be a concept used in training neural networks... but why do you need it?

Well, first of all: with MI I plan to reduce the number of inputs, so that training gets easier.

I selected only those inputs whose entropy is negative and close to zero. Training then began converging, with enviable consistency, to the same model parameters.
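(A minimal sketch of that input-reduction idea, under the assumption that it boils down to ranking each input by its estimated mutual information with the target; scikit-learn's mutual_info_classif is used here only as one readily available estimator, and the data are placeholders.)

import numpy as np
from sklearn.feature_selection import mutual_info_classif

X = np.random.rand(200, 100)                  # stand-in for the inputs
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)     # stand-in for the binary output

mi = mutual_info_classif(X, y)                # MI estimate of each input vs. the target
keep = np.argsort(mi)[::-1][:10]              # indices of the 10 most informative inputs
print(keep, mi[keep])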

Getting a model is half the battle; the other half of this far-from-simple matter is choosing exactly the one that will keep working in the future. To do that I save the network's output data not in its final form, but as doubles, and look at the entropy of the network and how it changes over time. By the way, here is a table of how the entropy changes over time for the binary polynomial outputs:

step  out1  out2  out3  out4  out5  out6  out7  out8

1 7.481151166 5.100318157 4.593448434 8.798740335 10.34478836 4.480187448 4.462974562 4.864834535

2 7.675977242 5.395113191 4.647719201 9.658965819 -17.34873011 4.511112896 4.529873469 4.925396515

3 7.512766799 5.414556649 4.644887426 8.929776132 -976.6274612 4.644286062 4.386822711 5.050380326

4 8.045096956 5.079259638 4.671147058 9.875423555 9.171932774 4.623802531 3.917309752 4.941859173

5 8.045378868 5.007650592 4.290382249 9.433280634 10.64451391 4.647512921 3.790881638 4.990994671

6 7.814542877 3.644626791 4.344130499 8.980821417 10.5023546 4.637264293 3.831404183 5.032854966

7 -26.55886859 3.781908903 4.516251137 8.797781513 10.54684501 4.883377949 3.86512079 4.659267439

8 -161.3020423 3.718875753 4.564760685 9.184890078 9.157325707 5.074360669 3.785251605 4.364874679

9 1.909633919 3.825969935 4.579305659 8.739113103 8.280835877 5.009919646 4.242339336 4.39432571

10 6.213306097 -10.87341467 5.067862079 10.18574585 8.07128492 1.73846346 4.299916662 4.567998062

11 6.171390883 1.962160448 5.081660438 8.650951109 7.510213446 1.596086413 4.313971802 4.55943716

12 6.120246115 3.948723109 4.801258198 8.235748448 7.127388358 1.698956287 4.082715891 4.781776645

13 6.138878328 -3.010948518 4.804114984 8.523101895 7.177670414 1.698630529 4.082338047 4.82267867

14 6.212129971 -3.922803979 4.757739216 9.25848968 7.66609198 1.698756132 4.125811197 4.874060339

15 6.090848662 -7.954277387 4.76183886 10.81234021 7.701949544 1.540056412 4.062605741 4.915433819

16 5.99824787 -59.32132062 4.806934783 9.083600192 7.697975097 1.540406949 4.097070448 4.978901083

17 5.83493287 4.565768504 4.899180184 -28.38726036 7.830286358 1.543100257 4.25790422 5.043798266

18 5.758509171 -3.4626244 4.895859118 -1237.359668 8.484082841 1.706466252 4.177809837 5.037940939

19 5.744674247 -12.48734205 4.961865536 1.569990079 8.915892511 1.682437372 4.336780002 5.057555915

20 5.738253623 -10.20442198 4.98732747 9.795996355 8.842880831 1.539687763 4.344159624 5.106441146

21 5.731628697 -1.706645474 5.005196184 10.75926151 8.059670516 1.432952506 4.391768977 4.729395732

22 5.874802768 -0.43939394479 4.970298578 10.33058781 7.832786294 1.431618527 4.568893332 4.715744749

23 5.953727915 -3.949602879 5.017109405 9.668521648 7.941416688 1.425216096 4.646327857 4.745979757

Surprisingly, at some point, after adding one more value, the entropy abruptly becomes negative. What can this be related to?

If we assume that a positive value is a measure of uncertainty and a negative value is a measure of order, then we should pick the network readings with the lowest entropy; however, I believe that going too far into the negative zone is also not good. So there are two options here: either choose the network with the lowest entropy, or the one whose entropy is closest to zero...
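(Not an answer from the thread, just one possibility worth checking: a probability-based Shannon entropy is never negative, but the differential entropy of continuous values can be, and it goes negative exactly when the values become tightly concentrated. A quick illustration with a Gaussian:)

import numpy as np

# Differential entropy of a Gaussian: H = 0.5 * ln(2 * pi * e * sigma^2).
# It turns negative once sigma is small enough, i.e. when the values are
# tightly clustered, so a negative reading need not mean "order" in any deeper sense.
def gaussian_diff_entropy(sigma):
    return 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

for sigma in (10.0, 1.0, 0.1, 0.01):
    print(sigma, gaussian_diff_entropy(sigma))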

And once the MI calculation is set up, it will be possible to see how much mutual information the network's output carries about the input. I think this approach will be able to dot a lot of i's.

Getting a lot of models is not difficult; correctly choosing one is quite another matter.

I'm waiting for comments on this post, and most importantly an explanation of why this might be happening. Theories, hypotheses, etc. would be appreciated. Thanks!!!
