Machine learning in trading: theory, models, practice and algo-trading - page 176

 
Alexey Burnakov:
1) No comments. This is bullshit.

2) Well, for one thing, what you are describing is dimensionality reduction. These are steps BEFORE training. And about the properties of the network itself I have heard nothing...

3) What a load of bullshit. Can you say something about the L1 and L2 norms for the loss function?

On Kaggle, that regression was done precisely with an L1-weighted fitness function. And the top spots were taken by people who built their fits on past price data.

And what can you add about the quality metrics achieved at the top of that competition? Or is this just more bullshit? The roar is usually looser...
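For reference, a minimal sketch (Python with NumPy; the weighting scheme and toy data are illustrative assumptions, not the competition's exact specification) of an L1-weighted fitness function, i.e. a weighted mean absolute error:

```python
import numpy as np

def weighted_mae(y_true, y_pred, weights):
    """L1-weighted fitness function: average of weight * |prediction error|."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return np.mean(weights * np.abs(y_true - y_pred))

# Toy data: small "returns" with one large outlier.
rng = np.random.default_rng(0)
y_true = rng.normal(0.0, 0.01, size=1000)
y_true[0] = 0.25                       # a single heavy-tailed observation
y_pred = np.zeros_like(y_true)         # the trivial all-zero forecast
w = np.ones_like(y_true)               # equal weights, for illustration only

print("weighted MAE of the zero forecast:", weighted_mae(y_true, y_pred, w))
```

Because the errors enter linearly rather than squared, a single extreme observation dominates this score far less than it would dominate an L2 (squared-error) metric.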

1) Err... Well, make up your mind about the denials, mister. Last time I agreed with you that I deliberately misled everyone by telling lies, but now you deny it, so it turns out I was telling the truth? Or explain the definitions of the terms "delusion" and "nonsense", and how "bullshit" differs from "delusion".

2) Again I repeat: I was not talking about a single ANN of any type. It is a "black box" of many elements, simple and complex, some of which are ANNs. What exactly to call the "steps before training" in this case is hard to pin down and is not necessary; to be honest, that term has not been in use for a long time, people usually say preprocessing or feature extraction. In general, separating this process out makes sense for small systems, and even for small ones not always: a CNN, for example, extracts features on its convolution layers by learning the filters, and that is also learning (see the sketch below). About the properties of the "network" I can only say that there are a lot of them; you cannot list them all.

3) -> 1) Tell you about the L1/L2 loss functions)))) I will not say anything new about them. I suggest you start with Vorontsov's ML course; I would not advise you to google papers right away, you may lack the background to understand them, you need to start with the basics.
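Regarding point 2, a minimal PyTorch sketch (an assumed illustration, not the specific "black box" being discussed) of convolution layers acting as a feature extractor whose filters are learned by the same backpropagation as the classifier head, i.e. the "preprocessing" is itself part of training:

```python
import torch
import torch.nn as nn

# Convolutional feature extractor: the filter weights are trainable parameters,
# so feature extraction is learned rather than being a fixed preprocessing step.
features = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
)

# Classifier head on top of the extracted feature maps.
head = nn.Sequential(nn.Flatten(), nn.Linear(16 * 7 * 7, 10))

model = nn.Sequential(features, head)   # one model, trained end to end by one loss

x = torch.randn(4, 1, 28, 28)           # batch of 4 single-channel 28x28 inputs
logits = model(x)
print(logits.shape)                     # torch.Size([4, 10])
```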

 
J.B:

I suggest you start with Vorontsov's ML course; I would not advise you to google papers right away, you may lack the background to understand them, you need to start with the basics.

I do not quite understand your reference to Konstantin Vorontsov.

If we look at the SOFT section, the list given there speaks to the squalor of this resource, or more precisely to its Soviet science-like squalor: lots of formulas, theorems, etc., etc., and a complete absence of tools for practical work.

That is my first, perhaps superficial, impression of this resource.

[Link preview: www.machinelearning.ru — Error: "The requested article title is incorrect, empty, or the interlanguage or interwiki title is specified incorrectly. The title may contain invalid characters."]
 
J.B:

1) Err... Well, make up your mind about the denials, mister. Last time I agreed with you that I deliberately misled everyone by telling lies, but now you deny it, so it turns out I was telling the truth? Or explain the definitions of the terms "delusion" and "nonsense", and how "bullshit" differs from "delusion".

2) Again I repeat: I was not talking about a single ANN of any type. It is a "black box" of many elements, simple and complex, some of which are ANNs. What exactly to call the "steps before training" in this case is hard to pin down and is not necessary; to be honest, that term has not been in use for a long time, people usually say preprocessing or feature extraction. In general, separating this process out makes sense for small systems, and even for small ones not always: a CNN, for example, extracts features on its convolution layers by learning the filters, and that is also learning. About the properties of the "network" I can only say that there are a lot of them; you cannot list them all.

3) -> 1) Tell you about the L1/L2 loss functions)))) I will not say anything new about them. I suggest you start with Vorontsov's ML course; I would not advise you to google papers right away, you may lack the background to understand them, you need to start with the basics.


You can't say anything new because you're too lazy to insert quotes from the wiki and you can't formulate an answer yourself.

One more time. Turn to the Kaggle Winton challenge: the problem there is regression of price increments, if you are not too lazy to at least read it. What can you say about the organizers' choice of the L1 loss function? Its properties? Why is it applicable to the market? How would you comment on the results at the top of the leaderboard? (See the sketch appended after this post.)

Do not send me links to courses. That is standard material which I have been using in my work for a long time.

A rotten excuse, really. "I'm not telling you anything new." At least say something, schoolboy.
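On the properties question raised above: a minimal NumPy sketch (synthetic heavy-tailed data, not the competition's) showing that among constant forecasts the L1 loss is minimized near the median while the L2 loss is minimized near the mean, so a single extreme return pulls the L2-optimal forecast much further than the L1-optimal one:

```python
import numpy as np

rng = np.random.default_rng(42)
returns = rng.standard_t(df=3, size=1_000) * 0.01   # heavy-tailed synthetic "returns"
returns[0] = 5.0                                     # one extreme outlier

candidates = np.linspace(-0.05, 0.05, 2001)          # constant forecasts to evaluate

l1 = np.array([np.mean(np.abs(returns - c)) for c in candidates])   # L1 / MAE
l2 = np.array([np.mean((returns - c) ** 2) for c in candidates])    # L2 / MSE

print("L1-optimal constant:", candidates[np.argmin(l1)], " median:", np.median(returns))
print("L2-optimal constant:", candidates[np.argmin(l2)], " mean:  ", np.mean(returns))
```

The L2 optimum tracks the mean, which the single outlier drags by about 0.005; the L1 optimum stays at the median, which barely moves. That robustness to fat tails is the usual argument for an L1-type metric on return data.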
 
SanSanych Fomenko:

I did not quite understand your reference to Konstantin Vorontsov.

If we look at the SOFT section, the list given there speaks to the squalor of this resource, or more precisely to its Soviet science-like squalor: lots of formulas, theorems, etc., etc., and a complete lack of tools for practical work.

That is my first, perhaps superficial, impression of this resource.

The respected Konstantin Vyacheslavovich teaches a good introductory course on machine learning in Russian; I advised Alexey Burnakov to watch it in order to start mastering ML, since for some reason he decided that I should be the one to teach him what the L1 and L2 norms are.
 
Alexey Burnakov:

Say something, schoolboy.

Take a deep breath for half a minute... It will pass. So you embarrassed yourself a little, as if that never happens))) And it is right that you do not keep it to yourself and switched straight to rudeness; when resentment is held in, it ruins your health later((((.

Go ahead and poke at me, insult me, even paste a picture of something nasty and link it to me, blow off steam completely! I can feel that the R^2 and L1 business gives you no peace; you need a little psychotherapy.

 
J.B:
The respected Konstantin Vyacheslavovich teaches a good introductory course on machine learning in Russian; I advised Alexey Burnakov to watch it in order to start mastering ML, since for some reason he decided that I should be the one to teach him what the L1 and L2 norms are.

On what grounds do you call him "respected"?

Here I have every reason not to respect your Vorontsov.

Let me explain.

If we look at the list of his degrees and titles, a person of this level cannot fail to know about machine learning in R. That is thousands of functions and hundreds of monographs which are not mentioned (though mentioning them is obligatory) on a site called "machine learning". For a professor of the Russian Academy of Sciences, a Doctor of Physical and Mathematical Sciences, and so on and so forth, this is unthinkable! In Soviet times, if such dense ignorance had come to light, one would have become a laughingstock for the rest of one's life and would never have been able to wash it off.

That's what it's all about.

 
SanSanych Fomenko:

On what grounds do you call him "respected"?

Here I have every reason not to respect your Vorontsov.

Let me explain.

If we look at the list of his degrees and titles, a person of this level cannot fail to know about machine learning in R. That is thousands of functions and hundreds of monographs which are not mentioned (though mentioning them is obligatory) on a site called "machine learning". For a professor of the Russian Academy of Sciences, a Doctor of Physical and Mathematical Sciences, and so on and so forth, this is unthinkable! In Soviet times, if such dense ignorance had come to light, one would have become a laughingstock for the rest of one's life and would never have been able to wash it off.

This is what it's all about.

I know Konstantin Vyacheslavovich personally, not by hearsay; I know the level of his competence and his ability to solve REAL problems, and in terms of respect I put him on a par with LeCun and Hinton. With your metric for assessing ML professionals I do not agree, and I will explain why.

Right now, for example, Python for ML is popular with the general public. IMHO, scripting languages such as Python, Matlab, R, etc. are good mainly for beginners, to QUICKLY try many STANDARD tools and immediately visualize the results. In production engineering, unfortunately, practice has shown that the standard tools are very rarely used as they are: everything ends up self-made and constantly modified, and you need to work fast; you probably know what a pain it is to rewrite from a scripting language into C++ every time (((( But it is possible AND DESIRABLE to write a set of components, shells and widgets so that you can work from your own environment, in native C++ (for standard tasks), as comfortably as in Python; this brings HUGE dividends compared with prototyping in Python and rewriting it in C++ every time. Changes, and there will always be changes, are made almost an order of magnitude faster. So I am not surprised that a machine learning guru might not know R or Python.

PS: About the "thousands of functions and monographs". Well, in MQL, for example, thousands, probably tens of thousands, of functions, classes and programs have also been written; how many of them (written by someone else) do you actually use?

 
J.B.:

I know Konstantin Vyacheslavovich personally, not by hearsay; I know the level of his competence and his ability to solve REAL problems, and in terms of respect I put him on a par with LeCun and Hinton. With your metric for assessing ML specialists I do not agree, and I will explain why.

Right now, for example, Python for ML is popular with the general public. IMHO, scripting languages such as Python, Matlab, R, etc. are good mainly for beginners, to QUICKLY try many STANDARD tools and immediately visualize the results. In production engineering, unfortunately, practice has shown that the standard tools are very rarely used as they are: everything ends up self-made and constantly modified, and you need to work fast; you probably know what a pain it is to rewrite from a scripting language into C++ every time (((( But it is possible AND DESIRABLE to write a set of components, shells and widgets so that you can work from your own environment, in native C++ (for standard tasks), as comfortably as in Python; this brings HUGE dividends compared with prototyping in Python and rewriting it in C++ every time. Changes, and there will always be changes, are made almost an order of magnitude faster. So I am not surprised that a machine learning guru might not know R or Python.

PS: About the "thousands of functions and monographs". Well, in MQL, for example, thousands, probably tens of thousands, of functions, classes and programs have also been written; how many of them (written by someone else) do you actually use?

You don't get it: a guru may not use them, but a GURU MUST MENTION ANALOGOUS WORK. And if he does not mention it, he is not a guru. So it is not about R at all. It is about the principle.

I know this kind of crowd well from Soviet times. All of these "gurus" sat in educational institutes and were engaged in a science that had nothing to do with anything, and it was absolutely impossible to make them turn their eyes to practice. In Soviet times such a crowd could still somehow be brought to their senses, but now, judging by the numerous cross-references, they are an isolated caste. And this caste has, in general, no relation to world science, to the world trend: they have isolated themselves and just keep writing something there, writing...

PS.

Take the requirements for articles published on this site: a mandatory reference to analogous work at the beginning of the article.

 

Guys, stop spewing bile at each other; if only 10% of this energy could be channelled in the right direction... ehhh, if we could all unite and do something together, ehhh, I wish it were possible...

Guys, I have what I think is a pretty strong idea for how to extract patterns from data. I have been nurturing it for a long time, and I am sure that if this method does not work, then no ML will. But I need help with the implementation, and even with the raw computing power.

If anyone is ready to join the development, get in touch...

 

SanSanych Fomenko:

A guru may not use them, but a GURU MUST MENTION ANALOGOUS WORK. And if he does not mention it, he is not a guru.

I see what you mean: in this he is an ass, but otherwise he is a guru, I cannot take that away from him.

SanSanych Fomenko:

All of these "gurus" sat in educational institutes and were engaged in a science that had nothing to do with anything. And it was absolutely impossible to make them turn their eyes to practice.

How do you like this: http://www.forecsys.ru/ru/site/projects/safran/ in '97? Will you say that has nothing to do with real work?))

