Machine learning in trading: theory, models, practice and algo-trading - page 2450
Is the average absolute value of a neural network's weights an indicator of the quality of its training?
Suppose there are two identical networks, trained on the same data; one has an average of 0.87 and the other 0.23. Which one is trained better?
That's a tough one...
Well, probably the one that guesses better; shouldn't that be the first thing that comes to mind?
In theory, a balanced one should be closer to 0.5, if you take a simple network like an MLP, for binary classification.
Please explain your point.
On the contrary, I think that the closer the value is to 1.0, the more decisively the neural network is trained; and if it is around 0.0, then it is sort of "hesitant", if you can put it that way about a neural network.
Everyone is smart in hindsight, and neurons even more so...
No. Especially in regression, where you are fitting planes (ax+b) in space. What is the point of these indirect metrics, anyway? Run the model on points outside the training sample and everything becomes clear.
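A minimal sketch of "run it on points outside the training sample": hold back part of the data, fit on the rest, and judge the model only by its accuracy on the held-back points. The data and the least-squares "model" here are hypothetical toys, not anything from the thread.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical toy data: the label depends only on the first feature
X = rng.normal(size=(300, 3))
y = (X[:, 0] > 0).astype(int)

# holdout split: train on the first 200 points, validate on the remaining 100
X_tr, y_tr = X[:200], y[:200]
X_va, y_va = X[200:], y[200:]

# "train" a trivial linear classifier by least-squares fit of centered labels
w, *_ = np.linalg.lstsq(X_tr, y_tr - 0.5, rcond=None)

# what matters is accuracy on points the fit never saw
acc_tr = ((X_tr @ w > 0).astype(int) == y_tr).mean()
acc_va = ((X_va @ w > 0).astype(int) == y_va).mean()
print(acc_tr, acc_va)
```

If the validation accuracy collapses relative to the training accuracy, the model memorized rather than learned; no statistic of the weights themselves tells you that.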
There is also a "forward" kind of smart: it is called validation. And weights have nothing to do with predictive ability; they can be anything.
Weights are, roughly speaking, a formula, and a prediction is the result of evaluating that formula. Comparing one with the other == schizophrenia.
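A small numpy sketch of "they can be anything" (toy data and weights are hypothetical): for a threshold classifier, rescaling the weights changes their average absolute value arbitrarily while leaving every prediction, and therefore the accuracy, exactly the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical toy binary-classification data
X = rng.normal(size=(200, 4))
true_w = np.array([1.0, -2.0, 0.5, 3.0])
y = (X @ true_w > 0).astype(int)

def predict(w, X):
    # threshold classifier: sign of the linear score
    return (X @ w > 0).astype(int)

# same direction, wildly different average |weight|
w_small = true_w * 0.1    # mean |w| = 0.1625
w_large = true_w * 10.0   # mean |w| = 16.25

acc_small = (predict(w_small, X) == y).mean()
acc_large = (predict(w_large, X) == y).mean()

# identical accuracy despite a 100x difference in mean |weight|
print(np.abs(w_small).mean(), acc_small)
print(np.abs(w_large).mean(), acc_large)
```

Since the sign of the score is invariant under positive scaling of the weights, mean |weight| carries no information here about how well the formula predicts.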
"Zhi, shi" are spelled with an "i" (a Russian spelling rule).
Thank you, Russian is not my language, I did not learn it...
It's all ahead of me.