Machine learning in trading: theory, models, practice and algo-trading - page 2452

 
Andrey Dik #:

Max, don't jump to conclusions.

The word "maybe" in your post suggests that you have not thought about such a statement of the question, right?

The neural network in general and MLP in particular is a very flexible thing, and the same set of features can be partitioned equally by the same network but at different values of neuron weights.... Right? - So the question arises, which of these variants of the set of weights is more robust?

And with the second, who answered my post, I do not think it is necessary to maintain a dialogue anymore - it is meaningless.

I have not come across such information. I know that boundary values are bad for the activation functions and, as a consequence, for learning.
 
Maxim Dmitrievsky #:
I have wondered about the weights myself. I know that boundary values are bad for the activation functions and, as a consequence, for learning.

Yes, the weights are interesting, but what values the network's outputs take is even more interesting.
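
To make the saturation point concrete: below is a hypothetical NumPy sketch (not code from this thread) showing that large weight magnitudes push a saturating activation such as tanh into its flat region, so the outputs pile up near ±1 and the gradient of the activation collapses towards zero.

```python
import numpy as np

# Hypothetical sketch: why extreme weight values hurt a saturating
# activation such as tanh, and what the outputs look like.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 10))            # toy inputs

for scale in (0.1, 1.0, 10.0):             # moderate vs. extreme weight magnitudes
    w = rng.normal(scale=scale, size=10)   # illustrative weight vector
    z = x @ w                              # pre-activations
    y = np.tanh(z)                         # neuron outputs
    grad = 1.0 - y ** 2                    # d tanh(z)/dz
    print(f"scale={scale:5.1f}  mean|output|={np.mean(np.abs(y)):.3f}  "
          f"saturated={np.mean(np.abs(y) > 0.99):.1%}  mean grad={np.mean(grad):.4f}")
```

At the largest scale almost every output is saturated and the average gradient is close to zero, which is one concrete way boundary values harm both the activation functions and learning.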

 

To understand whether the quality of learning depends on the values of the weights, we need to measure this dependence experimentally.

We know what the values of the weights are; we don't know what the quality of learning is, hence we need to define it...

So learning quality, network performance, etc. is a measure by which we can express our expectations of the network (e.g. the network's error on new data).

Okay, now that we have a definition of learning quality, we can measure the dependence of learning quality on the values of the weights.

But if we have developed a measure of network quality (learning quality), then why do we need to look at the weights at all, if we can just choose the best value of the quality measure...

I don't know how handicapped you have to be not to understand these simple things, plus three people have already told you the same thing...
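
For reference, the experiment described in this post is easy to run as a sketch. Everything below is assumed for illustration - synthetic data and scikit-learn's MLPRegressor rather than anything specific to this discussion:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Train many nets that differ only in their random initialization and check
# whether a weight statistic (mean |w|) is related to the error on held-out
# ("new") data, which serves as the measure of learning quality.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=600)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

weight_metric, test_error = [], []
for seed in range(30):                                   # 30 independently trained nets
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=seed)
    net.fit(X_tr, y_tr)
    weight_metric.append(np.mean([np.mean(np.abs(W)) for W in net.coefs_]))
    test_error.append(np.mean((net.predict(X_te) - y_te) ** 2))  # "learning quality"

print("corr(mean|w|, test error) =", np.corrcoef(weight_metric, test_error)[0, 1])
```

If the correlation comes out negligible, that in itself answers the question of whether the weight statistic carries any information about quality on new data.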


 
mytarmailS #:

To understand whether the quality of learning depends on the values of the weights, we need to measure this dependence experimentally.

We know what the values of the weights are; we don't know what the quality of learning is, hence we need to define it...

So learning quality, network performance, etc. is a measure by which we can express our expectations of the network (e.g. the network's error on new data).

Okay, now that we have a definition of learning quality, we can measure the dependence of learning quality on the values of the weights.

But if we have developed a measure of network quality (learning quality), then why do we need to look at the weights at all, if we can just choose the best value of the quality measure...

I don't know how handicapped you have to be not to understand these simple things, plus three people have already told you the same thing...

I'll try to put it very simply for the especially gifted: knowing the metrics of the weights and outputs that affect the quality of learning, we can obtain a more robust network BEFORE checking it on unfamiliar data - data which, at the time of training, healthy people do not yet have.
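
The post does not specify which "metrics of weights and outputs" are meant. As a purely hypothetical illustration, these are the kinds of statistics that can be computed from a trained network and its outputs on the training set alone, i.e. before any unfamiliar data is available; whether any of them actually predicts robustness is exactly what is being disputed:

```python
import numpy as np

def pre_test_metrics(weight_matrices, train_outputs):
    """Hypothetical statistics of a trained network's weights and of its outputs
    on the TRAINING data, i.e. quantities available before any unfamiliar data
    exists. Whether any of them predicts robustness is the open question here."""
    all_w = np.concatenate([W.ravel() for W in weight_matrices])
    out = np.asarray(train_outputs, dtype=float)
    return {
        "mean_abs_weight": float(np.mean(np.abs(all_w))),
        "max_abs_weight": float(np.max(np.abs(all_w))),
        "weight_std": float(np.std(all_w)),
        "output_std": float(np.std(out)),
        "output_saturation": float(np.mean(np.abs(out) > 0.99)),  # for tanh-like outputs
    }

# Example with random placeholders instead of real trained weights/outputs:
print(pre_test_metrics([np.random.randn(5, 8), np.random.randn(8, 1)],
                       np.tanh(np.random.randn(200))))
```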

 
We are waiting for successful results of your research. Preferably in the form of a signal.
 
Andrey Dik #:

I'll try to put it very simply for the especially gifted: knowing the metrics of the weights and outputs that affect the quality of learning, we can obtain a more robust network BEFORE checking it on unfamiliar data - data which, at the time of training, healthy people do not yet have.

1) define robustness, at least for yourself, so that it can be coded

2) how do you expect to measure the network's performance on new data without using that new data (the test set)?

 
mytarmailS #:

1) define robustness, at least for yourself, so that it can be coded

2) how do you expect to measure the network's performance on new data without using that new data (the test set)?

1. Well-mannered people address strangers with the polite form of "you".

2. Apologize for your behavior.

Complete these two points and then, perhaps, I will answer your questions.

 
elibrarius #:
We are waiting for successful results of your research. Preferably in the form of a signal.

And good luck to you.

I don't have to prove anything to anyone. If you are interested, you will think about it, and if you are not interested, you will pass by.

 
Andrey Dik #:

Is the average absolute value of a neural network's weights an indicator of the quality of its training?

Suppose there are two identical networks trained on the same data; one of them has a value of 0.87 and the other 0.23. Which one is trained better?

It is doubtful that a scalar quantity can uniquely characterize a vector or a polynomial - which is what NN training produces.
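
A one-line illustration of this objection (an assumed NumPy example): two weight vectors can share the same scalar summary while being entirely different.

```python
import numpy as np

# Two weight vectors with the same scalar summary (mean |w| = 0.9)
# but a completely different structure - the scalar cannot tell them apart.
w1 = np.array([0.9, -0.9, 0.9, -0.9])
w2 = np.array([3.6, 0.0, 0.0, 0.0])
print(np.mean(np.abs(w1)), np.mean(np.abs(w2)))   # 0.9 0.9
```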

 
Igor Makanu #:

It is doubtful that a scalar quantity can uniquely characterize a vector or a polynomial - which is what NN training produces.

Absolutely.
However, nobody forbids using more complex and/or composite metrics. The main idea is to add metrics of the weights and of the NN outputs to the fitness function.
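
The thread does not spell out how such metrics would enter the fitness function. One plausible reading - an assumption, and essentially a form of regularization - is a weighted sum of the training error and penalties built from the weight and output statistics; the names and coefficients below are invented for illustration:

```python
import numpy as np

def fitness(train_error, weight_matrices, train_outputs, lam_w=0.01, lam_o=0.01):
    """Hypothetical fitness function in the spirit of the proposal: training
    error plus penalties built from weight and output metrics. The penalty
    coefficients lam_w / lam_o are invented for illustration; penalising
    mean |w| like this is essentially L1-style regularization."""
    all_w = np.concatenate([W.ravel() for W in weight_matrices])
    weight_penalty = np.mean(np.abs(all_w))                         # metric of the weights
    output_penalty = np.mean(np.abs(np.asarray(train_outputs)) > 0.99)  # output saturation share
    return train_error + lam_w * weight_penalty + lam_o * output_penalty
```

An optimizer would then minimize this combined value instead of the raw training error, so that large weights and saturated outputs are penalized during training itself.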