Neural networks. Questions from the experts.

 

You have a two-dimensional solution space in your figure, yet the input is only one value. How can that be?

Maybe it would be worthwhile to get at least minimally acquainted with the topic before continuing the discussion?

 
joo:

You have a two-dimensional solution space in your figure, yet the input is only one value. How can that be?

Maybe it would be worthwhile to get at least minimally acquainted with the topic before continuing the discussion?

When I was writing, I was aware that the figure does not exactly match, but this is just an example of overlapping classes, taken from an article on this forum. It doesn't change the point.

I repeat: there is already a solution to this problem, but it is linear.

--------------------------------------------

OK, I'll draw mine now.

Handmade is always better..... )

 

Just in case, let me explain.

The space is one-dimensional, the data is spread out for clarity.



 
Then the MLP is something like 1-N-1, where 1 is the input, N is the number of neurons in the hidden layer (an S-shaped activation function), and 1 is the output, -1 or 1 (a logical, i.e. step, activation function). N will depend on how much detail the network needs in order to describe the desired function.
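Just as an illustration (this is not code from the thread), here is a minimal sketch of such a 1-N-1 network in Python/NumPy: one scalar input, N sigmoid hidden neurons, and a tanh output squashed to a hard -1/+1 decision. The data, names, and parameters are all hypothetical and chosen only to show the structure being described.

```python
# Hypothetical sketch of the 1-N-1 MLP discussed above:
# 1 input, N sigmoid hidden neurons, 1 output mapped to -1/+1.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data: two overlapping classes labelled -1 and +1.
x = np.concatenate([rng.normal(-1.0, 1.0, 200), rng.normal(1.0, 1.0, 200)])
y = np.concatenate([-np.ones(200), np.ones(200)])

N = 10                     # hidden neurons; the thread tried N from 3 to 140
w1 = rng.normal(0, 1, N)   # input -> hidden weights (input is a scalar)
b1 = np.zeros(N)
w2 = rng.normal(0, 1, N)   # hidden -> output weights
b2 = 0.0
lr = 0.05

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):
    # Forward pass: sigmoid hidden layer, tanh output (range -1..1).
    h = sigmoid(np.outer(x, w1) + b1)        # shape (samples, N)
    out = np.tanh(h @ w2 + b2)               # shape (samples,)

    # Mean squared error against the -1/+1 targets.
    err = out - y

    # Backward pass: hand-derived gradients for this tiny network.
    d_out = err * (1.0 - out**2)             # through the tanh
    grad_w2 = h.T @ d_out / len(x)
    grad_b2 = d_out.mean()
    d_h = np.outer(d_out, w2) * h * (1.0 - h)  # through the sigmoids
    grad_w1 = (d_h * x[:, None]).mean(axis=0)
    grad_b1 = d_h.mean(axis=0)

    w2 -= lr * grad_w2; b2 -= lr * grad_b2
    w1 -= lr * grad_w1; b1 -= lr * grad_b1

# Final forward pass, hard -1/+1 decision ("logical" output), and accuracy.
h = sigmoid(np.outer(x, w1) + b1)
out = np.tanh(h @ w2 + b2)
pred = np.where(out >= 0.0, 1.0, -1.0)
print("training accuracy:", (pred == y).mean())
```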
 
joo:
Then the MLP is something like 1-N-1, where 1 is the input, N is the number of neurons in the hidden layer (an S-shaped activation function), and 1 is the output, -1 or 1 (a logical, i.e. step, activation function). N will depend on how much detail the network needs in order to describe the desired function.

I'm tired of poking around. I have tried it this way, that way, and the other way.

I tried N from 3 to 140.

An output layer with both 1 and 2 neurons.

And so on and so on.

I'll try to post the results now....

--------------

I like to do everything myself, but here I feel I have to pass. I don't know what I'm doing.

Although there is a solution (a linear one), it is even coded up, and it works.

 

I wonder... Is the MQL language capable of this?

http://rutube.ru/tracks/3140465.html?v=d80d4eebf754c9fcfa2116bc496b083a

 
lasso:
......

I'll try to post the results now....

.......

So where are the results?
 
lasso:

Although there is a solution (a linear one), it is even coded up, and it works.

For reference: if something can be done without using a NN, it is better not to drag the NN into it. You will get tired of training it. I have run into this more than once, and the literature says the same. But if you understand the essence of the linear solution and preprocess the data well in light of that essence, then the NN may work in accordance with the linear solution, allowing for the errors peculiar to NNs.....
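To make the "essence of the linear solution" concrete (a hypothetical illustration of mine, not code from the thread): with a single input, a linear classifier boils down to a threshold and a sign, which can even be found by brute force on the same kind of -1/+1 data.

```python
# Hypothetical illustration: a "linear" decision on 1-D data is just a
# threshold plus a sign. Data and names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-1.0, 1.0, 200), rng.normal(1.0, 1.0, 200)])
y = np.concatenate([-np.ones(200), np.ones(200)])

best_thr, best_acc = None, 0.0
for thr in np.unique(x):                     # try every candidate split point
    for sign in (1.0, -1.0):                 # and both orientations
        acc = (np.where(sign * (x - thr) >= 0, 1.0, -1.0) == y).mean()
        if acc > best_acc:
            best_thr, best_acc = thr, acc

print(f"best threshold {best_thr:.3f}, accuracy {best_acc:.3f}")
# Preprocessing the data so this threshold structure is explicit is what
# lets a NN reproduce the linear solution, give or take NN-specific error.
```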
 
nikelodeon:
For reference: if something can be done without using a NN, it is better not to drag the NN into it. You will get tired of training it. I have run into this more than once, and the literature says the same. But if you understand the essence of the linear solution and preprocess the data well in light of that essence, then the NN may work in accordance with the linear solution, allowing for the errors peculiar to NNs.....

I did not understand what "work in accordance with the linear solution" means, but in general, I agree. :)

But apparently lasso is training his biological network to work with artificial ones in this way.

 
Wrong. Don't bother......