Machine learning in trading: theory, models, practice and algo-trading - page 2302

 
Aleksey Mavrin:

x2=0

At x2 = DBL_MIN, Y will not be infinite, just very large.

And if you step outside machine numbers, you can decrease x2 further, which will increase Y further, and so on. The race between the infinitely small and the infinitely large is itself infinite. )
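The point about DBL_MIN can be checked directly. A minimal sketch in Python, where `sys.float_info.min` plays the role of C's DBL_MIN (the smallest positive normal double):

```python
import math
import sys

# Smallest positive normal double (DBL_MIN in C terms).
x2 = sys.float_info.min          # about 2.2250738585072014e-308
y = 1.0 / x2                     # very large, but still finite
print(y, math.isinf(y))          # ~4.49e+307, False

# Below DBL_MIN there are still subnormal doubles; dividing by the
# smallest of them finally overflows to infinity.
x2_sub = 5e-324                  # smallest positive subnormal double
print(1.0 / x2_sub)              # inf
```

So within machine numbers the "race" does end: 1/DBL_MIN is huge but finite, and only past the subnormal range does the quotient overflow to infinity.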

 
elibrarius:
Where is y = x1/x2 discontinuous?

Read up on what a discrete function is and what a continuous function is, and stop talking nonsense.

If you want to argue rather than understand, then train a network with one hidden layer on the function y = x1/x2.

And then explain why the network didn't approximate "y", given that it's all so continuous according to your beliefs.


It's written that it can:

According to Cybenko's theorem, a network with a single hidden layer is capable of approximating
any continuous multivariate function to any desired degree of accuracy

and it can't...

Explain why?

 
mytarmailS:

Read up on what a discrete function is and what a continuous function is, and stop talking nonsense.

If you want to argue rather than understand, then train a network with one hidden layer on this function: y = x1/x2

And then explain why the network didn't approximate "y", given that it's all so continuous according to your beliefs


It's written that it can:

and it can't...

Explain why?

The function y = x1/x2 is continuous everywhere except at the point x2=0. And even that can be argued, because the point can be approached indefinitely.
 
elibrarius:
The function y = x1/x2 is continuous. Except at the point x2=0. And even then you can argue.

What if it's close enough to zero?

Or, alternatively, split the task in two: x2 less than zero and x2 greater than zero.

Then the problem is solvable.
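The suggestion above, keeping x2 bounded away from zero, can be tried directly. Below is a minimal sketch, not anyone's actual experiment from the thread: a pure-NumPy network with a single tanh hidden layer (the Cybenko setting) trained with Adam on y = x1/x2 with x2 drawn from [0.5, 2]; all hyperparameters are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: x2 is kept in [0.5, 2], bounded away from zero,
# so y = x1/x2 is continuous and bounded on the whole domain.
N = 1000
x1 = rng.uniform(-1.0, 1.0, N)
x2 = rng.uniform(0.5, 2.0, N)
X = np.column_stack([x1, x2])
y = (x1 / x2).reshape(-1, 1)

# One hidden tanh layer plus a linear output unit.
H = 32
W1 = rng.normal(0.0, 1.0, (2, H))
b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1))
b2 = np.zeros(1)

# Full-batch Adam on the MSE loss (hyperparameters are arbitrary).
params = [W1, b1, W2, b2]
m = [np.zeros_like(p) for p in params]
v = [np.zeros_like(p) for p in params]
lr, beta1, beta2, eps = 0.01, 0.9, 0.999, 1e-8

for t in range(1, 3001):
    h = np.tanh(X @ W1 + b1)          # forward pass
    err = (h @ W2 + b2) - y
    d_out = 2.0 * err / N             # gradient of the MSE loss
    dW2, db2 = h.T @ d_out, d_out.sum(0)
    dh = (d_out @ W2.T) * (1.0 - h**2)
    dW1, db1 = X.T @ dh, dh.sum(0)
    for i, g in enumerate([dW1, db1, dW2, db2]):
        m[i] = beta1 * m[i] + (1 - beta1) * g
        v[i] = beta2 * v[i] + (1 - beta2) * g * g
        m_hat = m[i] / (1 - beta1**t)
        v_hat = v[i] / (1 - beta2**t)
        params[i] -= lr * m_hat / (np.sqrt(v_hat) + eps)

mse = float(np.mean(((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2))
print(f"train MSE with x2 bounded away from 0: {mse:.5f}")
```

On this restricted domain the function is continuous, so the theorem applies and the fit converges; the same network would have no hope of matching the blow-up near x2 = 0 if samples with x2 on both sides of zero were included.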
 
Renat Akhtyamov:
What if it's close enough to zero?

It can be approached indefinitely.

 
elibrarius:
The function y = x1/x2 is continuous. Except at the point x2=0. And even then you can argue.

So why isn't the network learning?

it is written

According to Cybenko's theorem, a network with a single hidden layer is capable of approximating
any continuous multivariate function to any desired degree of accuracy


elibrarius:
Yes, and it can be argued, because it can be approached indefinitely.

How is it infinite? The error has reached zero and there's nowhere further to go.


Okay, I see you're way off base... You're talking nonsense...

 
mytarmailS:

So why isn't the network learning?

it is written.


How is it infinite?? The error has reached zero and there is nowhere to go...


Okay I see you're way off base... You're talking nonsense...

drinking again?

 
Maxim Dmitrievsky:

drinking again?

I quit a month ago, even New Year's Eve dry...

The last joy I had in my life was .... life is a pain))

 
mytarmailS:

I quit a month ago, even New Year's Eve dry...

The last joy I had in my life, and then .... life is pain ))

it can approximate discontinuous functions too, but no longer to arbitrary accuracy

 
Maxim Dmitrievsky:

it can approximate discontinuous functions too, but no longer to arbitrary accuracy

No one's arguing, that's not what we're talking about...

counter question: drinking again?

:)
