Machine learning in trading: theory, models, practice and algo-trading - page 3651

 
Maxim Dmitrievsky #:
A deal with a blabbermouth? I'd rather make a deal with the devil.
You still have to earn a deal.

And I thought you were the blabbermouth who can't solve a simple problem. You'd better take the deal, because everyone can make mistakes, and so can you.

What's your problem? I'll show you the solution and the code. It'll teach you something, which means you'll still benefit from the deal.

 
Andrey Dik #:

And I thought you were the blabbermouth who can't solve a simple problem. You'd better take the deal, because everyone can make mistakes, and so can you.

What's your problem? I'll show you the solution and the code. It'll teach you something, which means you'll still benefit from the deal.

I know the solution and the code, but it won't be a direct solution to the problem. Which means it's off-topic.
 
Andrey Dik #:

Yes, it will.

Let's make a deal. I will show the open source code right here - not in some Python, but the full source code in MQL5. After that, if I solve the problem, you will never cross my path again. Granted, in that case you will turn out to be the blabbermouth. But it is still a good deal for you, because otherwise it turns out that you cannot solve the simplest task.

Are you going to refuse the deal or accept it?

"Oh, brother, they're crooks." [Carlson]


 
Maxim Dmitrievsky #:
I know the solution and the code, but it won't be a direct solution to the problem.

What do you mean, "it will not be a direct solution to the problem"?

I will show a direct solution to the problem: training a neural network so that, after training, it can produce a sequential series of natural numbers of unlimited length.
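
A minimal sketch, in MQL5 since that is the language proposed above, of one plausible reading of this task: after training, the network must continue the series 1, 2, 3, ... indefinitely by feeding each output back in as the next input. Model() here is a hypothetical placeholder, not the promised network; nothing in this sketch shows how, or whether, such a model can actually be trained.

//--- One reading of the stated task (an assumption, not the code promised above):
//--- after training, the model must continue the series of natural numbers
//--- by feeding each of its outputs back in as the next input.
double Model(const double n)
  {
   return(n + 1.0);              // hypothetical placeholder for a trained network
  }

void OnStart()
  {
   double x = 1.0;               // start from the first natural number
   for(int i = 0; i < 100; i++)  // "unlimited length" in principle; 100 terms shown
     {
      Print(x);                  // 1, 2, 3, ...
      x = Model(x);              // recursive generation: output becomes next input
     }
  }

The placeholder only makes the shape of the claim explicit: whatever replaces Model() has to keep returning n + 1 for inputs it never saw during training.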

 
I didn't get what the perks of the deal are.
 
Andrey Dik #:

What do you mean, "it will not be a direct solution to the problem"?

I will show a direct solution to the problem: training a neural network so that, after training, it can produce a sequential series of natural numbers of unlimited length.

I've already solved the problem that was impossible for you. Even though it's a simple task.

If you are going to remove the activation functions or resort to other cheating, that is not a solution to the problem. It cannot be solved in the standard way.
 
Maxim Dmitrievsky #:
I didn't get what the perks of the deal are.

I told you clearly: it will be a lesson for you.

If you don't agree, you'll remain an ignoramus and be left in a very ugly light.

If you agree, you'll get knowledge and no more arguments between us from now on.

 
Andrey Dik #:

I told you clearly: it will be a lesson for you.

If you don't agree, you'll remain an ignoramus and be left in a very ugly light.

If you agree, you'll get knowledge and no more arguments between us from now on.

Then we'll be on the same level? ))
I'm not arguing with anyone; I'm discussing specific points.
If that bothers you, post proofs and we'll discuss them )))
 
Maxim Dmitrievsky #:
1. I've already solved the problem that was impossible for you. Even though it's a simple task.

2. If you are going to remove the activation functions or resort to other cheating, that is not a solution to the problem. It cannot be solved in the standard way.

1. If you solved it, then the problem is solvable; if it is unsolvable, then you didn't solve it. You haven't solved the problem that needed solving - forget it, don't bring it up.

2. No, everything is fair: a real neural network with sigmoid activation functions. Standard NN training, no tricks, and the code will be open.
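
While waiting for the promised code, here is a minimal sketch of the kind of network being argued about, assuming a 1-3-1 architecture: one input, a sigmoid hidden layer of three neurons, a linear output, trained by plain gradient descent on the pairs (n, n+1) for n = 1..10 and then evaluated up to n = 20. All names and hyperparameters are illustrative choices, not anything posted in the thread, and no claim is made here about how well it extrapolates beyond the training range - that is exactly the point in dispute.

//--- Minimal sketch (assumptions: 1-3-1 net, sigmoid hidden layer, linear output,
//--- plain gradient descent on n -> n+1). Not the code promised in the thread.
double Sigmoid(const double z) { return(1.0 / (1.0 + MathExp(-z))); }

void OnStart()
  {
   MathSrand(42);                              // fixed seed for reproducibility
   double w1[3];                               // hidden-layer weights
   double b1[3];                               // hidden-layer biases
   double w2[3];                               // output-layer weights
   double b2 = 0.0;                            // output-layer bias
   for(int j = 0; j < 3; j++)
     {
      w1[j] = MathRand() / 32767.0 - 0.5;      // small random initial weights
      b1[j] = 0.0;
      w2[j] = MathRand() / 32767.0 - 0.5;
     }

   const double lr = 0.001;                    // learning rate (illustrative value)
   for(int epoch = 0; epoch < 100000; epoch++)
     {
      for(int n = 1; n <= 10; n++)             // training range: n = 1..10 only
        {
         double x = n;                         // input
         double t = n + 1;                     // target: the next natural number
         double h[3];
         double y = b2;
         for(int j = 0; j < 3; j++)            // forward pass
           {
            h[j] = Sigmoid(w1[j] * x + b1[j]);
            y += w2[j] * h[j];
           }
         double e = y - t;                     // output error
         b2 -= lr * e;                         // gradient step, output bias
         for(int j = 0; j < 3; j++)
           {
            double g = e * w2[j] * h[j] * (1.0 - h[j]);  // hidden-layer gradient
            w2[j] -= lr * e * h[j];            // gradient step, output weights
            w1[j] -= lr * g * x;               // gradient step, hidden weights
            b1[j] -= lr * g;
           }
        }
     }

   for(int n = 1; n <= 20; n++)                // n > 10 lies outside the training range
     {
      double y = b2;
      for(int j = 0; j < 3; j++)
         y += w2[j] * Sigmoid(w1[j] * n + b1[j]);
      PrintFormat("n=%d  prediction=%.3f  target=%d", n, y, n + 1);
     }
  }

The evaluation loop deliberately runs past n = 10 so the reader can compare behaviour inside and outside the training range for themselves.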

 
Andrey Dik #:

1. If you solved it, then the problem is solvable; if it is unsolvable, then you didn't solve it. You haven't solved the problem that needed solving - forget it, don't bring it up.

2. No, everything is fair: a real neural network with sigmoid activation functions. Standard NN training, no tricks, and the code will be open.

I've solved your impossible problem. And it wasn't posed correctly. I had to guess and waste time.

Give me the code - it's your chance to redeem yourself.

You're not in a position to dictate terms to me now :)