Discussion of article "Neural networks made easy (Part 5): Multithreaded calculations in OpenCL" - page 3

 
Dmitriy Gizlyk:

Using vector operations allows you to perform the product of 4 elements in parallel rather than sequentially. Watch the video https://ru.coursera.org/lecture/parallelnoye-programmirovaniye/4-1-chto-takoie-viektorizatsiia-i-zachiem-ona-nuzhna-f8lh3. It is about OpenMP, but the meaning is the same.

Thanks!

So vectorisation is done automatically by the compiler for OpenCL? I don't see any special commands in the code.

 
Aleksey Vyazmikin:

Thank you!

So vectorisation is done automatically by the compiler for OpenCL? I don't see any special commands in the code.

No, there is no automatic vectorisation. In the code we first declare two vector variables:

double4 inp, weight;

Then we write a portion of data into vector variables from the incoming buffers.

         default:
           inp=(double4)(matrix_i[k],matrix_i[k+1],matrix_i[k+2],matrix_i[k+3]);
           weight=(double4)(matrix_w[shift+k],matrix_w[shift+k+1],matrix_w[shift+k+2],matrix_w[shift+k+3]);
           break;

From then on, any operation on the vector variables is a vector operation. The dot function used in the code works only on vectors.

      sum+=dot(inp,weight);

Thus, we have parallelised the multiplication, not at the thread level, but at the level of an individual operation, using vector calculations.

 
Dmitriy Gizlyk:

No, there is no automatic vectorisation. In the code we first declare two vector variables.

Then we write a portion of data into vector variables from the incoming buffers.

From then on, any operation on the vector variables is a vector operation. The dot function used in the code works only on vectors.

Thus, we have parallelised the multiplication, not at the thread level, but at the level of an individual operation, using vector calculations.

I.e., a special dot() function is responsible for vectorisation? Can this vectorisation be done in MQL5 without OpenCL?

 
Aleksey Vyazmikin:

I.e., a special dot() function is responsible for vectorisation? Can this vectorisation be done in MQL5 without OpenCL?

dot performs only the scalar (dot) product. There are other functions for other operations.
There are no vector variables in MQL.

 
Dmitriy Gizlyk:

dot performs only the scalar (dot) product. There are other functions for other operations.
There are no vector variables in MQL.

Got it, thanks for the clarification.

 

Despite how much the article is needed, and with gratitude to the author for writing it, let me summarise what it is really missing:

1. The code of the main MQL program is not covered, so the trading principle itself is not clear.

2. It is not clear where the kernels are called from.

3. The code references third-party libraries that are not described in the article, so it is unclear what they do.

4. The kernels themselves are not examined and explained in the article, which is why they caused such a heated discussion on the forum.

5. The kernel itself is written in a way that is very difficult for most users not experienced in OpenCL.

6. The neural network training method is not explained at all.

These 6 points make the article practically useless for the absolute majority.

I will keep my opinion:

1. You need to create the MQL code of a very simple Expert Advisor trading on one simple indicator, describe its MQL version, write the OpenCL version, and compare the speed; ideally, take a ready-made example like Moving Average.mq5.

2. Calls to functions from all third-party libraries should be described.

3. Describe the kernels in detail, line by line; describe several kernel variants and compare their performance.

4. Describe in detail the methodology of neural network training.

well, that's how it is ....

MQL5 Documentation: Language Basics / Functions / Function Call — www.mql5.com
 
I would like to get to know neural networks better and study them in depth. But where to start? Please recommend something relevant, if it's not too much trouble.
 
I'm also interested, but there are so many sources you could drown in them. Where to start?
 
What is the "NeuroNet - copy.mqh" file in the archive?
 
Ivan Titov:
What is the "NeuroNet - copy.mqh" file in the archive?

I cleaned up the NeuroNet.mqh file a bit. And for compatibility with the Expert Advisors from the previous articles, I kept the old version as a copy.