6. File operations

We are confidently approaching the completion of work on the methods of the CNeuronBatchNorm batch normalization class. We have already built the class initialization methods and implemented the feed-forward and backpropagation algorithms using standard MQL5 capabilities. Let's move on to the file handling methods. We have discussed more than once how important it is to have these methods and to have them work correctly: their performance determines how quickly we can deploy a trained model into operational use.

We have already done similar work more than once for other classes in our library. Now we will follow the same established algorithm. First, we evaluate the need to write each element of the class to the data file. In the structure of our class, we have created only one new data buffer and one variable. Both of these elements are important for organizing correct object operations in our class. Therefore, we save both elements to a data file.

class CNeuronBatchNorm    :  public CNeuronBase
  {
protected:
   CBufferType       m_cBatchOptions;
   uint              m_iBatchSize;       // batch size
 
public:
                     CNeuronBatchNorm(void);
                    ~CNeuronBatchNorm(void);
   //---
   virtual bool      Init(const CLayerDescription *description) override;
   virtual bool      SetOpenCL(CMyOpenCL *opencl) override;
   virtual bool      FeedForward(CNeuronBase *prevLayer) override;
   virtual bool      CalcHiddenGradient(CNeuronBase *prevLayer) override;
   virtual bool      CalcDeltaWeights(CNeuronBase *prevLayer, bool read) override;
   //--- methods for working with files
   virtual bool      Save(const int file_handle) override;
   virtual bool      Load(const int file_handle) override;
   //--- object identification method
   virtual int       Type(void)  override   const {return defNeuronBatchNorm;}
  };

Having determined the scope of our work, we now proceed directly to creating the file handling methods for our class. As always, the first step is to create the CNeuronBatchNorm::Save method for writing data to the file. Like all the methods we have discussed so far, it is declared as a virtual method in the base neural layer class and is overridden in each new neural layer class so that all the information needed to later restore the saved object to correct operation is written to the file. In parameters, the method receives a handle of the file to write the data to.

bool CNeuronBatchNorm::Save(const int file_handle)
  {
//--- call the method of the parent class
   if(!CNeuronBase::Save(file_handle))
      return false;

The file handle received in parameters is not checked here because this control is already implemented in the same-named method of the parent class, which is called in the body of this method. We therefore only need to check the result of the parent class method.

Using the parent class method is very convenient, and it serves a dual purpose. The first purpose is control: the parent class already implements a set of checks that do not need to be duplicated in the new method, so we only call the method and verify its execution result. The second purpose is functional: the parent class method already saves all inherited objects and variables, so a single call stores them all. With one call, we accomplish two tasks, control and saving of inherited data, and checking the method's result confirms that both were performed correctly.

After successfully executing the parent class method, we understand that the handle to the file provided as a parameter is valid. Now we can proceed with further file operations without the risk of getting a critical error. First, we save the normalization batch size, which is stored in the m_iBatchSize variable. Also, we make sure to check the result of the operation.

//--- save the size of the normalization batch
   if(FileWriteInteger(file_handle, m_iBatchSize) <= 0)
      return false;
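
Note that FileWriteInteger stores the value as an INT_VALUE (4 bytes) by default and returns the number of bytes written. As a sketch, the same guarded write can be stated with the size made explicit; the cast only reflects the fact that m_iBatchSize is declared as uint:

```mql5
//--- write the batch size as a 4-byte integer; FileWriteInteger returns
//--- the number of bytes written, so a value <= 0 indicates a failure
   if(FileWriteInteger(file_handle, (int)m_iBatchSize, INT_VALUE) <= 0)
      return false;
```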

At the end of the method, we save the buffer of m_cBatchOptions normalization parameters. To do this, we just call the corresponding method of the specified object and check its operation result.

//--- save normalization settings
   if(!m_cBatchOptions.Save(file_handle))
      return false;
//---
   return true;
  }

As you can see, by using parent class methods and internal objects, we have described the method for saving all the necessary information easily and quite concisely. The main data saving controls and operations are hidden in these methods.
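
For illustration, saving the layer from outside could look like the following sketch. The file name and the way the layer object is obtained are assumptions made for the example; in the library, such calls are normally issued by the model object that owns the layer.

```mql5
//--- hypothetical usage sketch: persist a trained layer to a binary file
//--- 'layer' is assumed to be a valid CNeuronBatchNorm object
   int file_handle = FileOpen("batchnorm.bin", FILE_WRITE | FILE_BIN);
   if(file_handle == INVALID_HANDLE)
      return false;
//--- the layer writes its parent class data, batch size, and parameter buffer
   bool result = layer.Save(file_handle);
   FileClose(file_handle);
```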

Similarly, let's create the CNeuronBatchNorm::Load method for loading data from a file. It should be noted that this method is responsible not only for reading data from a file but also for fully restoring the object's functionality to the state it had at the moment of saving. Therefore, this method should include operations for creating instances of the objects required for the correct functioning of our batch normalization class. In addition, we must initialize all unsaved objects and variables with their initial values.

In the parameters, the CNeuronBatchNorm::Load method, like the previous data saving method, receives the handle of the file with the saved data. We have to organize the reading of data from the file in strict accordance with the sequence of their writing to the file. This time, in the body of the method, we immediately call the method of the parent class. The calculation here is the same: by calling the parent class method once, we immediately execute the entire functionality with inherited objects and variables. At the same time, we only need to check the result of the parent class method once to ensure the correctness of all its operations.

bool CNeuronBatchNorm::Load(const int file_handle)
  {
//--- call the method of the parent class
   if(!CNeuronBase::Load(file_handle))
      return false;

After the successful execution of the parent class method, we move on to loading the data of the objects of the batch normalization class. According to the sequence in which the data is written to the file, we first read the size of the normalization batch.

   m_iBatchSize = (uint)FileReadInteger(file_handle);

Finally, it remains to load the buffer data of the m_cBatchOptions normalization parameters.

//--- initialize a dynamic array of optimization parameters
   if(!m_cBatchOptions.Load(file_handle))
      return false;
//---
   return true;
  }

After successfully loading all the data, we will conclude the method execution with a positive result.
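
A mirrored sketch for restoring the layer is shown below. As above, the file name and the layer object are assumptions for the example; the file must be opened with the same FILE_BIN flag, and the data is read in exactly the order it was written.

```mql5
//--- hypothetical usage sketch: restore the layer from the saved file
//--- 'layer' is assumed to be a valid CNeuronBatchNorm object
   int file_handle = FileOpen("batchnorm.bin", FILE_READ | FILE_BIN);
   if(file_handle == INVALID_HANDLE)
      return false;
//--- Load reads the data in the same order Save wrote it
   bool result = layer.Load(file_handle);
   FileClose(file_handle);
```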

We have finished creating a batch data normalization layer using standard MQL5 tools. To complete the work on the CNeuronBatchNorm class, we need to supplement its functionality with the ability to perform multi-threaded mathematical operations using OpenCL. We'll do that in the next section. But now we have the opportunity to conduct the first tests.