Working with files from multiple charts - page 2

 
roshjardine:
Create a per-order CSV file with a similar data structure, have a directory watcher service that combines these CSV files into one, and read the target file programmatically. This way you could avoid a file-write lock, in case there is one. Hope this helps.

Although I didn't thoroughly get what you mentioned, I really appreciate your kind guidance and your time.

By the way, I think I should first explain a bit about what I've done: I've introduced a single CSV file that is meant to collect the deleted orders' data. Any chart the EA is working on would first open the mentioned file and check whether the order was previously written; if it was not recorded there, the EA writes the deleted order's data to the CSV file. As the EAs from different charts are going to access the same single file, I used FILE_READ|FILE_WRITE|FILE_CSV|FILE_SHARE_READ|FILE_SHARE_WRITE as the file-opening flags.

Now my problem is that, after working properly for a while, the EA starts to show error #5001 and error #5004. This is the problem I cannot find any solution for.
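As an aside for other readers, the check-then-write scheme described above might look roughly like the MQL5 sketch below. The function name record_deleted_order, the file name DeletedOrders.csv, and the two-column (ticket; profit) layout are illustrative assumptions, not the actual code from this thread:

```mql5
// Hedged sketch (MQL5): append a deleted order's data to a shared CSV
// only if its ticket is not already recorded there.
// File name and column layout are assumptions for illustration.
bool record_deleted_order(ulong ticket,double profit)
  {
   int handle=FileOpen("DeletedOrders.csv",
                       FILE_READ|FILE_WRITE|FILE_CSV|FILE_SHARE_READ|FILE_SHARE_WRITE,
                       ';');
   if(handle==INVALID_HANDLE)
      return(false);

   bool found=false;
   while(!FileIsEnding(handle))                    // scan existing rows for the ticket
     {
      ulong stored=(ulong)StringToInteger(FileReadString(handle));
      FileReadString(handle);                      // skip the second column of the row
      if(stored==ticket) { found=true; break; }
     }

   if(!found)                                      // not recorded yet: append at the end
     {
      FileSeek(handle,0,SEEK_END);
      FileWrite(handle,(string)ticket,DoubleToString(profit,2));
     }

   FileClose(handle);                              // release the handle promptly
   return(true);
  }
```

Closing the handle as soon as the write is done matters here: with many charts each holding handles open, the terminal's open-file limit is reached quickly, which is exactly what error 5001 reports.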

 
parham.trader:

Although I didn't thoroughly get what you mentioned, I really appreciate your kind guidance and your time.

By the way, I think I should first explain a bit about what I've done: I've introduced a single CSV file that is meant to collect the deleted orders' data. Any chart the EA is working on would first open the mentioned file and check whether the order was previously written; if it was not recorded there, the EA writes the deleted order's data to the CSV file. As the EAs from different charts are going to access the same single file, I used FILE_READ|FILE_WRITE|FILE_CSV|FILE_SHARE_READ|FILE_SHARE_WRITE as the file-opening flags.

Now my problem is that, after working properly for a while, the EA starts to show error #5001 and error #5004. This is the problem I cannot find any solution for.

For the sake of testing: if you make a hard link to the CSV file, one for reading and the other for writing, does the error still show? Error 5001 means too many open files.
 

I've found a solution to the problem.

I just made changes to "file_open_func" and used a FOR loop when opening the file, so that if the error is #5001 or #5004 it retries up to 500 times until the file opens successfully. The code of the above-mentioned change is provided below:

//+------------------------------------------------------------------+
//|                         file_open_func                           |
//+------------------------------------------------------------------+
int file_open_func(string file_name,int file_open_flags,ENUM_FILE_POSITION pointer_position_origin=SEEK_END,long pointer_position_shift=0,short delimiter=';')
{
   int
   file_handle=INVALID_HANDLE,
   file_open_iteration,
   max_file_open_iteration=500;

   for(file_open_iteration=0; file_open_iteration<=max_file_open_iteration; file_open_iteration++)
      {
      ResetLastError();
      file_handle=FileOpen(file_name,file_open_flags,delimiter);
      // Retry only on error 5001 (ERR_TOO_MANY_FILES, too many open files) and
      // error 5004 (ERR_CANNOT_OPEN_FILE); any other error, or a valid handle,
      // ends the loop immediately.
      if(file_handle!=INVALID_HANDLE || (_LastError!=ERR_TOO_MANY_FILES && _LastError!=ERR_CANNOT_OPEN_FILE))
         break;
      }

   if(file_handle==INVALID_HANDLE)
      Print("File ",file_name," can not be opened due to error ",_LastError);
   else // The file is opened successfully and the pointer position in the file should be set
      FileSeek(file_handle,pointer_position_shift,pointer_position_origin);

   return(file_handle);
}
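For context, a call site for this helper might look like the sketch below. The flag set comes from the description earlier in the thread; the surrounding function, file name, and column layout are assumptions for illustration:

```mql5
// Hedged usage sketch (MQL5): open the shared CSV via the retrying helper,
// append one row, and close the handle as soon as possible.
void write_deleted_order_row(ulong ticket,double profit)
  {
   int handle=file_open_func("DeletedOrders.csv",
                             FILE_READ|FILE_WRITE|FILE_CSV|FILE_SHARE_READ|FILE_SHARE_WRITE,
                             SEEK_END,0,';');
   if(handle==INVALID_HANDLE)
      return;                         // the error was already printed by file_open_func

   FileWrite(handle,(string)ticket,DoubleToString(profit,2));
   FileClose(handle);                 // holding handles open is what triggers error 5001
  }
```

One design note: the retry loop spins up to 500 times without pausing, so if contention is frequent it may be worth adding a short Sleep() between attempts to avoid a busy wait (Sleep() is not available in indicators, only in EAs and scripts).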

The other change I've made is to the opening flags within "find_column_index_in_file_func", "find_row_index_in_file_func", and "find_value_in_file_func": I added the FILE_SHARE_WRITE flag to the previous ones, so the current flags in these functions are FILE_READ|FILE_CSV|FILE_SHARE_READ|FILE_SHARE_WRITE.

I'd appreciate it if you shared your opinion about this solution and let me know about its pros and cons.

Thank you for your kind guidance
