Scraping bond yield data from the web

11 April 2019, 12:48
Steven Brown

Introduction

Automated trading is based almost entirely on technical indicators that use past price action to forecast future price action. However, the trader who ignores fundamental forces that move markets is at a disadvantage to traders who factor fundamental data into their trading decisions. An indicator based on fundamental data gathered automatically can improve the performance of an Expert Advisor.

The fundamental data that have the most effect on exchange rates are interest rates, which affect the perceived value of currencies. While central bank rates are not volatile, the yields on government bonds, such as the U.S. 10-year treasury note, fluctuate on all time frames in global bond markets. Those yields reflect the expectation the market has as to where future central bank rates will go. Bond yields are often a leading indicator of interest rates and of exchange rates.

In the forex market, the metric that applies to a currency pair is the interest rate differential, especially the delta, or change in the interest rate differential, on various time frames. Figure 1 shows a case where movement of the interest rate differential, expressed in basis points, in the positive direction was a leading indicator of movement of the EUR/USD currency pair in the same direction. This article shows how to gather bond yield data from the web and to derive from that data expressions of interest rate differential and delta.



Figure 1. Interest rate differential indicator on hourly EUR/USD chart.

Scraping 101

A web page displayed in a browser typically consists of many elements: formatted text, graphics, images, sound, and video. All those elements reside in files on web servers and are downloaded sequentially by the browser, which uses specific addresses, or URLs, to access them. However, a program can download just one element of the page, ignoring the rest, because that element contains the useful information. Obtaining information that way is called "scraping." To do that, the scraper program has to have the URL of the file that contains the element, which may be a number displayed on a web page. It can download that file, search it for the text representing the number, and convert that text to a numerical value.

Get the URL

The first task in scraping is to obtain the URL of the file containing the element to be downloaded. That can be the URL of the web page itself, if the element is embedded in the html text of the page; in that case, the element can be parsed from the html text. Or the URL can be embedded in a link on the page, which the browser uses to fetch the element and which the scraper program can use the same way. Or the URL can be passed to the browser by a script linked to on the page, which the browser runs after downloading the page and the script. In that case, the scraper program does not have to run the script but can use the URL that the script generates. That URL can be discovered using the Developer Tools available in Internet Explorer or Google Chrome. Whatever the source of the URL, the scraper program uses it to download a file from the web server and parse it for the desired information. There are several financial websites that report bond yields. We shall look first at https://www.marketwatch.com/investing/bond/tmubmusd10y?countrycode=bx to create examples of scraping.

First, let's look at the html file that the browser downloads from the web server when the above link is clicked. With the page displayed in the Chrome browser, click the tools button in the upper-right corner, move the mouse cursor to "More tools," select "Save page as," download the html file, and open it in a text editor such as Notepad. It becomes apparent that this web site made it easy for bots to get the quote, because it is included in one of a series of meta tags in the head of the html file. Metadata is not displayed by browsers and has no effect on what is displayed, but it is accessible to any program that downloads the html file. The quote appears in the meta tag <meta name="price" content="3.066">, 28 characters from the beginning of the tag, and it is the same value that is prominently displayed on the page by the browser. The scraper program can search the file for the text string [<meta name="price" content=], add 28 to the index of the start of the meta tag, and convert the text at the resulting location to a floating-point number. To avoid confusion, square brackets are used in this article to quote html text which makes frequent use of quotation marks.

Build a better bot

The html file linked to above, when downloaded from the server, contains large blocks of style sheet information, and the total size is 295 kilobytes. However, the meta tag of interest is only 3 kilobytes from the beginning of the file. A well-behaved bot does not download more data than it needs, so it would be reasonable to download only the first 4 kilobytes every time it gets the quote. Unfortunately, the MQL5 function WebRequest() has no way to limit the amount of data it downloads. The array that receives the server response has to be a dynamic array; if a static array of a specified size is used, the program will compile, but an error occurs at runtime, crashing the terminal. A Range header could be included in the request to the server, but most servers do not support the Range header, so the amount of data downloaded is almost always the full size of the requested html file. A better way is to use the functions in wininet.dll, a component of Windows. One of those functions, InternetReadFile(), can download a specified number of bytes; even when the Range header is not supported, the download simply begins at the start of the file and stops after the requested amount. WinINet functions can be imported into an mql script or Expert Advisor. The file ScraperBot01.mq5 attached to this article is a script that downloads the first 4 kilobytes of the html file, locates the meta tag of interest in the downloaded text, finds the text in that tag that represents the last quoted yield on the 10-year T-note, converts that text to a floating-point number, and prints its value to the terminal.

ScraperBot 01

The source code file ScraperBot01.mq5 begins by importing wininet.dll and prototyping the functions that will be called, declaring all parameters as having types compatible with mql5. The WinINet functions are documented at https://docs.microsoft.com/en-us/windows/desktop/wininet/wininet-reference.

#import "wininet.dll"
  int InternetCheckConnectionW(string& lpszUrl, uint dwFlags, uint dwReserved);
  int InternetOpenW(string& lpszAgent, uint dwAccessType, string& lpszProxyName, string& lpszProxyBypass, uint dwFlags);
  int InternetOpenUrlW(int hInternetSession, string& lpszUrl, string& lpszHeaders, uint dwHeadersLength, uint dwFlags, uint dwContext);
  int InternetReadFile(int hFile, uchar& lpBuffer[], uint dwNumberOfBytesToRead, uint& lpdwNumberOfBytesRead);
  int InternetCloseHandle(int hInternet);
#import

uchar uc_Buffer[4096]; // InternetReadFile() expects a static buffer.
float f_US;

The static array uc_Buffer, which will receive the html text downloaded from the web server, and the variable f_US, which will be set to the numerical value parsed from the text, are declared at global scope. The convention in this and other files attached to this article is to denote global variables by means of an underscore between type specifier and name. The size of uc_Buffer is set to accommodate the specific number of bytes that will be downloaded.

A few local variables are declared at the top of OnStart(), and others are declared as needed, for clarity as to their purpose. First we check to see if an internet connection is available. Return values of functions called in this script are printed to the terminal, to indicate success or failure, and if an error occurs, open handles are closed and the script is terminated by the "return" statement. If an internet connection is available, the handle iNet1 is initialized for subsequent calls to WinINet functions. A valid handle value is greater than zero.

void OnStart() 
{ bool bResult;  int i, iNet1, iNet2;  

  string stURL = "http://www.msn.com"; 
  bResult = InternetCheckConnectionW(stURL, 1, 0); // 1 == FLAG_ICC_FORCE_CONNECTION
  Print("InternetCheckConnectionW() returned ", bResult);
  if(!bResult) return;
  
  string stAgent = "Mozilla/5.0", stNull = "";
  iNet1 = InternetOpenW(stAgent, // _In_ LPCTSTR lpszAgent 
                        1,       // 1 == INTERNET_OPEN_TYPE_DIRECT
                        stNull,  // _In_ LPCTSTR lpszProxyName
                        stNull,  // _In_ LPCTSTR lpszProxyBypass
                        NULL);   // _In_ DWORD dwFlags
  Print("iNet1 == ", iNet1);
  if(iNet1==0) return;


Next a connection to the web server is established, initializing the handle iNet2 to download the html file.

  stURL = "https://www.marketwatch.com/investing/bond/tmubmusd10y?countrycode=bx";
  string stHdr = "Accept: text/*";
  iNet2 = InternetOpenUrlW(iNet1,            // HINTERNET hInternet,
                           stURL,            // LPCWSTR   lpszUrl,
                           stHdr,            // LPCWSTR   lpszHeaders,
                           StringLen(stHdr), // DWORD     dwHeadersLength,
                           0x00080000,       // DWORD     dwFlags, 0x00080000 == INTERNET_FLAG_NO_COOKIES
                           NULL);            // DWORD_PTR dwContext
  Print("iNet2 == ", iNet2);
  if(iNet2==0) 
  { InternetCloseHandle(iNet1);
    return;
  }


Now we can download data from the web server.

  uint uGet, uGot;
  uGet = 4080; // number of bytes to download
  bResult = InternetReadFile(iNet2,     // _In_  HINTERNET hFile
                             uc_Buffer, // _Out_ LPVOID lpBuffer
                             uGet,      // _In_  DWORD dwNumberOfBytesToRead
                             uGot);     // _Out_ LPDWORD lpdwNumberOfBytesRead

  Print("InternetReadFile() returned ", bResult, ". Number of bytes read: ", uGot);
  InternetCloseHandle(iNet2);  // download done
  if(!bResult) {InternetCloseHandle(iNet1); return;}
  uc_Buffer[uGot] = 0; // Terminate string in uc_Buffer by appending a null character.


Now we search for the meta tag of interest in the downloaded text, and if it is found, add 28 to the offset so we can use that as the index in uc_Buffer to the text representing the number. That text is accessed by calling StringSubstr(), passing the index to it in the variable "i". If the text at that index does not represent a number, StringToDouble() returns zero, indicating an error, unless the bond yield happens to be exactly zero. Note that a quotation mark within a string is coded as \" to distinguish it from the quotation marks at the beginning and end of a string.

  i = StringFind(CharArrayToString(uc_Buffer), "<meta name=\"price\" content=", 0); // 0 == position from which search starts 
  Print("Offset of \'<meta name=\"price\" content=\' == ", i); 
  if(i == -1) {Print("String not found.");  InternetCloseHandle(iNet1);  return;} 
  i += 28; // Advance index to known location of text representing bond yield.
  f_US = StringToDouble(StringSubstr(CharArrayToString(uc_Buffer), i, 8));
  Print("US 10-year T-note yield, stored in variable f_US: ", f_US);
  InternetCloseHandle(iNet1); // Done with wininet.
}//END void OnStart()

The script ScraperBot01 can run on any chart, report its progress in the terminal and print the value scraped from the web.

An alternative

Now that the procedure to scrape data from a web page has been demonstrated, let's consider what to do if the bond yield data were not present in a meta tag. In that case, we would examine the html file of the web page for a source of the number that is displayed on the page. Knowing the latest quote of the bond yield, we can use the search feature of a text editor to find a string of text representing the number. In the html file that we downloaded, the quote can be found in three other locations besides the meta tag. The first of those three is a JSON-LD structured data snippet, an html element designed to make information about a web page easily accessible to search engines and web crawlers. Here is that data snippet, formatted with line breaks for clarity.

<script type="application/ld+json">
{ "@context":"http://schema.org/",
  "@type":"Intangible/FinancialQuote",
  "url":"https://www.marketwatch.com/investing/bond/tmubmusd10y?countrycode=bx",
  "name":"U.S. 10 Year Treasury Note",
  "tickerSymbol":"TMUBMUSD10Y",
  "exchange":"Tullett Prebon",
  "price":"3.061",
  "priceChange":"0.007",
  "priceChangePercent":"0.22%",
  "quoteTime":"Sep 28, 2018 5:07 p.m.",
  "priceCurrency":"PERCENT"
}
</script>

The algorithm will first search for the offset of the tag <script type="application/ld+json">, and from that location find the offsets of ["price":"] and of </script>. If the offset of ["price":"] is less than the offset of </script>, indicating it is within the data snippet, we add 9 to the offset of ["price":"] to arrive at the offset of the number representing the quote. That procedure is demonstrated in the attached script ScraperBot02.mq5.

ScraperBot 02

This script downloads the html file up to the size specified by uMax. The first time it is run, uMax should be set to a value that is much greater than the expected size of the download, such as 1 million. The script reports the number of bytes downloaded, and if that is at or near uMax, the value of uMax should be increased. The script also reports the offset in the file of the tag <script type="application/ld+json">. The value of uMax can then be set somewhat higher than the offset of the tag. In this case, the offset turns out to be 166696, so uMax is set to 180224 to download enough of the file to include the JSON-LD snippet but not the entire file. The script uses a static array to download chunks of 16 kilobytes which are copied to and accumulated in a dynamic array. These arrays are declared with global scope.

uchar uc_Buffer[16400], uc_DynBuf[];

ScraperBot02 is the same as ScraperBot01 up to the part where data is downloaded from the web server. The server parcels out the data in chunks, so InternetReadFile() is called repeatedly in a do-while loop until the desired amount of data has been downloaded.

  uint uGet, uGot, uDst, uMax;
  uGet = 16384;    // number of bytes to download per call to InternetReadFile, must be at least 1 byte less than size of uc_Buffer
  uGot = uDst = 0; // uGot is number of bytes downloaded in call to InternetReadFile; uDst is total number of bytes downloaded.
  uMax = 180224;   // maximum number of bytes to download.

  do
  { bResult = InternetReadFile(iNet2,     // _In_  HINTERNET hFile
                               uc_Buffer, // _Out_ LPVOID lpBuffer
                               uGet,      // _In_  DWORD dwNumberOfBytesToRead
                               uGot);     // _Out_ LPDWORD lpdwNumberOfBytesRead

    uc_Buffer[uGot] = 0; // Terminate string in uc_Buffer by appending a null character.

    ArrayCopy(uc_DynBuf, // destination array 
              uc_Buffer, // source array 
              uDst,      // index at which writing to destination array begins 
              0,         // index at which copying from source array begins 
              uGot);     // number of elements to copy 
    uDst += uGot; // advance index of destination array for next pass in loop
  }while(bResult && uGot > 0 && uDst < uMax);
 
  Print("Size of uc_DynBuf == ", ArraySize(uc_DynBuf));
  Print("Bytes downloaded  == ", uDst);

ScraperBot02 now locates the tag <script type="application/ld+json"> and stores the offset in the index variable i. Starting from that offset, it locates ["price":"] and stores that offset in j. Then it locates </script> at the end of the snippet and stores that offset in k. If j is less than k, 9 is added to j, making j the offset of the text representing the number; that text is then converted to a floating-point value, stored in f_US, and printed to the terminal.

  int i, j, k; // indexes

  i = StringFind(CharArrayToString(uc_DynBuf), "<script type=\"application/ld+json\">", 0); // 0 == position from which search starts 
  Print("Offset of <script type=\"application/ld+json\"> == ", i); 
  if(i == -1) {Print("<script type=\"application/ld+json\"> not found.");  InternetCloseHandle(iNet1);  return;}

  j = StringFind(CharArrayToString(uc_DynBuf), "\"price\":\"", i); // i == position from which search starts 
  if(j == -1) {Print("\"price\":\" not found.");  InternetCloseHandle(iNet1);  return;}
  Print("Offset of \"price\":\" == ", j); 

  k = StringFind(CharArrayToString(uc_DynBuf), "</script>", i); // i == position from which search starts
  Print("Offset of </script> == ", k); 
  if(j > k) {Print("Offset of \"price\":\" is greater than offset of </script>");  InternetCloseHandle(iNet1);  return;}

  j += 9; // Advance index to known location of text representing bond yield.
  f_US = StringToDouble(StringSubstr(CharArrayToString(uc_DynBuf), j, 8));
  Print("US 10-year T-note yield, stored in variable f_US: ", f_US);
  InternetCloseHandle(iNet1); // Done with wininet.
}//END void OnStart()

Using developer tools

Another source of the quote from that web server can be found using developer tools in the Chrome browser. From the menu button in the upper-right corner, open the developer tools and enter https://www.marketwatch.com/investing/bond/tmubmusd10y?countrycode=bx into the address bar. Select the Network tab in the center pane and select "XHR" as the type of events to monitor. Selecting any one of those events opens a pane on the right, showing details such as headers and response. The event labeled "quoteByDialect..." has an interesting response, which can be highlighted by right-clicking in that pane and selecting "Select all." Press Ctrl+C to copy the highlighted text to the clipboard, then paste it into a text editor.

The quote can be found in the block of text after the string ["CompositeTrading":{"Last":{"Price":{"Iso":"PERCENT","Value":]. The URL to fetch that block of text can be found under the Headers tab. In this case it is rather long: https://api.wsj.net/api/dylan/quotes/v2/comp/quoteByDialect?dialect=official&needed=CompositeTrading|BluegrassChannels&MaxInstrumentMatches=1&accept=application/json&EntitlementToken=cecc4267a0194af89ca343805a3e57af&ckey=cecc4267a0&dialects=Charting&id=Bond-BX-TMUBMUSD10Y,Bond-BX-TMBMKDE-10Y. Clicking that link right from this article displays the block of text in a browser window.

It is actually two blocks of text, one after the other, because there are two ticker symbol strings, separated by a comma, at the end of the URL. The first, "Bond-BX-TMUBMUSD10Y", is for the US 10-year treasury note, and the second, "Bond-BX-TMBMKDE-10Y", is for the German 10-year government bond. Deleting the second ticker symbol string from the URL reduces the size of the downloaded text from 7.1 kilobytes to 3.6 kilobytes.

The attached script ScraperBot03 downloads the block of text for the ticker symbol "TMUBMUSD10Y", locates the string ["CompositeTrading":{"Last":{"Price":{"Iso":"PERCENT","Value":], adds 61 to the offset of the start of the string, uses that as the index to the text representing the number, converts the text to a floating-point number and prints it to the terminal. The script is modeled after ScraperBot01, so the code is not quoted here. An advantage of this resource is the small size of the download. As the file addressed by the URL is only 3.6 kilobytes, the mql5 function WebRequest, instead of functions in wininet.dll, could be used to download it, without downloading far more data than needed.
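The parsing step of ScraperBot03 follows the same pattern as ScraperBot01, only with the new search string, whose length of 61 characters becomes the offset to add. A sketch of that step, reusing the uc_Buffer array and f_US variable declared earlier:

```mql5
// Sketch of the parsing step: same pattern as ScraperBot01, with the new
// search string. Its length, 61 characters, is the offset added to reach the quote.
  string stText = CharArrayToString(uc_Buffer);
  int i = StringFind(stText, "\"CompositeTrading\":{\"Last\":{\"Price\":{\"Iso\":\"PERCENT\",\"Value\":", 0);
  if(i != -1)
  { i += 61; // advance index past the search string to the text representing the yield
    f_US = StringToDouble(StringSubstr(stText, i, 8));
    Print("US 10-year T-note yield: ", f_US);
  }
```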

ScraperBot 04

This script downloads the same data as ScraperBot 03, utilizing WebRequest instead of WinINet functions. In order for WebRequest to work, the base URL for the server, in this case "https://api.wsj.net", needs to be included in the list of allowed servers under "Tools\Options\Expert Advisors" in the MetaTrader platform. The global char array ch_Data does not pass any data to WebRequest and exists only to satisfy the requirement for a parameter of that type.

char ch_Buffer[], ch_Data[16];
float f_US;

void OnStart() 
{ int i;   
  string stURL = "https://api.wsj.net/api/dylan/quotes/v2/comp/quoteByDialect?dialect=official&needed=CompositeTrading|BluegrassChannels&"
                 "MaxInstrumentMatches=1&accept=application/json&EntitlementToken=cecc4267a0194af89ca343805a3e57af&ckey=cecc4267a0&"
                 "dialects=Charting&id=Bond-BX-TMUBMUSD10Y";
  string stHdr = "Accept: text/*\r\nUser-Agent: Mozilla/5.0"; // multiple headers are separated by CR-LF
  string stRspHdr; // response header
  
  i = WebRequest("GET",     // const string  method,    HTTP method 
                 stURL,     // const string  url,       URL 
                 stHdr,     // const string  headers,  
                 1024,      // int           timeout, 
                 ch_Data,   // const char    &data[],   the array of the HTTP message body 
                 ch_Buffer, // char          &result[], an array containing server response data 
                 stRspHdr); // string        &result_headers 

  Print("Server response code: ", i);
  if(i == -1) {Print("GetLastError == ", GetLastError());  return;}
  Print("Size of ch_Buffer (bytes downloaded) == ", ArraySize(ch_Buffer));
  Print("Response header:\n", stRspHdr);   
 
  string stSearch = "\"CompositeTrading\":{\"Last\":{\"Price\":{\"Iso\":\"PERCENT\",\"Value\":";
  i = StringFind(CharArrayToString(ch_Buffer), stSearch, 0); // 0 == position from which search starts 
  Print("Offset of ", stSearch, " == ", i); 
  if(i == -1) {Print(stSearch, " not found.");  return;}
  i += 61; // Advance index to known location of text representing bond yield.
  f_US = StringToDouble(StringSubstr(CharArrayToString(ch_Buffer), i, 8));
  Print("US 10-year T-note yield, stored in variable f_US: ", f_US);
}//END void OnStart()


Other bonds

In order to derive an interest rate differential, there need to be two interest rates. The scripts that fetch the yield on the US 10-year treasury note can be adapted to fetch the yield on the German 10-year government bond by substituting the ticker symbol "TMBMKDE-10Y" for "TMUBMUSD10Y" in the URLs. The ticker symbols for other nations can be used as well. Other financial web sites may use different ticker symbols, but the same principle applies. The German bond is often used as a proxy for the interest rate pertaining to the euro, but doing that neglects the influence of other nations of the European Union. However, a composite interest rate for the euro can be derived from two or more European government bonds. In this example, the 10-year government bond yields of the top three EU member nations, in terms of GDP, whose currency is the euro, will be used to derive a composite "European bond" rate. Those three nations are Germany, France, and Italy. Their bond yields will be weighted according to the relative sizes of their economies, shown in the following table.


2017 Gross Domestic Product, billions of euros

Germany   3,197
France    2,241
Italy     1,681
Total     7,119


The weighting factor of each country will be the ratio of its GDP to the total of the three. For Germany it is 0.449, France 0.315, and Italy 0.236. Those factors add up to 1 and are used as coefficients in calculating the composite value of the bond yield for the euro, according to the following equation.

f_EU = 0.449*f_DE + 0.315*f_FR + 0.236*f_IT

where f_EU is the composite European bond yield, f_DE is the German bond yield, f_FR is the French bond yield, and f_IT is the Italian bond yield.

Interest rate differential

The EUR/USD currency pair is quoted as the value of the euro in terms of the US dollar, so it tends to move in the same direction as the value of the euro and in the direction opposite to that of the dollar. For movement of the interest rate differential to forecast movement of the currency pair in the same direction, an increase in the European composite bond yield should move the interest rate differential in the positive direction, and an increase in the US T-note yield should move it in the negative direction. Therefore, the interest rate differential is calculated as f_EU - f_US.

At the time of writing this article, the calculated value of f_EU is 1.255, and the value of f_US is 3.142. Therefore, the interest rate differential is 1.255 - 3.142 = -1.887. Movement in the positive direction, say from -1.887 to -1.798, would forecast upward movement of EUR/USD. Movement in the negative direction, from -1.887 to -1.975, would forecast downward movement of EUR/USD.

The strength or reliability of the indicator depends on the size of the move of the interest rate differential. Movement of interest rates is usually denoted in basis points, or hundredths of a percentage point. Movement of the interest rate differential from -1.887 to -1.975 would be a move of 8.8 basis points in the negative direction, a fairly strong move on an intraday basis, likely indicating a downward move of the currency pair in the hourly time frame. A move of only one or two basis points is in the realm of market noise and unlikely to be a reliable indicator of movement of the currency pair.
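The arithmetic above can be sketched in a few lines, using the illustrative quotes from the text rather than live data:

```mql5
// Sketch: interest rate differential and its change in basis points.
// The yield values are the illustrative quotes from the text, not live data.
  double f_EU = 1.255;          // composite European bond yield, percent
  double f_US = 3.142;          // US 10-year T-note yield, percent

  double dDiff    = f_EU - f_US; // interest rate differential: -1.887
  double dDiffNew = -1.975;      // a later reading of the differential

  // Delta in basis points: one basis point is 0.01 percentage point.
  double dDeltaBp = (dDiffNew - dDiff) * 100.0; // -8.8 bp, a fairly strong intraday move

  Print("Differential: ", dDiff, "  Delta: ", dDeltaBp, " basis points");
```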

ScraperBot 05

This script fetches all four bond yields, calculates the composite European bond yield, and prints the interest rate differential to the terminal. It is modeled after ScraperBot 04, but instead of making a separate request to the server for each bond yield, all four ticker symbols are appended to the URL, and all four quotes are returned in one 14-kilobyte download containing four sequential blocks of text. ScraperBot 05 locates the ticker symbol in the file before locating the string preceding the quote, and it returns with an error message if a string is not found.

char ch_Buffer[], ch_Data[16];      // global buffers
float f_US, f_DE, f_FR, f_IT, f_EU; // global variables to store bond yields

void OnStart() 
{ int i;   
  string stURL = "https://api.wsj.net/api/dylan/quotes/v2/comp/quoteByDialect?dialect=official&needed=CompositeTrading|BluegrassChannels&"
                 "MaxInstrumentMatches=1&accept=application/json&EntitlementToken=cecc4267a0194af89ca343805a3e57af&ckey=cecc4267a0&"
                 "dialects=Charting&id=Bond-BX-TMUBMUSD10Y,Bond-BX-TMBMKDE-10Y,Bond-BX-TMBMKFR-10Y,Bond-BX-TMBMKIT-10Y"; // four ticker symbols

  string stHdr = "Accept: text/*\r\nUser-Agent: Mozilla/5.0"; // multiple headers are separated by CR-LF
  string stRspHdr; // response header

  i = WebRequest("GET",     // const string  method,    HTTP method 
                 stURL,     // const string  url,       URL 
                 stHdr,     // const string  headers,  
                 1024,      // int           timeout, 
                 ch_Data,   // const char    &data[],   the array of the HTTP message body 
                 ch_Buffer, // char          &result[], an array containing server response data 
                 stRspHdr); // string        &result_headers 

  Print("Server response code: ", i);
  if(i == -1) {Print("GetLastError == ", GetLastError());  return;}
  Print("Size of ch_Buffer (bytes downloaded) == ", ArraySize(ch_Buffer));
   
  string stSearch = "\"CompositeTrading\":{\"Last\":{\"Price\":{\"Iso\":\"PERCENT\",\"Value\":";

// Get US 10-year treasury note yield.
  i = StringFind(CharArrayToString(ch_Buffer),"\"Ticker\":\"TMUBMUSD10Y\"", 0); // 0 == position from which search starts 
  if(i == -1) {Print("\"Ticker\":\"TMUBMUSD10Y\" not found.");  return;}
  i = StringFind(CharArrayToString(ch_Buffer), stSearch, i); // i == position from which search starts 
  Print("Offset of ", stSearch, " == ", i); 
  if(i == -1) {Print(stSearch, " not found.");  return;}
  i += 61; // Advance index to known location of text representing bond yield.
  f_US = StringToDouble(StringSubstr(CharArrayToString(ch_Buffer), i, 8));
  Print("US 10-year T-note yield, stored in variable f_US: ", f_US);

// Get German 10-year government bond yield.
  i = StringFind(CharArrayToString(ch_Buffer),"\"Ticker\":\"TMBMKDE-10Y\"", i); // i == position from which search starts 
  if(i == -1) {Print("\"Ticker\":\"TMBMKDE-10Y\" not found.");  return;}
  i = StringFind(CharArrayToString(ch_Buffer), stSearch, i); // i == position from which search starts 
  Print("Offset of ", stSearch, " == ", i); 
  if(i == -1) {Print(stSearch, " not found.");  return;}
  i += 61; // Advance index to known location of text representing bond yield.
  f_DE = StringToDouble(StringSubstr(CharArrayToString(ch_Buffer), i, 8));
  Print("German 10-year government bond yield, stored in variable f_DE: ", f_DE);

// Get French 10-year government bond yield.
  i = StringFind(CharArrayToString(ch_Buffer),"\"Ticker\":\"TMBMKFR-10Y\"", i); // i == position from which search starts 
  if(i == -1) {Print("\"Ticker\":\"TMBMKFR-10Y\" not found.");  return;}
  i = StringFind(CharArrayToString(ch_Buffer), stSearch, i); // i == position from which search starts 
  Print("Offset of ", stSearch, " == ", i); 
  if(i == -1) {Print(stSearch, " not found.");  return;}
  i += 61; // Advance index to known location of text representing bond yield.
  f_FR = StringToDouble(StringSubstr(CharArrayToString(ch_Buffer), i, 8));
  Print("French 10-year government bond yield, stored in variable f_FR: ", f_FR);

// Get Italian 10-year government bond yield.
  i = StringFind(CharArrayToString(ch_Buffer),"\"Ticker\":\"TMBMKIT-10Y\"", i); // i == position from which search starts 
  if(i == -1) {Print("\"Ticker\":\"TMBMKIT-10Y\" not found.");  return;}
  i = StringFind(CharArrayToString(ch_Buffer), stSearch, i); // i == position from which search starts 
  Print("Offset of ", stSearch, " == ", i); 
  if(i == -1) {Print(stSearch, " not found.");  return;}
  i += 61; // Advance index to known location of text representing bond yield.
  f_IT = StringToDouble(StringSubstr(CharArrayToString(ch_Buffer), i, 8));
  Print("Italian 10-year government bond yield, stored in variable f_IT: ", f_IT);

// Calculate European composite bond yield.
  f_EU = 0.449*f_DE + 0.315*f_FR + 0.236*f_IT;
  Print("European composite bond yield: ", f_EU);

// Calculate interest rate differential.
  Print("Interest rate differential, f_EU-f_US = ", f_EU-f_US);
}//END void OnStart()
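The four nearly identical search blocks in ScraperBot 05 could be factored into a single helper. A sketch of such a function (the name GetYield is my own; it does not appear in the attached files):

```mql5
// Hypothetical helper: starting from index i, find a ticker symbol in the
// downloaded text and return the quote that follows it, or -1.0 on failure.
// On success, i is advanced past the quote so the next search can resume there.
double GetYield(const string stPage, const string stTicker, int &i)
{ string stSearch = "\"CompositeTrading\":{\"Last\":{\"Price\":{\"Iso\":\"PERCENT\",\"Value\":";
  i = StringFind(stPage, "\"Ticker\":\"" + stTicker + "\"", i);
  if(i == -1) return(-1.0);          // ticker symbol not found
  i = StringFind(stPage, stSearch, i);
  if(i == -1) return(-1.0);          // quote string not found
  i += 61;                           // advance index to the text representing the yield
  return(StringToDouble(StringSubstr(stPage, i, 8)));
}
```

With that helper, the body of OnStart() reduces to four calls such as: int i = 0; f_US = GetYield(CharArrayToString(ch_Buffer), "TMUBMUSD10Y", i); followed by the same calls for "TMBMKDE-10Y", "TMBMKFR-10Y", and "TMBMKIT-10Y".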

ScraperBot06.mq4 implements ScraperBot05.mq5 for the MT4 platform, using WinINet functions instead of WebRequest(), which was found to be unreliable on MT4.


Delta

Changes in interest rate differential are more relevant to trading a currency pair than the interest rate differential itself. Delta can be expressed as the value of the interest rate differential at the close of a bar minus its value at the close of the preceding bar. While that single-period delta may be useful on longer time frames, an exponential moving average of delta is more useful on shorter time frames, as it takes the change over many bars into account. An EMA is calculated by applying a smoothing factor or alpha to the current delta and adding that to (1-alpha) times the previous value of the EMA, designated EMAp.

EMA = a*Delta + (1-a)*EMAp

Because delta can be positive or negative and has a mean value of zero, EMAp can be initialized to zero in the case where no previous value of the EMA is available. The value of alpha can be arbitrarily assigned, or it can be calculated from an arbitrarily assigned number of periods of the moving average. In that case,

a = 2.0 / (n+1)

where n is the number of EMA periods, which can be a whole or fractional number. The normal range of alpha is greater than zero and less than or equal to 1, that is, 0 < a <= 1. When alpha is 1, the calculation does not take the previous value of the EMA into account, and the EMA is simply the current delta.
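The EMA recurrence translates directly into code. A minimal sketch, where the number of periods and the sequence of delta values are assumptions chosen for illustration:

```mql5
// Sketch: exponential moving average of delta.
  double n    = 10.0;             // assumed number of EMA periods
  double a    = 2.0 / (n + 1.0);  // smoothing factor alpha
  double dEMA = 0.0;              // EMAp initialized to zero, since delta has a mean of zero

  // Illustrative per-bar deltas of the interest rate differential, in basis points.
  double dDelta[] = {0.4, -0.2, 1.1, 0.6, -0.3};

  for(int i = 0; i < ArraySize(dDelta); i++)
     dEMA = a*dDelta[i] + (1.0 - a)*dEMA; // EMA = a*Delta + (1-a)*EMAp

  Print("EMA of delta: ", dEMA);
```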

If a historical database of the interest rate differential is available, an indicator can be coded to display a time series of the interest rate differential, or of an EMA of delta, on a chart of a currency pair. Indicators are not allowed to fetch data from the internet, but an indicator can fetch data from a local disk file that is updated by a script fetching bond yield data from the internet. Programming such an indicator is a topic for another article.
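One way to pass the data from a script to an indicator is through a file in the common data folder. A sketch, where the file name RateDiff.csv is my own choice and does not appear in the attached files:

```mql5
// Sketch: the scraper side writes the latest differential to a file in the
// common data folder, where programs on any chart can read it.
void WriteDifferential(double dDiff)
{ int h = FileOpen("RateDiff.csv", FILE_WRITE|FILE_CSV|FILE_COMMON);
  if(h == INVALID_HANDLE) return;
  FileWrite(h, dDiff);
  FileClose(h);
}

// ...and the indicator side reads it back, for example from OnCalculate().
double ReadDifferential()
{ int h = FileOpen("RateDiff.csv", FILE_READ|FILE_CSV|FILE_COMMON);
  if(h == INVALID_HANDLE) return(0.0);
  double dDiff = StringToDouble(FileReadString(h));
  FileClose(h);
  return(dDiff);
}
```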

Conclusion

Source code from the attached scripts can be copied into Expert Advisors to automate the collection of fundamental data. The techniques demonstrated in this article for scraping bond yield data from the web can be applied to other financial web sites and to currency pairs other than the one used in the examples. If a resource addressed by a URL changes and no longer works, the programmer has the knowledge and the means to diagnose the problem and find a solution. A well-behaved bot does not download more data than necessary and does not make requests too frequently, certainly not on every tick of a currency pair. Bond yields are less volatile than exchange rates, so once every five minutes is probably the highest update frequency needed for bond yields used as a trading indicator. Also, information obtained from web sites is not necessarily in the public domain and should not be redistributed. Used wisely, automated collection of fundamental data has the potential to make automated trading profitable.

Attached files |
ScraperBot01.mq5 (4.23 KB)
ScraperBot02.mq5 (5.66 KB)
ScraperBot03.mq5 (4.54 KB)
ScraperBot04.mq5 (2.59 KB)
ScraperBot05.mq5 (5.02 KB)
ScraperBot06.mq4 (7.48 KB)
Last comments | Go to discussion (7)
Steven Brown
Steven Brown | 15 Apr 2019 at 04:42
ScraperBot06.mq5 attached to this message is the same as ScraperBot05.mq5 but uses the functions in WinINet.dll instead of WebRequest(). I uploaded ScraperBot06.mq5 because WebRequest() in ScraperBot05.mq5 now returns error code -1, after which GetLastError() returns code 4014, 

ERR_FUNCTION_NOT_ALLOWED

4014

Function is not allowed for call

 

Metatrader 5 may have changed the implementation of WebRequest() since I submitted the article in October 2018, but the documentation for WebRequest() has not changed.

Edit: the problem is solved on my computer after allowing requests to the web server "https://api.wsj.net" in Metatrader options. Apparently, the permission was removed by a Metatrader update. 

jimjack
jimjack | 15 Apr 2019 at 15:31
Steven Brown:
ScraperBot06.mq5 attached to this message is the same as ScraperBot05.mq5 but uses the functions in WinINet.dll instead of WebRequest(). I uploaded ScraperBot06.mq5 because WebRequest() in ScraperBot05.mq5 now returns error code -1, after which GetLastError() returns code 4014, 

ERR_FUNCTION_NOT_ALLOWED

4014

Function is not allowed for call

 

Metatrader 5 may have changed the implementation of WebRequest() since I submitted the article in October 2018, but the documentation for WebRequest() has not changed.




thank you for your reply.

I compiled it again and there's bunch of warning saying  "possible loss of data due to type conversion"

still no luck on seeing anything in the terminal. dll is allowed.

do I need any other packages from microsoft, .net etc...?


thanks



Steven Brown
Steven Brown | 15 Apr 2019 at 16:22
IMPORTANT! The call to WebRequest() in ScraperBot04.mq5 and ScraperBot05.mq5 will work only if the server named in the web request is allowed in Metatrader options. To enable a web server, click on Tools\Options, and in the dialog box that appears, click on the tab Expert Advisors. Check "Allow web request for listed URL," click "Add new URL," and enter the base URL of the web server. In the case of ScraperBot05, that URL is "https://api.wsj.net." If you do not do that, the call to WebRequest() will fail, reporting error codes -1 and 4014. This need to specifically allow a web server is mentioned in the article under the heading "ScraperBot04." I wrote "In order for WebRequest to work, the base URL for the server, in this case 'https://api.wsj.net', needs to be included in the list of allowed servers under "Tools\Options\Expert Advisors" in the MetaTrader platform."

I had the web server "https://api.wsj.net" allowed in my copy of Metatrader 5, but it was removed from the list by an update to a new version of Metatrader 5. That should not happen, and it appears to be a bug that should be fixed by Metaquotes. Once a web server is allowed, it should stay allowed through updates.  
Steven Brown
Steven Brown | 15 Apr 2019 at 16:23
jimjack:




thank you for your reply.

I compiled it again and there's bunch of warning saying  "possible loss of data due to type conversion"

still no luck on seeing anything in the terminal. dll is allowed.

do I need any other packages from microsoft, .net etc...?


thanks



The compiler warnings about loss of data can be ignored, because they are the result of using type float instead of type double. I chose to use type float because the precision of type double is not required in this application. After the warnings, compilation succeeds with zero errors. Did you download, compile, and try the script I attached to my reply, the one named ScraperBot06.mq5? It uses WinINet.dll instead of WebRequest(). As for the call to WebRequest() failing in ScraperBot05, do you have the web server enabled in Metatrader options? See my previous post, the one that begins "IMPORTANT!" I wrote in the article that WebRequest() will fail unless the web server is specifically allowed in the Options.
jimjack
jimjack | 16 Apr 2019 at 14:19

hmmm,

let me re-install  metatrader,  because the url was allowed from the beginning. yea I allowed wininet.dll and checked my win (10) directories... its there.
