Discussing the article: "Low-Frequency Quantitative Strategies in Metatrader 5: (Part 1) Setting Up An OLAP-Friendly Data Store"


Check out the new article: Low-Frequency Quantitative Strategies in Metatrader 5: (Part 1) Setting Up An OLAP-Friendly Data Store.

The article outlines a practical data pipeline for quantitative analysis based on Parquet storage, Hive-style partitions, and DuckDB. It details migrating selected SQLite tables to Parquet, structuring market data by source, symbol, timeframe, and date, and querying it with SQL window functions. A Golden Cross example illustrates cross-symbol evaluation of forward returns. Accompanying Python scripts handle data download, conversion, and execution.

We have just finished a series of articles introducing the basic concepts of statistical arbitrage to the average retail trader armed with only a consumer notebook, a regular internet connection, and limited capital. To make this possible, we kept our focus on low-frequency mean-reversion strategies, since high-frequency trading (HFT) remains a capital-intensive domain, usually restricted to institutional players. We hope to have shown that retail traders can succeed in low-frequency quantitative trading, provided they have enough data, since the required computational power is already broadly available. Enough data means the data required to find niche opportunities; enough computational power means hardware and software capable of processing that data in the required time.

In the last article of that series, we suggested adding a specialized, free, open-source database for data analysis, to be used in tandem with SQLite in our pipeline. The main idea behind this suggestion is that, as our dataset evolves and grows, an Online Transaction Processing (OLTP) system like SQLite is no longer the best tool for data analysis, even though it served well while we were only demonstrating the introductory concepts. In other words, as we progress towards a real trading system, we need dedicated data analysis tools, the first and arguably most relevant being a specialized Online Analytical Processing (OLAP) database.

Now, in this new series, which you can think of as a new chapter of the same story, it is time to start implementing this OLAP-friendly data analysis system. However, if you followed that series, running the tools and experimenting with the provided scripts and statistical methods, you may already have a lot of data stored in your SQLite database, and some of it may be the results of previous analyses, such as the outputs of the cointegration tests and the scoring system. You may want to preserve this data by migrating or converting it to the new system. And even if you think this data is irrelevant, at some point in the future you may need to import data from other database systems (Postgres, MySQL, etc.), including SQLite. So, our first task in this new chapter is to show how fast and straightforward this migration can be.