Building a Trade Analytics System (Part 1): Foundation and System Architecture
Introduction
MetaTrader 5 stores trade history in the terminal, which can become a bottleneck when you need a reproducible trading journal or robust strategy analytics. If you want automated, queryable records keyed by symbol, magic number, order/deal/position tickets, or to run consistent reports across strategies, keeping data confined to MetaTrader 5 forces manual exports and ad‑hoc scripts. This first article addresses that concrete pain: we design a minimal, reliable pipeline to move closed trades out of the terminal and into an external store.
Rather than attempt a full multi‑user platform, Part 1 defines the target flow (MetaTrader 5 → Expert Advisor → HTTP API → Database) and delivers the backend scaffold to receive and persist trade records. It clarifies the contract clients will use and implements the Flask/SQLite foundation you need to continue integration. This article intentionally stops short of MetaTrader 5 integration: the Expert Advisor and the POST payload handling are reserved for subsequent parts, so expectations for Part 1 remain focused and achievable.
Prerequisites
Before we proceed, it is helpful to highlight a few basic requirements for following along with this series. This project brings together two environments: Python for the backend and MetaTrader 5 for the trading side. For that reason, a basic understanding of Python programming and general familiarity with MQL5 development will make the learning process smoother.
You do not need to be an expert in either. If you are comfortable reading simple Python code and understand how Expert Advisors work in MetaTrader 5, you will be able to follow along without difficulty. If some parts feel new, that is completely fine. The series is structured in a way that builds step-by-step, and each section focuses on practical implementation rather than theory. You are encouraged to move forward, experiment with the code, and learn through practice as the system gradually comes together.
System Design
The system is designed as a simple external trade analytics pipeline that begins in MetaTrader 5 and ends in a database-backed analytics environment. Its purpose is to capture trading activity from the terminal, move that data out in a structured form, and prepare it for later analysis.
At a high level, the system is made up of four main components:
- MetaTrader 5 terminal
- MQL5 Expert Advisor
- Flask Web Server API
- Database
The MetaTrader 5 terminal is where trading activity takes place. It is the source of the data, but it is not responsible for storage outside the terminal or for external analysis.
The MQL5 Expert Advisor acts as the collection layer. Its role is to observe relevant trading events, extract the necessary trade information, and prepare that information for transmission. The EA does not perform the analysis itself. Its job is to bridge the terminal and the external system.
The Flask API acts as the receiving layer. It accepts incoming trade data from the EA, processes the request, and prepares the data for storage. Introducing this backend creates a clean separation between the terminal side and the analytics side of the system. It also establishes a foundation that can be extended later into more advanced architectures.
The database acts as the persistent storage layer. Once trade records are received by the backend, they are committed to the database so that they can be queried, analyzed, and reused later. This transforms isolated trade activity into structured data that can support an external trading journal and analytics workflow.
In this series, the workflow starts when MetaTrader 5 captures relevant trade data. The data is packaged in a structured format, sent to the Flask API via HTTP, and stored in the database. This creates a clear and maintainable path from trade execution to external analysis.
The main design goal is simplicity: each component has a single responsibility. MetaTrader 5 generates trading activity; the EA captures and sends data; the backend receives it; and the database stores it. By separating these responsibilities early, the system becomes easier to understand, implement, and extend in later stages of the series.
Backend
In this section, we begin building the backend component of our system. This is a crucial step because the backend is responsible for receiving trade data from MetaTrader 5 and preparing it for storage in our database.
To keep the development process consistent and predictable, we will build this backend in a Linux environment, specifically using the Ubuntu distribution.
Development Environment Setup
If you are using a Windows machine, it is recommended to install the Windows Subsystem for Linux (WSL). WSL allows you to run a Linux environment directly within Windows. By default, it installs an Ubuntu distribution, which is sufficient for this project.
Once Ubuntu is installed, open the terminal and follow the steps below.
Step 1: Create the Project Directory
Create a new directory for the backend project:
mkdir mt5TradeAnalyticsBackend
Then navigate into the directory:
cd mt5TradeAnalyticsBackend
This will serve as the root folder for our backend application.
Step 2: Update Package Index
Before installing any software, update the package index:
sudo apt update
This ensures that your system has access to the latest available packages.
Step 3: Verify Python Installation
Check whether Python 3.10 or later is installed:
python3 --version
If installed correctly, you should see output similar to:
Python 3.12.3
If Python is not installed, you can install it using:
sudo apt install python3
Step 4: Create a Virtual Environment
A virtual environment allows you to isolate your project dependencies from the global Python installation. This is important because it prevents version conflicts and keeps your project clean and reproducible.
Create a virtual environment using:
python3 -m venv .venv
Note: On some Ubuntu systems, you may need to install the venv module first:
sudo apt install python3-venv
Step 5: Activate the Virtual Environment
Activate the environment with:
. .venv/bin/activate
Once activated, your terminal prompt will typically show (.venv) indicating that you are now working inside the virtual environment.
Step 6: Install Project Dependencies
Now we install the required libraries.
Install Flask:
pip install Flask
Flask is a lightweight web framework that will allow us to create API endpoints for receiving trade data.
When Flask is installed, it automatically installs several core dependencies, including:
- Jinja2
- Werkzeug
- click
- itsdangerous
- MarkupSafe
- blinker
These libraries handle templating, request processing, CLI tools, and internal framework operations.
Next, install Flask-SQLAlchemy:
pip install -U Flask-SQLAlchemy
Flask-SQLAlchemy provides a simplified interface for working with databases using SQLAlchemy. It allows us to define models, interact with the SQLite database, and perform operations such as inserting and querying records more easily.
Readers who would like a more in-depth understanding of installation and usage are encouraged to refer to the official Flask documentation, where detailed and up-to-date guidance is provided directly from the source.
Data Modeling
To store and analyze trading activity outside MetaTrader 5, we need a structured representation of each trade. Instead of relying on the terminal’s internal history, we define a format that can be stored, queried, and reused efficiently.
For this project, we will use an SQLite database. SQLite is a lightweight, file-based database engine that comes bundled with Python through the built-in sqlite3 module. This means no additional installation or server setup is required. All data is stored in a single file on disk, making it simple to manage and ideal for building a foundation-level system like the one in this series.
Using SQLite allows us to:
- Define tables using standard SQL
- Persist trade data outside the terminal
- Query and analyze results later
- Integrate seamlessly with a Flask backend
Table: trades
We will define a single table named trades, which will act as the core storage for all captured trading activity. Each row represents one completed trade.
In addition to the basic trade fields, we also include several MetaTrader 5–specific identifiers such as order, deal, and position tickets. These help align our data model with how MetaTrader 5 internally represents trading activity and provide better traceability for analysis later.
The table will include the following fields:
- id — a unique identifier for each record, generated automatically,
- symbol — the traded instrument, such as EURUSD or XAUUSD,
- order_ticket — the identifier of the order associated with the trade,
- deal_ticket — the identifier of the executed deal,
- position_ticket — the identifier of the position to which the trade belongs,
- position_type — the direction of the trade, typically BUY or SELL,
- position_reason — indicates how the position was initiated, for example by an Expert Advisor or manual execution,
- lot_size — the trade volume,
- entry_price — the price at which the trade was opened,
- stop_loss — the stop loss level (this is optional),
- take_profit — the take profit level (this is also optional),
- profit — the final result of the trade after it has been closed,
- open_time — the time the trade was opened,
- close_time — the time the trade was closed,
- magic_number — used to identify trades placed by different strategies or Expert Advisors.
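To make the field list above concrete, here is a rough sketch of how the trades table could be defined with standard SQL using Python's built-in sqlite3 module. This is an illustration only: the column types shown (TEXT for timestamps, REAL for prices) are assumptions for the sketch, and in the actual application the table will be generated by Flask-SQLAlchemy from a model instead.

```python
import sqlite3

# Illustrative only: in the real application Flask-SQLAlchemy
# creates this table from a model definition.
SCHEMA = """
CREATE TABLE IF NOT EXISTS trades (
    id              INTEGER PRIMARY KEY AUTOINCREMENT,
    symbol          TEXT    NOT NULL,
    order_ticket    INTEGER,
    deal_ticket     INTEGER,
    position_ticket INTEGER,
    position_type   TEXT    NOT NULL,
    position_reason TEXT,
    lot_size        REAL    NOT NULL,
    entry_price     REAL    NOT NULL,
    stop_loss       REAL,
    take_profit     REAL,
    profit          REAL    NOT NULL,
    open_time       TEXT    NOT NULL,
    close_time      TEXT    NOT NULL,
    magic_number    INTEGER
);
"""

# Use an in-memory database so nothing is written to disk.
conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)

# Confirm the table and its columns exist.
columns = [row[1] for row in conn.execute("PRAGMA table_info(trades)")]
print(columns)
conn.close()
```

Running this prints the fifteen column names in the order defined above, which is a quick way to sanity-check the schema before moving on.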
Now, we are going to get our hands dirty and start writing the app. Let us start by setting up our project properly.
Setting Up the Application
Navigate to the server-side project directory (mt5TradeAnalyticsBackend) and create an empty .py file for the backend source code. In the terminal's command line, run the following command:
touch app.py
Open the file using your preferred text editor and paste the following starter code:
from datetime import datetime

from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import DateTime, Float, Integer, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

# Create the Flask application
app = Flask(__name__)


@app.route("/")
def index():
    return "<p>MT5 Trade Analytics Backend is running.</p>"
Understanding the Boilerplate Code
Let us briefly break down what this code does. The Flask class is used to create the main application instance. The @app.route("/") decorator defines a route that responds when the root URL is accessed. The index function returns a simple message confirming that the backend is running.
The imports at the top prepare us for working with:
- JSON requests (request, jsonify)
- Database models (SQLAlchemy, mapped_column)
- Data types such as Integer, Float, and DateTime
At this stage, we have a minimal web server ready. Next, we extend it to support database operations.
Configuring the Database
Immediately after creating the Flask application, configure the database:
# Configure the SQLite database
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///data.db"
app.config["SQLALCHEMY_TRACK_MODIFICATIONS"] = False
This tells Flask-SQLAlchemy to:
- Use an SQLite database
- Store data in a file named data.db in the project directory
- Disable unnecessary tracking features for better performance
Initializing the Database
Next, we define a base class and initialize the database object.
# Define the base class for declarative models
class Base(DeclarativeBase):
    pass
Create the SQLAlchemy object:
# Create the SQLAlchemy object
db = SQLAlchemy(model_class=Base)
Initialize the app with the extension:
# Initialize the app with the extension
db.init_app(app)
Here is what is happening:
Base is the foundation for all models we will define. db becomes the central object used to interact with the database.
db.init_app(app) connects the database system to our Flask application.
Defining the Trade Model
We now define the structure of our trades table using a Python class.
class Trade(db.Model):
    __tablename__ = "trades"

    id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
    symbol: Mapped[str] = mapped_column(String(20), nullable=False)
    order_ticket: Mapped[int] = mapped_column(Integer, nullable=True)
    deal_ticket: Mapped[int] = mapped_column(Integer, nullable=True)
    position_ticket: Mapped[int] = mapped_column(Integer, nullable=True)
    position_type: Mapped[str] = mapped_column(String(10), nullable=False)
    position_reason: Mapped[str] = mapped_column(String(50), nullable=True)
    lot_size: Mapped[float] = mapped_column(Float, nullable=False)
    entry_price: Mapped[float] = mapped_column(Float, nullable=False)
    stop_loss: Mapped[float] = mapped_column(Float, nullable=True)
    take_profit: Mapped[float] = mapped_column(Float, nullable=True)
    profit: Mapped[float] = mapped_column(Float, nullable=False)
    open_time: Mapped[datetime] = mapped_column(DateTime, nullable=False)
    close_time: Mapped[datetime] = mapped_column(DateTime, nullable=False)
    magic_number: Mapped[int] = mapped_column(Integer, nullable=True)

    def __repr__(self) -> str:
        return (
            f"<Trade id={self.id} symbol={self.symbol} "
            f"position_type={self.position_type} profit={self.profit}>"
        )
Creating the Database
Finally, we create the database and table structure:
with app.app_context():
    db.create_all()
This ensures that:
- The data.db file is created if it does not exist
- The trades table is created based on the model definition
This operation only needs to run once when the application is initialized.
API Planning
With the database model now in place, the next step is to define how external clients will interact with our backend. In this project, the primary client is the MetaTrader 5 terminal through an Expert Advisor that will eventually send final trade data once a position has been closed.
At this stage, our goal is not to implement all endpoint logic in full, but to establish a clean and modular API structure that the rest of the system can grow around. This gives the application a clear contract early enough and makes later development easier to organize.
To better understand how the system operates, consider a simple flow. When a position is closed in MetaTrader 5, the Expert Advisor will collect all relevant trade data and send it as a JSON payload to the /api/v1/trades endpoint. The backend receives this request, parses the data, and prepares it for storage in the database. This one-directional flow keeps the system simple and reliable while ensuring that all completed trades are captured consistently.
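As an illustration of that flow, the closed-trade payload might look like the example below. The field names mirror the trades table defined earlier, but the exact values and the final payload contract will only be fixed when the Expert Advisor is implemented in a later part, so treat this as a sketch rather than the definitive format.

```python
import json

# Hypothetical example of a closed-trade payload; the final format
# will be defined together with the Expert Advisor in a later part.
payload = {
    "symbol": "EURUSD",
    "order_ticket": 123456,
    "deal_ticket": 654321,
    "position_ticket": 123456,
    "position_type": "BUY",
    "position_reason": "EXPERT",
    "lot_size": 0.10,
    "entry_price": 1.08542,
    "stop_loss": 1.08242,
    "take_profit": 1.09142,
    "profit": 12.50,
    "open_time": "2024-05-01 10:15:00",
    "close_time": "2024-05-01 14:40:00",
    "magic_number": 900001,
}

# This is the JSON body the EA would POST to /api/v1/trades.
body = json.dumps(payload)
print(body)
```

On the backend side, Flask's request.get_json() would turn this body back into a dictionary with the same keys.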
The first small change we make is to import redirect from Flask:
from flask import Flask, jsonify, request, redirect
This allows us to redirect requests made to the root route / into our versioned API root. We define the route as follows:
@app.route("/")
def index():
    return redirect("/api/v1")
This is a small but useful touch. Instead of leaving the root route unused, we make it forward traffic to the main API namespace, which in our case is /api/v1.
API Root
We begin by defining the root of version 1 of the API:
# 1. API Root
@app.route("/api/v1")
def api_root():
    return "<p>MT5 Trade Analytics API v1 is running.</p>"
This endpoint serves as a simple entry point into the application. Its purpose is mainly informational. When accessed through a browser, it confirms that the API is available and that version 1 is active.
Health Check Endpoint
The next endpoint is the health check:
# 2. Health Check
@app.route("/api/v1/health")
def health_check():
    return jsonify({"status": "running"})
This route provides a machine-friendly way to confirm that the backend is operational. While the API root is mainly useful in the browser, the health endpoint is more suitable for quick checks by tools, scripts, or even later by the MQL5 terminal if needed.
Currently, it returns a very small JSON object containing a single status field. This is enough for now, and it keeps the response simple.
Trade Creation Endpoint
The most important endpoint in the current design is the one responsible for receiving final trade data:
# 3. Create Trade (Core Endpoint)
@app.route("/api/v1/trades", methods=["POST"])
def create_trade():
    return jsonify({
        "message": "Trade endpoint ready. Logic will be implemented later."
    })
This is the core endpoint of the application. Once a position has been closed in MetaTrader 5, the Expert Advisor will send the final trade record to this route. Since all required fields are known at close time, a single submission is enough. This keeps the integration simple and avoids unnecessary complexity around partial updates or intermediate lifecycle states.
For now, the endpoint only returns a placeholder response. Later in the series, we will expand it so that it accepts incoming JSON, validates the payload, creates a Trade object, and commits the record to the database.
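To preview the kind of validation this endpoint will eventually need, here is a minimal, framework-free sketch of a payload check. The helper name validate_trade_payload and the exact required-field list are assumptions for illustration only; the real implementation arrives in a later part and will also create the Trade object and commit it.

```python
# Minimal validation sketch; validate_trade_payload is a hypothetical
# helper, not part of the final implementation.
REQUIRED_FIELDS = {
    "symbol", "position_type", "lot_size",
    "entry_price", "profit", "open_time", "close_time",
}

def validate_trade_payload(payload: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload passes."""
    errors = [
        f"missing field: {name}"
        for name in sorted(REQUIRED_FIELDS - payload.keys())
    ]
    if payload.get("position_type") not in (None, "BUY", "SELL"):
        errors.append("position_type must be BUY or SELL")
    return errors

# A complete payload passes; an incomplete one reports what is missing.
good = {
    "symbol": "EURUSD", "position_type": "SELL", "lot_size": 0.2,
    "entry_price": 1.0850, "profit": -4.1,
    "open_time": "2024-05-01 10:15:00", "close_time": "2024-05-01 11:00:00",
}
print(validate_trade_payload(good))          # []
print(validate_trade_payload({"symbol": "EURUSD"}))
```

Keeping validation in a plain function like this makes it easy to unit-test independently of Flask before wiring it into the route.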
Retrieve All Trades Endpoint
We also define an endpoint for retrieving all stored trades:
# 4. Get All Trades
@app.route("/api/v1/trades", methods=["GET"])
def get_trades():
    return jsonify({
        "message": "Retrieve all trades - not implemented yet."
    })
This route will later be used to fetch the collection of all recorded trades from the database. It will become useful when we begin inspecting stored data, testing the backend, and eventually building the analytics layer.
Even though the implementation is not yet complete, defining the route now makes the structure of the API more coherent. The same resource, /api/v1/trades, can now support both creation through POST and retrieval through GET.
Retrieve a Single Trade Endpoint
The next endpoint allows us to retrieve one trade record using its internal identifier:
# 5. Get Single Trade
@app.route("/api/v1/trades/<int:trade_id>", methods=["GET"])
def get_trade(trade_id):
    return jsonify({
        "message": f"Retrieve trade {trade_id} - not implemented yet."
    })
This route introduces a dynamic path parameter, trade_id, which identifies the specific record being requested. Later on, this will allow us to inspect individual trades directly from the database.
Defining this route now is useful because it makes the API more complete and gives us a dedicated path for record-level inspection without overloading the collection endpoint.
Analytics Summary Endpoint
Finally, we define an endpoint for analytics:
# 6. Analytics Summary
@app.route("/api/v1/analytics/summary", methods=["GET"])
def analytics_summary():
    return jsonify({
        "message": "Analytics summary - not implemented yet."
    })
This route is separate from the trade collection because it represents processed information rather than raw trade records. Its purpose will be to return summary statistics derived from the stored data, such as total trades, win rate, or cumulative profit. By separating analytics from the trade resource itself, the API remains cleaner and easier to extend later.
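To make the planned statistics concrete, here is a small, self-contained sketch of how such a summary could be computed from a list of profit values. The function name and return shape are assumptions for illustration; the real endpoint will query the trades table in the database rather than take a plain list.

```python
# Hypothetical sketch of the summary statistics the endpoint will expose.
def summarize_profits(profits: list[float]) -> dict:
    total = len(profits)
    wins = sum(1 for p in profits if p > 0)
    return {
        "total_trades": total,
        # Win rate as a percentage; guard against division by zero.
        "win_rate": round(wins / total * 100, 2) if total else 0.0,
        "cumulative_profit": round(sum(profits), 2),
    }

# Example: three winners out of four closed trades.
print(summarize_profits([12.5, -4.1, 7.0, 3.3]))
```

Because the logic is a pure function, it can later be reused unchanged inside the route once trade records are being stored.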
Final API Structure
At this point, the application defines the following routes:
1. GET /
Redirects to the versioned API root
2. GET /api/v1
Returns a simple message confirming the API is running
3. GET /api/v1/health
Returns a JSON status response
4. POST /api/v1/trades
Receives final trade data from MetaTrader 5
5. GET /api/v1/trades
Returns all stored trades
6. GET /api/v1/trades/<trade_id>
Returns one stored trade by identifier
7. GET /api/v1/analytics/summary
Returns summary analytics derived from stored trade data.
This design remains intentionally simple. It gives the project a clear and modular API without introducing unnecessary complexity at this early stage. More importantly, it aligns with the overall system design we established earlier: final trade data will be sent once a position is closed, stored in the database, and later made available for retrieval and analysis.
All communication between MetaTrader 5 and the backend uses JSON. The Expert Advisor sends trade data as a JSON payload via HTTP POST. Flask parses the payload (request) and stores the relevant fields in the database. The API responses are returned as JSON via jsonify.
Testing the Local Backend Environment
With the API structure and database model in place, the next step is to run the application locally and confirm that everything behaves as expected. This step is important because it verifies that the backend is correctly configured before we begin integrating it with MetaTrader 5.
Begin by navigating to your project directory and activating your virtual environment. This ensures that all dependencies, such as Flask and Flask-SQLAlchemy, are available within an isolated environment.
Once the environment is active, run the following command in your terminal:
flask --app app run --host 0.0.0.0 --debug
This command starts the Flask development server. --app app sets app.py as the entry point. --host 0.0.0.0 exposes the app on the local network as well as on the local machine. This becomes useful later if you want to connect to the backend from another device on the same network.
The --debug flag enables debug mode. In this mode, Flask automatically reloads the server whenever changes are made to the source code. It also provides helpful error messages during development, making it easier to identify and resolve issues quickly.
If the server starts successfully, Flask will display a message in the terminal indicating that it is running and listening on a local address. At this point, open your web browser and navigate to the following address:
http://127.0.0.1:5000/
This is the default address where the Flask application is served. When you access this route, you will be automatically redirected to:
http://127.0.0.1:5000/api/v1

This happens because we configured the root route to redirect to the API root. On the screen, you should see a simple message confirming that the API is running. While the output is minimal, it confirms that the routing system is working correctly and that the application is responding as expected.
At this point, there is one more important observation to make. If you inspect your project directory, you will notice that a new folder named instance has been created. Inside this folder, you will find the database file named data.db.
This confirms that Flask-SQLAlchemy has successfully initialized the SQLite database and created the required table based on the model we defined earlier. The database is stored locally as a file, which makes it easy to inspect, back up, or reset during development.
This step verifies the essentials: the app runs, routing works, and the database is created successfully. With this foundation confirmed, we are ready to proceed to the next phase, where we will begin implementing the logic that allows the backend to receive and store real trade data. The complete source code for this stage is provided as app.py, so you can replicate the setup and follow along with the development process.
Conclusion
In this article, you built a practical foundation for an external trade analytics pipeline. Concretely, you now have:
- A Flask application (app.py) and the project skeleton,
- A SQLite-backed data model with a created trades table including MetaTrader 5 identifiers (order, deal, position tickets), position fields, P&L, open/close times and magic number,
- A clear API contract under /api/v1 (root, health, GET/POST /api/v1/trades, GET /api/v1/trades/<trade_id>, and GET /api/v1/analytics/summary),
- A local verification step: running flask --app app run --host 0.0.0.0 --debug launches the server and creates instance/data.db.
Note what is intentionally unfinished: the POST /api/v1/trades endpoint currently returns a placeholder, and the MetaTrader 5 Expert Advisor that sends closed‑trade JSON is not yet implemented. Those gaps are deliberate—the goal here was to provide a testable, well-documented backend and persistent schema so that the integration work is straightforward and repeatable.
In Part 2 we will implement the EA, define the exact JSON payload, and complete the POST handling (validation and database insert) so closed trades flow from MetaTrader 5 into this backend automatically.
Warning: All rights to these materials are reserved by MetaQuotes Ltd. Copying or reprinting of these materials in whole or in part is prohibited.
This article was written by a user of the site and reflects their personal views. MetaQuotes Ltd is not responsible for the accuracy of the information presented, nor for any consequences resulting from the use of the solutions, strategies or recommendations described.