Overcoming Accessibility Problems in MQL5 Trading Tools (I)

Clemence Benjamin


Introduction

MetaTrader 5 and the MQL5 programming environment provide a robust foundation for designing trading systems, notifications, and automated workflows. By default, the terminal offers visual alerts, simple alert sounds, email notifications, push notifications to mobile devices, logging mechanisms, and access to embedded resources such as files and media. While these features confirm that an event has occurred, they typically lack contextual detail—standard alert sounds do not communicate whether the signal represents a buy, a sell, or another system state. This creates an opportunity for developers to extend the platform toward more informative and accessibility-aware communication.

Fully automated trading systems benefit significantly from these capabilities. They execute trades independently according to predefined logic, manage risk, and generate results that can be reviewed later without continuous chart monitoring. This autonomy is advantageous for any trader who cannot observe charts constantly, including visually impaired traders, busy professionals, and users with situational limitations. By decoupling observation from execution, fully automated systems reduce reliance on real-time visual feedback.

Semi-automated systems, however, present a more nuanced challenge. Although they automate analysis and signal detection, they often depend on human confirmation or decision-making. In most implementations, this interaction is conveyed almost exclusively through visual cues such as arrows, colors, labels, or chart overlays. Even when accompanied by basic alert sounds, the trader must still look at the chart to understand what the alert represents. This visual dependency introduces an accessibility gap for visually impaired users and reduces efficiency for traders who rely on audio-based feedback.

Visual impairment may be congenital, progressive, or sudden, arising from medical conditions, accidents, or age-related degeneration. Likewise, some traders may experience partial or temporary hearing limitations, operate in noisy environments, or require hands-free interaction due to multitasking. None of these conditions reduce a trader’s market understanding or strategic capability. There are no legal or ethical restrictions preventing impaired traders from participating in financial markets—the limitation lies in the interface, not the market itself. As developers, it is our responsibility to identify these barriers and address them programmatically.

The focus of this article is to explore accessibility-aware design in the context of trading indicators and semi-automated systems. In this first part, we demonstrate how to enhance system-to-user communication by leveraging MQL5 resources to provide contextual voice feedback that explains what the alert represents, rather than relying on generic tones alone. This approach improves clarity for visually impaired traders while also benefiting users who prefer audio-driven interaction. In future parts, we will explore extending this foundation with AI-assisted and generative voice technologies via external APIs. By the end of this article, you will understand how accessibility enhancements can be systematically integrated into trading tools without compromising performance, usability, or design integrity.



Justification of the Accessibility Problem

Accessibility in technology refers to the design of systems, interfaces, and interactions that can be effectively used by people with different physical, sensory, or cognitive abilities. In modern consumer technology, accessibility is no longer optional—it is an expected design principle. Smartphones, computers, and operating systems now include screen readers, voice assistants, high-contrast modes, adaptive input methods, haptic feedback, and customizable text sizing. These features allow users with visual or auditory impairments to operate complex systems independently, without relying on visual inspection or external assistance.

In financial trading software, however, accessibility has historically received far less attention. The dominant interaction model is built around charts, candlesticks, indicators, arrows, colors, and overlays. While this approach is efficient for fully sighted users, it implicitly assumes constant visual engagement. For traders who are visually impaired, temporarily unable to observe charts, or operating in non-ideal environments, this model introduces unnecessary barriers. The problem becomes more pronounced in semi-automated systems, where human confirmation or intervention is required after a signal is detected. Without alternative feedback mechanisms, critical information remains inaccessible when it matters most.

Accessibility challenges in trading extend beyond permanent visual impairment. Some traders may experience partial or progressive vision loss, hearing limitations, or situational impairments such as fatigue, illness, multitasking, or noisy environments. Others may simply prefer hands-free interaction while focusing on analysis, risk evaluation, or other tasks. Designing with accessibility in mind benefits not only impaired users but also busy professionals who cannot continuously monitor charts or notifications visually. Clear, structured communication—especially through contextual audio feedback—enhances efficiency and reduces cognitive load for all users.

The root causes of accessibility limitations in trading tools can be summarized as follows:

  • Heavy dependence on visual chart elements for conveying critical information
  • Generic alert sounds that lack contextual meaning
  • Time-sensitive interaction requirements in semi-automated systems
  • Limited use of alternative feedback channels such as structured audio or voice-based interaction

Within the MQL5 environment, these limitations are not imposed by the platform itself. On the contrary, MQL5 provides access to resource handling, logging, alerts, notifications, and media playback. The accessibility gap arises primarily from design assumptions—namely, that users will always observe charts visually. By leveraging MQL5’s existing capabilities, developers can translate internal system states into structured textual messages, pre-recorded audio cues, or contextual voice feedback that communicates what happened and why, rather than merely signaling that an event occurred.

Importantly, accessibility-aware design does not imply building separate systems for a specific group of users. Instead, it focuses on improving clarity, robustness, and usability for everyone. A system that communicates effectively through multiple channels is more resilient, more inclusive, and more efficient. This perspective aligns directly with the goals outlined in the introduction and sets the foundation for the practical implementation discussed in the next section, where we demonstrate how accessibility can be integrated into a simple indicator without compromising functionality or performance.



Implementation: Solution With MQL5 on a Simple Crossover Strategy

To illustrate accessibility-aware design, we implement a simple moving average (MA) crossover indicator in MQL5. This strategy involves two moving averages: a fast MA and a slow MA. When the fast MA crosses above the slow MA, the indicator generates a buy signal. Conversely, when the fast MA crosses below the slow MA, the indicator generates a sell signal. While the strategy itself is straightforward, the focus of this implementation is on communication, not trading performance.

MQL5 provides resources that allow developers to embed external files, including audio clips, directly into indicators or Expert Advisors. For accessibility purposes, short pre-recorded audio messages, such as "Buy signal detected" and "Sell signal detected", can be prepared with a program like Audacity. These files are then included as embedded resources and triggered programmatically when a crossover event occurs.
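The embedding itself is done with #resource directives. A minimal sketch, assuming the Sounds folder sits under the terminal's MQL5 directory as described later in this article (a leading backslash makes the path relative to the MQL5 root, and the leading backslash is dropped in the "::" runtime reference):

```mql5
// Embed the audio clips into the compiled indicator.
// The files must exist at compile time; names follow this article's Sounds folder.
#resource "\\Sounds\\welcome.wav"
#resource "\\Sounds\\buy.wav"
#resource "\\Sounds\\sell.wav"

// At runtime, an embedded resource is addressed with the "::" prefix:
// PlaySound("::Sounds\\buy.wav");
```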

In parallel, the indicator generates structured textual output using the PrintFormat(), Alert(), or Comment() functions. Each notification includes essential details such as:

  • The symbol being analyzed (e.g., EURUSD).
  • The timeframe (e.g., H1, M15).
  • The direction of the signal (BUY or SELL).
  • The values of the fast and slow MAs at the time of the crossover.
  • The price at which the crossover occurred.

For example, a notification might read:
"MA crossover detected. BUY signal on EURUSD H1. Fast MA 1.08452 crossed above Slow MA 1.08397 at price 1.08460."

The implementation follows these steps:

  • Define input parameters for the fast and slow moving averages.
  • Calculate MA values on each tick or completed bar.
  • Detect crossover events by comparing previous and current MA values.
  • Trigger structured text notifications and embedded audio playback.
  • Log each event for asynchronous review, compatible with screen readers.

This ensures that each signal is communicated through multiple channels, reducing dependency on visual observation. Traders who are visually impaired or cannot attend to the screen can rely on text logs, audio playback, or future integrations such as AI-driven voice synthesis for comprehensive feedback. Let's get started with the following steps.

1. Starting a New Indicator Project in MetaEditor

Every MQL5 project begins with intentional setup. We open MetaEditor and create a custom indicator. We choose a name: VoiceAlerts_MA_Crossover.

The name matters because it will appear in the navigator, on the chart, and in logs. A clear, descriptive name makes the tool self-documenting.

The wizard automatically generates a default template with:

  • A #property header for copyright, link, and version information
  • An OnInit() handler for one-time setup
  • An OnCalculate() event function where the indicator logic runs

At this point, the indicator does nothing yet. This is normal: we will build it step by step, keeping control over every feature.

2. Metadata and Properties

The first programming task is to define how MetaTrader will treat this indicator. Using #property directives, we specify:

  • Where to display it (indicator_chart_window)
  • Version, copyright, and link
  • Number of buffers and plots

These properties act as compile-time instructions. They’re not optional; they guide MetaTrader on rendering and memory management.

#property indicator_chart_window
#property version "1.02"
#property copyright "Copyright 2025, Clemence Benjamin"

At this stage, the indicator has structure but no logic.

3. Planning Buffers and Data Storage

Next, we decide what the indicator will calculate and store. Our tool requires:

  • Fast MA values
  • Slow MA values
  • ATR for arrow positioning

We declare three buffers:

double FastMABuffer[];
double SlowMABuffer[];
double ATRBuffer[];

We then bind these buffers to the indicator using SetIndexBuffer() and set them as time-series arrays so index 0 always corresponds to the latest bar:

SetIndexBuffer(0, FastMABuffer);
SetIndexBuffer(1, SlowMABuffer);
SetIndexBuffer(2, ATRBuffer);

ArraySetAsSeries(FastMABuffer, true);
ArraySetAsSeries(SlowMABuffer, true);
ArraySetAsSeries(ATRBuffer, true);

 Buffers are passive memory; the indicator must actively fill them.

4. Creating Indicator Handles

In MQL5, indicators are handle-based. We do not calculate values directly. Instead:

FastMAHandle = iMA(_Symbol, _Period, FastMAPeriod, 0, MaMethod, PriceType);
SlowMAHandle = iMA(_Symbol, _Period, SlowMAPeriod, 0, MaMethod, PriceType);
ATRHandle    = iATR(_Symbol, _Period, ATRPeriod);

Handles act as calculation engines, which we later query for data using CopyBuffer().
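The handle calls above reference input parameters and handle variables that must exist at file scope. A minimal sketch using the identifiers from this article (the default values are arbitrary choices, not values the article prescribes):

```mql5
// Illustrative input declarations matching the identifiers used in this article.
input int                FastMAPeriod       = 9;
input int                SlowMAPeriod       = 21;
input int                ATRPeriod          = 14;
input ENUM_MA_METHOD     MaMethod           = MODE_EMA;
input ENUM_APPLIED_PRICE PriceType          = PRICE_CLOSE;
input bool               EnableWelcomeSound = true;

// Handle variables, filled in OnInit()
int FastMAHandle = INVALID_HANDLE;
int SlowMAHandle = INVALID_HANDLE;
int ATRHandle    = INVALID_HANDLE;
```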

We validate handles immediately. This ensures the indicator fails early if resources are unavailable, preventing unpredictable behavior.
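A simple validation sketch inside OnInit():

```mql5
// Fail fast if any handle could not be created (e.g. invalid parameters).
if(FastMAHandle == INVALID_HANDLE ||
   SlowMAHandle == INVALID_HANDLE ||
   ATRHandle    == INVALID_HANDLE)
{
   Print("Failed to create indicator handles, error ", GetLastError());
   return(INIT_FAILED);   // aborts initialization; the indicator never runs half-configured
}
```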

5. Initialization and Accessibility Feedback

OnInit() is the constructor phase. Here we:

  • Bind buffers.
  • Create handles.
  • Optionally play a welcome sound.

if(EnableWelcomeSound)
    PlaySound("::Sounds\\welcome.wav");

This initial sound provides instant feedback for visually impaired traders, confirming that the indicator loaded successfully without requiring a chart check.

6. Persistent State Variables

We declare global variables to persist data across calls to OnCalculate():

datetime LastBarTime = 0;
datetime PendingSignalBar = 0;
string PendingDirection;

These variables:

  • Detect the opening of a new candle.
  • Store pending crossover signals.
  • Ensure one signal per bar.

The principle here: minimal and focused global state prevents bugs and makes logic easier to maintain.

7. Detecting New Bar Openings

Indicators are called multiple times per tick. To avoid multiple signals per bar, we implement new-bar detection:

if(time[0] != LastBarTime)
{
    LastBarTime = time[0];
    // execute logic
}

Why this is important:

  • Avoids repeated alerts
  • Ensures signals fire on confirmed bars
  • Supports stable backtesting

8. Copying Indicator Values

Once a new bar is detected, we retrieve MA and ATR values:

CopyBuffer(FastMAHandle, 0, 0, rates_total, FastMABuffer);
CopyBuffer(SlowMAHandle, 0, 0, rates_total, SlowMABuffer);
CopyBuffer(ATRHandle, 0, 0, rates_total, ATRBuffer);

MQL5 separates calculation (handle) from data access (buffer). This pattern applies to any handle-based indicator, not just MA or ATR.

9. Detecting MA Crossovers

We use closed-bar logic to prevent repainting:

int i = 1; // last closed bar
if(FastMABuffer[i+1] < SlowMABuffer[i+1] && FastMABuffer[i] > SlowMABuffer[i])
{
    PendingSignalBar = time[i];
    PendingDirection = "BUY";
}

This ensures signals are based on completed data, which is crucial for trustworthy alerts, particularly for visually impaired users relying on audio cues.
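The bearish case mirrors the bullish check: the fast MA was above the slow MA on the older bar and is now below it on the last closed bar:

```mql5
// Bearish crossover on closed bars (i = 1 is the last closed bar)
if(FastMABuffer[i+1] > SlowMABuffer[i+1] && FastMABuffer[i] < SlowMABuffer[i])
{
    PendingSignalBar = time[i];
    PendingDirection = "SELL";
}
```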

10. Delaying Alerts Until Next Bar Open

Rather than triggering immediately, we store the signal and execute it at the opening of the next bar:

FireSignal(PendingDirection, PendingSignalBar, high, low);

Benefits:

  • Prevents mid-bar noise
  • Provides consistent, readable audio
  • Matches cognitive expectations for sequential alerts
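The article calls FireSignal() but does not list its body. A hedged sketch under the assumption that it simply combines the arrow drawing and sound playback described in the next two sections:

```mql5
// Illustrative FireSignal() body; the actual helper may differ.
void FireSignal(string direction, datetime signal_time,
                const double &high[], const double &low[])
{
   // Locate the bar the signal refers to
   int bar = iBarShift(_Symbol, _Period, signal_time);
   if(bar < 0)
      return;

   // ATR-offset arrow so it does not overlap the candle
   double price = (direction == "BUY") ? low[bar]  - ATRBuffer[bar]*0.6
                                       : high[bar] + ATRBuffer[bar]*0.6;
   string name = "VA_" + direction + "_" + TimeToString(signal_time);
   ObjectCreate(0, name, OBJ_ARROW, 0, signal_time, price);

   // Contextual audio cue from the embedded resources
   PlaySound(direction == "BUY" ? "::Sounds\\buy.wav" : "::Sounds\\sell.wav");
}
```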

11. Drawing Wingdings Arrows

Arrows provide visual complements. Using ATR offsets ensures arrows do not overlap candles and remain visible even on large lookback periods:

double price = direction == "BUY" ? low[bar] - ATRBuffer[bar]*0.6
                                  : high[bar] + ATRBuffer[bar]*0.6;
ObjectCreate(0, name, OBJ_ARROW, 0, signal_time, price);

Arrows are color-coded for clarity: green for buy, red for sell.
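The color-coding and arrow glyphs are set through object properties. Wingdings codes 233 (up arrow) and 234 (down arrow) are common choices; the article does not specify its exact codes, so treat these as assumptions:

```mql5
// Style the arrow just created: glyph, color, and width by direction.
ObjectSetInteger(0, name, OBJPROP_ARROWCODE, direction == "BUY" ? 233 : 234);
ObjectSetInteger(0, name, OBJPROP_COLOR,     direction == "BUY" ? clrLime : clrRed);
ObjectSetInteger(0, name, OBJPROP_WIDTH, 2);
```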

12. Sound Alerts

We deliberately avoid Alert(). Instead, we use PlaySound() to ensure:

  • Only our custom sounds play.
  • No interference from terminal default alerts.
  • Full control of audio feedback.

This design is central to accessibility, giving visually impaired traders reliable cues.

13. Resource Cleanup

Finally, in OnDeinit(), we release all handles to clean up; this prevents memory leaks and ensures stable chart operation.

IndicatorRelease(FastMAHandle);
IndicatorRelease(SlowMAHandle);
IndicatorRelease(ATRHandle);

Preparing Audio Files for Voice Alerts

To enable the audio feedback functionality in the VoiceAlerts_MA_Crossover indicator, you must provide audio files that match the resource names referenced in the code. The indicator uses three core audio cues:

  • welcome.wav—played when the indicator is loaded.
  • buy.wav—played when a bullish MA crossover occurs.
  • sell.wav—played when a bearish MA crossover occurs.

Recording Audio

You have two main options:

1. Studio-quality recording (optional)

  • Use a microphone in a quiet environment.
  • Record short, clear phrases or tones corresponding to each event.
  • Keep the audio concise (1–3 seconds is sufficient).

2. Low-budget/DIY recording

  • Use a smartphone, PC microphone, or headset.
  • Record in any quiet space.
  • Even simple voice recordings are adequate for basic accessibility purposes.

Editing Audio

Regardless of the recording method, the audio should be:

  • Trimmed to remove silence at the start and end
  • Saved in .wav format (PCM 16-bit, 44100 Hz recommended)
  • Named exactly as used in the indicator: welcome.wav, buy.wav, sell.wav

Basic editing can be performed using free tools such as Audacity:

  • Open the recorded file in Audacity.
  • Use the trim tool to cut unnecessary silence.
  • Export as WAV (File → Export → Export as WAV).
  • Ensure file names match the code references exactly.

Alternative: Text-to-Speech (TTS) Generation

For those who prefer generated voices:

  • Use any open-source TTS tool or online service.
  • Generate short phrases: "Indicator Loaded," "Buy Signal," "Sell Signal."
  • Save each file as .wav with the exact names above.

Placement in MQL5 Terminal

After preparing the audio files:

1. Take the attached Sounds folder from this project.

2. Copy it into your MetaTrader 5 terminal directory under MQL5\Sounds.

3. Ensure the file structure remains intact and file names are correct.

Notice: if the audio file names do not match those referenced in the code exactly, the indicator will fail to find them, and compilation may fail because the files are embedded as resources at compile time.

By following these steps, you can ensure reliable audio notifications, which are essential for visually impaired traders or anyone relying on voice cues instead of chart visuals.



Expanding the System: Voice Feedback, Commands, and Future AI Integration

Beyond basic audio playback, there is significant potential to enhance accessibility and trader efficiency using interactive voice feedback and recognition. For traders who can hear, detailed voice notifications provide comprehensive signal information without requiring chart attention. Unlike standard platform tones or simple beeps, these notifications include signal type, symbol, timeframe, moving average values, price, and context. Busy traders, for instance, can maintain awareness of market events while multitasking or performing other activities.

Voice command integration represents another opportunity. By enabling traders to respond verbally to notifications, semi-automated systems can receive execution approvals, parameter adjustments, or queries without requiring keyboard or mouse input. For example, a trader could say, "Approve BUY signal EURUSD H1," which the system interprets to execute or queue a trade. This feature provides greater independence for visually impaired traders and improves efficiency for those who are busy or temporarily unable to access the chart.

The technology stack for such enhancements can include:

  • Local Text-to-Speech (TTS) engines for immediate feedback.
  • Voice recognition APIs for command interpretation.
  • Generative AI models to provide contextual explanations or signal analysis.
  • Optional cloud integration for multi-language support or advanced predictive insights.

By leveraging MQL5 resources, these interactions can be initiated by the indicator while maintaining platform independence. For instance, the indicator may write structured events to a file or messaging queue, which a local helper application monitors for TTS output or AI processing. This separation ensures compliance with platform security restrictions and maintains system stability.
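A minimal sketch of that file bridge, assuming a CSV file in the common data folder (the file name and helper function are illustrative, not part of the article's code):

```mql5
// Append one structured event per line; an external helper application can
// tail this file and run TTS or AI processing on each new line.
void WriteEventToBridge(string direction, double price)
{
   int fh = FileOpen("voice_bridge.csv",
                     FILE_WRITE|FILE_READ|FILE_CSV|FILE_COMMON);
   if(fh == INVALID_HANDLE)
      return;
   FileSeek(fh, 0, SEEK_END);   // append rather than overwrite
   FileWrite(fh, TimeToString(TimeCurrent()), _Symbol,
             EnumToString(_Period), direction,
             DoubleToString(price, _Digits));
   FileClose(fh);
}
```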

Such an approach is not only inclusive but also advantageous for traders without impairments. Detailed, voice-based feedback allows faster decision-making, reduces missed opportunities, and improves situational awareness during busy trading sessions. Future AI integration could enhance the system further, providing natural-language explanations, predictive risk assessment, and interactive learning for all traders.



Testing and Evaluation

Testing the VoiceAlerts_MA_Crossover indicator involves both historical and live verification. In the Strategy Tester (Expert tab), we confirmed that all MA crossover events were correctly detected and recorded in the textual logs. Audio alerts did not play in the Strategy Tester, so to test sound functionality, we applied the indicator on a live chart.

To verify audio notifications effectively:

  • Use a 1-minute timeframe so that MA crossovers occur without long waiting periods.
  • Allow the system to wait for the next bar to open after a crossover is detected; this ensures the audio plays in sync with the indicator’s delayed alert logic.
  • The successful playback of the initialization welcome audio confirms that the audio system is functioning, while subsequent buy/sell sounds demonstrate proper event notification.

This approach ensures that both textual logs and audio alerts are working as intended, providing reliable feedback for visually impaired users or traders relying on voice cues.

Follow the video below, where I demonstrate how to attach the indicator to a chart and confirm the initialization audio playback.



Conclusion

Using a simple moving average crossover indicator as a demonstration, we successfully showed how contextual audio feedback can be integrated into an MQL5 indicator by leveraging the platform’s resource management and sound playback capabilities. The indicator provides immediate feedback during initialization, confirmed by the welcome audio demonstrated in the attached video, and delivers meaningful voice alerts that explain what event has occurred rather than relying on non-descriptive terminal tones. This approach extends the default alerting system of MetaTrader 5 without replacing it, enhancing clarity and usability for a broader range of traders.

To ensure correct operation, you must place all project files in their appropriate directories. The indicator source file must be saved and compiled normally in the Indicators folder, while the accompanying Sounds folder must be copied exactly as provided into the MetaTrader 5 terminal’s MQL5\Sounds directory. The sound file names must match those referenced in the code precisely; otherwise, the compiler or runtime playback will fail. This step is essential for successful compilation and proper audio execution.

With this foundational example in place, we have established a practical framework for accessibility-aware system design in MQL5. In future parts, this foundation can be extended through deeper integrations with voice processing engines, text-to-speech systems, and external APIs, enabling more advanced, real-time, and interactive assistant-like behavior embedded directly into trading tools.

Ultimately, accessibility-driven design leads to more robust, inclusive, and user-centric systems. By communicating clearly across multiple channels—visual, textual, and audio—we create trading tools that adapt to the trader, rather than forcing the trader to adapt to the tool. Accessibility is not an optional feature; it is an opportunity to improve communication, reliability, and the overall trading experience for every market participant.


Attachments

Source Filename | Type | Version | Description
VoiceAlerts_MA_Crossover | Indicator | 1.0 | Accessibility-focused moving average crossover indicator that provides visual Wingdings arrows and custom audio alerts for buy/sell signals. Designed for both visually impaired traders and those preferring voice notifications. Includes delayed alerts on the next bar open to prevent repainting.
Sounds | Folder | N/A | Contains all WAV audio files used by the indicator: welcome.wav, buy.wav, and sell.wav. Must be placed in the terminal's MQL5\Sounds folder to ensure correct audio playback. Files can be recorded manually or generated via TTS, as long as names match exactly.
Attached files: Sounds.zip (346.63 KB)