Discussing the article: "Hidden Markov Models for Trend-Following Volatility Prediction"


Check out the new article: Hidden Markov Models for Trend-Following Volatility Prediction.

Hidden Markov Models (HMMs) are powerful statistical tools that identify underlying market states by analyzing observable price movements. In trading, HMMs enhance volatility prediction and inform trend-following strategies by modeling and anticipating shifts in market regimes. In this article, we will present the complete procedure for developing a trend-following strategy that uses HMM-based volatility predictions as a filter.

In the book Evidence-Based Technical Analysis, David Aronson suggests that traders develop their strategies using scientific methods. This process begins with forming a hypothesis based on the intuition behind the idea, then rigorously testing it to avoid data-snooping bias. In this article, we will try to do the same. First, we must understand what a Hidden Markov Model is and why it could benefit our strategy development.

A Hidden Markov Model (HMM) is an unsupervised machine learning model that represents systems where the underlying state is hidden, but can be inferred through observable events or data. It is based on the Markov assumption, which posits that the system's future state depends only on its present state and not on its past states. In an HMM, the system is modeled as a set of discrete states, with each state having a certain probability of transitioning to another state. These transitions are governed by a set of probabilities known as the transition probabilities. The observed data (such as asset prices or market returns) are generated by the system, but the states themselves are not directly observable, hence the term "hidden."

These are its components:

  1. States: These are the unobservable conditions or regimes of the system. In financial markets, these states might represent different market conditions, such as a bull market, bear market, or periods of high and low volatility. These states evolve based on certain probabilistic rules.

  2. Transition Probabilities: These define the likelihood of moving from one state to another. The system’s state at time t only depends on the state at time t-1, adhering to the Markov property. Transition matrices are used to quantify these probabilities.

  3. Emission Probabilities: These describe the likelihood of observing a particular piece of data (e.g., a stock price or return) given the underlying state. Each state has a probability distribution that dictates the likelihood of observing certain market conditions or price movements when in that state.

  4. Initial Probabilities: These represent the probability of the system starting in a particular state, providing the starting point for the model's analysis.
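The four components above can be sketched concretely with a toy two-state volatility model. The specific numbers below (transition rates, return means and spreads) are illustrative assumptions, not values from the article:

```python
import numpy as np

# Hypothetical two-state regime model: state 0 = low volatility, state 1 = high volatility.
state_names = ["low_vol", "high_vol"]

# 2. Transition probabilities: row i gives P(next state | current state i).
transition = np.array([
    [0.95, 0.05],   # low-vol regimes tend to persist
    [0.10, 0.90],   # high-vol regimes persist too, but decay faster
])

# 3. Emission probabilities: each state emits daily returns from a
#    Gaussian whose spread depends on the hidden state (made-up values).
emission_mean = np.array([0.0005, -0.0002])
emission_std = np.array([0.005, 0.020])

# 4. Initial probabilities: where the chain starts.
initial = np.array([0.8, 0.2])

# Simulate hidden states (1.) and the returns they emit.
rng = np.random.default_rng(42)
state = rng.choice(2, p=initial)
hidden, observed = [], []
for _ in range(250):
    hidden.append(state)
    observed.append(rng.normal(emission_mean[state], emission_std[state]))
    state = rng.choice(2, p=transition[state])

print(f"fraction of days spent in the high-vol state: {np.mean(hidden):.2f}")
```

An observer sees only `observed`; the point of an HMM is to recover something like `hidden` from it.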

Given these components, the model uses probabilistic inference to recover the hidden states from the observed data. The Forward-Backward algorithm computes the posterior probability of each state at every time step, while the Viterbi algorithm finds the single most likely sequence of hidden states given the observations.


Author: Zhuo Kai Chen