Market Predictability - page 8

John Seekers
In this paper, we develop econometric methods for estimating large Bayesian time-varying parameter panel vector autoregressions (TVP-PVARs) and use these methods to forecast inflation for euro area countries. Large TVP-PVARs contain huge numbers of parameters, which can lead to over-parameterization and computational concerns. To overcome these concerns, we use hierarchical priors which reduce the dimension of the parameter vector and allow for dynamic model averaging or selection over TVP-PVARs of different dimension and different priors. We use forgetting factor methods which greatly reduce the computational burden. Our empirical application shows substantial forecast improvements over plausible alternatives.
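
A minimal sketch of the forgetting-factor recursion for a single time-varying-parameter equation, which is the device that keeps the computations light: the predicted state covariance is simply inflated by a factor lam each period instead of simulating a state-innovation covariance. The function name and the default values (lam, kappa, prior_var) are illustrative assumptions, not the paper's settings.

import numpy as np

def forgetting_factor_tvp(y, X, lam=0.99, kappa=0.96, prior_var=10.0):
    # Kalman-style recursion for y_t = X_t' b_t + e_t where the predicted state
    # covariance is P_{t|t-1} = P_{t-1|t-1} / lam (the forgetting factor), so no
    # state-innovation covariance has to be estimated or simulated.
    T, k = X.shape
    b = np.zeros(k)               # filtered coefficient mean
    P = prior_var * np.eye(k)     # filtered coefficient covariance
    h = np.var(y)                 # EWMA estimate of the measurement variance
    preds = np.zeros(T)
    coefs = np.zeros((T, k))
    for t in range(T):
        P_pred = P / lam                          # forgetting replaces the state update
        preds[t] = X[t] @ b                       # one-step-ahead point forecast
        err = y[t] - preds[t]
        h = kappa * h + (1 - kappa) * err ** 2    # exponentially weighted volatility
        S = X[t] @ P_pred @ X[t] + h              # forecast error variance
        K = P_pred @ X[t] / S                     # Kalman gain
        b = b + K * err
        P = P_pred - np.outer(K, X[t]) @ P_pred
        coefs[t] = b
    return preds, coefs
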


John Seekers

In this paper we introduce a nonparametric estimation method for a large Vector Autoregression (VAR) with time-varying parameters. The estimators and their asymptotic distributions are available in closed form. This makes the method computationally efficient and capable of handling information sets as large as those typically handled by factor models and Factor Augmented VARs (FAVAR). When applied to the problem of forecasting key macroeconomic variables, the method outperforms constant parameter benchmarks and large Bayesian VARs with time-varying parameters. The tool can also be used for structural analysis. As an example, we study the time-varying effects of oil price innovations on sectoral U.S. industrial output. We find that the changing interaction between unexpected oil price increases and business cycle fluctuations is shaped by the durable materials sector, rather than by the automotive sector, on which a large part of the literature has typically focused.
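
A rough illustration of how such a closed-form, time-varying estimator can look, using kernel-weighted least squares with Gaussian weights centred on each period. This is a stand-in under simple assumptions (rule-of-thumb bandwidth, constant plus lags as regressors), not the authors' exact estimator.

import numpy as np

def kernel_tvp_var(Y, p=1, H=None):
    # Time-varying VAR(p) coefficients by kernel-weighted least squares:
    # B_t solves (sum_s w_st x_s x_s') B_t = sum_s w_st x_s y_s',
    # with Gaussian weights w_st = exp(-0.5 * ((s - t) / H)^2).
    T, n = Y.shape
    H = H if H is not None else np.sqrt(T)        # rule-of-thumb bandwidth (assumption)
    X = np.hstack([np.ones((T - p, 1))] +
                  [Y[p - l - 1:T - l - 1] for l in range(p)])   # constant and lags
    Yp = Y[p:]
    Teff = T - p
    times = np.arange(Teff)
    coefs = np.zeros((Teff, X.shape[1], n))
    for t in range(Teff):
        w = np.exp(-0.5 * ((times - t) / H) ** 2)
        Xw = X * w[:, None]
        coefs[t] = np.linalg.solve(Xw.T @ X, Xw.T @ Yp)         # closed form at each t
    return coefs
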

John Seekers
We derive new tests for proper calibration of multivariate density forecasts based on Rosenblatt probability integral transforms. These tests have the advantage that they i) do not depend on the ordering of variables in the forecasting model, ii) are applicable to densities of arbitrary dimensions, and iii) have superior power relative to existing approaches. We furthermore develop adjusted tests that allow for estimated parameters and, consequently, can be used as in-sample specification tests. We demonstrate the problems of existing tests and how our new approaches can overcome them, using two applications based on multivariate GARCH-based models for stock market returns and on a macroeconomic Bayesian vector autoregressive model.
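
A small sketch of the building blocks named in the abstract, under the assumption of a multivariate normal forecast: the Rosenblatt transform via Gaussian conditionals, followed by a simple uniformity check on the pooled PITs. The Kolmogorov-Smirnov step is a stand-in, not the paper's test statistics.

import numpy as np
from scipy import stats

def rosenblatt_gaussian(y, mean, cov):
    # Rosenblatt PIT of observation y under a multivariate normal forecast N(mean, cov):
    # u_1 = F(y_1), u_2 = F(y_2 | y_1), ... using the Gaussian conditional formulas.
    d = len(y)
    u = np.zeros(d)
    for j in range(d):
        if j == 0:
            m, v = mean[0], cov[0, 0]
        else:
            S11 = cov[:j, :j]
            s12 = cov[:j, j]
            m = mean[j] + s12 @ np.linalg.solve(S11, y[:j] - mean[:j])
            v = cov[j, j] - s12 @ np.linalg.solve(S11, s12)
        u[j] = stats.norm.cdf(y[j], loc=m, scale=np.sqrt(v))
    return u

def pit_uniformity_check(U):
    # Map the pooled PITs to normal scores and compare them with N(0,1) by a
    # Kolmogorov-Smirnov test; a simple stand-in, not the paper's test statistics.
    z = stats.norm.ppf(np.clip(np.asarray(U), 1e-10, 1 - 1e-10)).ravel()
    return stats.kstest(z, 'norm')
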
John Seekers
In this paper we study what professional forecasters predict. We use spectral analysis and state space modeling to decompose economic time series into a trend, business-cycle, and irregular component. To examine which components are captured by professional forecasters, we regress their forecasts on the estimated components extracted from both the spectral analysis and the state space model. For both decomposition methods we find that the Survey of Professional Forecasters can predict almost all variation in the time series due to the trend and business-cycle, but the forecasts contain little or no significant information about the variation in the irregular component.
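
A sketch of the state-space side of the exercise, assuming the statsmodels unobserved-components model as the decomposition tool (the paper's spectral decomposition is not reproduced): extract trend, cycle, and irregular components and regress the survey forecasts on them.

import numpy as np
import statsmodels.api as sm

def decompose_and_project(series, forecasts):
    # Trend / cycle / irregular decomposition with an unobserved-components state-space
    # model, then an OLS regression of the survey forecasts on the extracted components.
    uc = sm.tsa.UnobservedComponents(series, level='local linear trend',
                                     cycle=True, stochastic_cycle=True, damped_cycle=True)
    res = uc.fit(disp=False)
    trend = res.level.smoothed
    cycle = res.cycle.smoothed
    irregular = np.asarray(series) - trend - cycle
    X = sm.add_constant(np.column_stack([trend, cycle, irregular]))
    ols = sm.OLS(np.asarray(forecasts), X).fit()
    return ols.params, ols.rsquared
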
John Seekers
In public discussions of the quality of forecasts, attention typically focuses on the predictive performance in cases of extreme events. However, the restriction of conventional forecast evaluation methods to subsets of extreme observations has unexpected and undesired effects, and is bound to discredit skillful forecasts when the signal-to-noise ratio in the data generating process is low. Conditioning on outcomes is incompatible with the theoretical assumptions of established forecast evaluation methods, thereby confronting forecasters with what we refer to as the forecaster's dilemma. For probabilistic forecasts, proper weighted scoring rules have been proposed as decision-theoretically justifiable alternatives for forecast evaluation with an emphasis on extreme events. Using theoretical arguments, simulation experiments, and a real-data study on probabilistic forecasts of U.S. inflation and gross domestic product (GDP) growth, we illustrate and discuss the forecaster's dilemma along with potential remedies.
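
A minimal numerical sketch of a threshold-weighted CRPS, the kind of proper weighted scoring rule referred to above, computed from forecast draws on a grid. The Gaussian forecast, the threshold of 1.5, and the grid are arbitrary illustrative choices.

import numpy as np

def tw_crps(samples, y, weight, grid):
    # Threshold-weighted CRPS: integral over z of w(z) * (F(z) - 1{y <= z})^2,
    # with F the empirical CDF of the forecast draws. Emphasis falls on the region
    # where w(z) > 0, e.g. the right tail for "extreme" outcomes.
    F = np.array([np.mean(samples <= z) for z in grid])
    ind = (grid >= y).astype(float)
    return np.trapz(weight(grid) * (F - ind) ** 2, grid)

# Illustrative use: weight only outcomes above a threshold of 1.5 (an arbitrary choice).
rng = np.random.default_rng(0)
draws = rng.normal(0.0, 1.0, size=5000)
grid = np.linspace(-6.0, 6.0, 2001)
score = tw_crps(draws, y=2.5, weight=lambda z: (z >= 1.5).astype(float), grid=grid)
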

ATTENTION: Video should be reuploaded

John Seekers
In recent years, survey-based measures of expectations and disagreement have received increasing attention in economic research. Many forecast surveys ask their participants for fixed-event forecasts. Since fixed-event forecasts have seasonal properties, researchers often use an ad-hoc approach in order to approximate fixed-horizon forecasts using fixed-event forecasts. In this work, we derive an optimal approximation by minimizing the mean-squared approximation error. Like the approximation based on the ad-hoc approach, our approximation is constructed as a weighted sum of the fixed-event forecasts, with easily computable weights. The optimal weights tend to differ substantially from those of the ad-hoc approach. In an empirical application, it turns out that the gains from using optimal instead of ad-hoc weights are very pronounced. While our work focuses on the approximation of fixed-horizon forecasts by fixed-event forecasts, the proposed approximation method is very flexible. The forecast to be approximated, as well as the information employed by the approximation, can be any linear function of the underlying high-frequency variable. In contrast to the ad-hoc approach, the proposed approximation method can make use of more than two such information-containing functions.
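
A small sketch contrasting the ad-hoc weights with a data-driven, mean-squared-error-minimizing alternative, for the common case of approximating a 12-month-ahead forecast from current-year and next-year fixed-event forecasts. The least-squares version is a simplified stand-in for the paper's closed-form optimal weights.

import numpy as np

def adhoc_weights(month):
    # Standard ad-hoc approximation of a 12-month-ahead (fixed-horizon) forecast from
    # current-year and next-year fixed-event forecasts: weight each by the number of
    # months it shares with the forecast horizon (month = 1, ..., 12).
    k = 12 - month + 1
    return k / 12.0, 1.0 - k / 12.0

def ls_weights(fe_current, fe_next, target):
    # Data-driven alternative: pick the combination weights that minimize the in-sample
    # mean-squared approximation error against the realized fixed-horizon target.
    # A simplified stand-in for the paper's closed-form optimal weights.
    X = np.column_stack([fe_current, fe_next])
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return w
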

ATTENTION: Video should be reuploaded

John Seekers
The common way to measure the performance of a volatility prediction model is to assess its ability to predict future volatility. However, as volatility is unobservable, there is no natural metric for measuring the accuracy of any particular model. Noh et al. (1994) assessed the performance of a volatility prediction model by devising trading rules to trade options on a daily basis, using forecasts of option prices obtained from the Black & Scholes (BS) option pricing formula. (An option is a security that gives its owner the right, but not the obligation, to buy or sell an asset at a fixed price within a specified period of time, subject to certain conditions.) The trading rule amounts to buying (selling) an option when its price forecast for tomorrow is higher (lower) than today's market settlement price.

In this paper, adopting Noh et al.'s (1994) idea, we assess the performance of a number of Autoregressive Conditional Heteroscedasticity (ARCH) models. For each trading day, the ARCH model selected on the basis of the prediction error criterion (PEC), introduced by Xekalaki et al. (2003) and suggested by Degiannakis and Xekalaki (1999) in the context of ARCH models, is used to forecast volatility. According to this criterion, the ARCH model with the lowest sum of squared standardized one-step-ahead prediction errors is selected for forecasting future volatility. A comparative study is made in order to examine which ARCH volatility estimation method yields the highest profits, and whether there is any gain in using the PEC model selection algorithm for speculating with financial derivatives. Among the set of model selection algorithms considered, the PEC algorithm appears to achieve the highest rate of return, even if only marginally.
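
A compact sketch of the PEC-style selection step, assuming the third-party Python arch package for ARCH/GARCH fitting and using in-sample standardized residuals instead of the full recursive day-by-day scheme described in the abstract.

import numpy as np
from arch import arch_model   # third-party volatility-modelling package (assumed available)

def pec_select(returns, specs=((1, 0), (1, 1), (2, 1))):
    # PEC-style selection: fit several ARCH/GARCH(p, q) specifications and keep the one
    # with the smallest sum of squared standardized one-step-ahead prediction errors.
    best, best_score = None, np.inf
    for p, q in specs:
        res = arch_model(returns, vol='GARCH', p=p, q=q, mean='Constant').fit(disp='off')
        z = res.resid / res.conditional_volatility   # standardized prediction errors
        score = np.nansum(z ** 2)
        if score < best_score:
            best, best_score = (p, q), score
    return best, best_score
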
John Seekers
We propose a Bayesian estimation method for Vector Autoregressions (VARs) featuring asymmetric priors and time-varying volatilities that allows for a possibly very large cross-sectional dimension of the system, N. The method is based on a simple triangularisation which allows one to simulate the conditional mean coefficients of the VAR by drawing them equation by equation. This strategy reduces the computational complexity by a factor of N² with respect to the existing algorithms routinely used in the literature and by practitioners. Importantly, our new estimation algorithm can be easily obtained by modifying just one of the steps of the existing algorithms. We illustrate the benefits of our proposed estimation method with numerical examples and empirical applications in the context of forecasting and structural analysis.
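
A sketch of the key computational step implied by the triangularisation: the conditional-mean coefficients of one VAR equation are drawn from a standard Gaussian posterior once the contribution of the previous equations' residuals has been subtracted and the data rescaled by the time-varying volatilities. The diagonal prior is an assumed simplification, not the paper's prior.

import numpy as np

def draw_equation_coeffs(y_i, X, offset, sigma2_t, prior_var=10.0, rng=None):
    # One step of an equation-by-equation sampler: draw the conditional-mean coefficients
    # of VAR equation i from a Gaussian posterior, after subtracting the triangularisation
    # offset built from the residuals of equations 1..i-1 and rescaling by the
    # time-varying volatilities sigma2_t.
    rng = rng or np.random.default_rng()
    scale = np.sqrt(sigma2_t)
    ytil = (y_i - offset) / scale                 # GLS-style rescaling of the data
    Xtil = X / scale[:, None]
    k = X.shape[1]
    V_post = np.linalg.inv(Xtil.T @ Xtil + np.eye(k) / prior_var)
    m_post = V_post @ (Xtil.T @ ytil)
    return rng.multivariate_normal(m_post, V_post)
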
John Seekers
We analyze forecasts of consumption, nonresidential investment, residential investment, government spending, exports, imports, inventories, gross domestic product, inflation, and unemployment prepared by the staff of the Board of Governors of the Federal Reserve System for meetings of the Federal Open Market Committee from 1997 to 2008, called the Greenbooks. We compare the root mean squared error, mean absolute error, and the proportion of directional errors of Greenbook forecasts of these macroeconomic indicators to the errors from three forecasting benchmarks: a random walk, a first-order autoregressive model, and a Bayesian model averaged forecast from a suite of univariate time-series models commonly taught to first-year economics graduate students. We estimate our forecasting benchmarks both on end-of-sample vintage and real-time vintage data. We find that Greenbook forecasts significantly outperform our benchmark forecasts for horizons less than one quarter ahead. However, by the one-year forecast horizon, typically at least one of our forecasting benchmarks performs as well as Greenbook forecasts. Greenbook forecasts of personal consumption expenditures and unemployment tend to do relatively well, while Greenbook forecasts of inventory investment, government expenditures, and inflation tend to do poorly.
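
A small helper showing the three evaluation metrics used in the comparison (RMSE, MAE, and the proportion of directional errors); an illustrative computation, not the paper's real-time vintage setup.

import numpy as np

def forecast_metrics(forecast, actual, previous):
    # RMSE, MAE, and the proportion of directional errors (predicted and realized
    # changes relative to the previous observation have opposite signs) for one
    # variable and horizon.
    f, a, prev = map(np.asarray, (forecast, actual, previous))
    e = f - a
    rmse = np.sqrt(np.mean(e ** 2))
    mae = np.mean(np.abs(e))
    directional_errors = np.mean(np.sign(f - prev) != np.sign(a - prev))
    return rmse, mae, directional_errors
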
John Seekers
Macroeconomists are increasingly working with large Vector Autoregressions (VARs) where the number of parameters vastly exceeds the number of observations. Existing approaches either involve prior shrinkage or the use of factor methods. In this paper, we develop an alternative based on ideas from the compressed regression literature. It involves randomly compressing the explanatory variables prior to analysis. A huge dimensional problem is thus turned into a much smaller, more computationally tractable one. Bayesian model averaging can be done over various compressions, attaching greater weight to compressions which forecast well. In a macroeconomic application involving up to 129 variables, we find compressed VAR methods to forecast better than either factor methods or large VAR methods involving prior shrinkage.
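
A sketch of the two ingredients described above: a sparse random compression of the predictors and a weighting of the compressed fits by predictive fit (here a crude BIC-based proxy for the Bayesian model averaging step). The sparsity scheme and the BIC weights are illustrative assumptions, not the paper's exact specification.

import numpy as np

def random_projection(k, m, psi=0.5, rng=None):
    # Sparse random compression matrix Phi (m x k): entries +-sqrt(1/psi) with small
    # probability and 0 otherwise, in the spirit of the compressed-regression literature.
    rng = rng or np.random.default_rng()
    u = rng.random((m, k))
    phi = np.zeros((m, k))
    phi[u < psi ** 2 / 2] = np.sqrt(1.0 / psi)
    phi[(u >= psi ** 2 / 2) & (u < psi ** 2)] = -np.sqrt(1.0 / psi)
    return phi

def compressed_fits(X, y, dims=(2, 4, 8), draws=20, rng=None):
    # Fit least-squares regressions on randomly compressed predictors X @ Phi' and weight
    # each compression by exp(-BIC/2), a crude proxy for the model-averaging step.
    rng = rng or np.random.default_rng()
    T, k = X.shape
    fits, logw = [], []
    for m in dims:
        for _ in range(draws):
            phi = random_projection(k, m, rng=rng)
            Z = X @ phi.T
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            bic = T * np.log(np.mean(resid ** 2)) + m * np.log(T)
            fits.append((phi, beta))
            logw.append(-0.5 * bic)
    lw = np.array(logw)
    w = np.exp(lw - lw.max())         # normalize on the log scale to avoid overflow
    return fits, w / w.sum()
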