Similar documents
20 similar documents found (search time: 46 ms)
1.
Recent advances in the measurement of beta (systematic return risk) and volatility (total return risk) demonstrate substantial advantages in utilizing high-frequency return data in a variety of settings. These advances in the measurement of beta and volatility have resulted in improvements in the evaluation of alternative beta and volatility forecasting approaches. In addition, more precise measurement has also led to direct modeling of the time variation of beta and volatility. In both the realized beta and realized volatility literatures, the most common model is an autoregressive process. In this paper we evaluate constant beta models against autoregressive models of time-varying realized beta. We find that a constant beta model computed from daily returns over the last 12 months generates the most accurate quarterly forecast of beta and dominates the autoregressive time series forecasts. It also dramatically dominates the popular Fama–MacBeth constant beta model, which uses 5 years of monthly returns. Copyright © 2011 John Wiley & Sons, Ltd.
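For context, the winning 12-month constant beta is simply the OLS slope of daily asset returns on daily market returns, i.e. the sample covariance divided by the market variance. A minimal sketch with simulated data (the function name and all numbers are illustrative, not from the paper):

```python
import numpy as np

def realized_beta(asset_ret, market_ret):
    """Constant realized beta: sample cov(asset, market) / var(market),
    computed here from roughly one year (~252 days) of daily returns."""
    asset_ret = np.asarray(asset_ret, dtype=float)
    market_ret = np.asarray(market_ret, dtype=float)
    cov = np.cov(asset_ret, market_ret, ddof=1)
    return cov[0, 1] / cov[1, 1]

# Hypothetical data: asset = 1.5 * market + noise, so beta should be near 1.5
rng = np.random.default_rng(0)
mkt = rng.normal(0.0, 0.01, 252)
asset = 1.5 * mkt + rng.normal(0.0, 0.005, 252)
beta = realized_beta(asset, mkt)
```

With a year of daily data the sampling error of this slope is already small, which is part of why the simple constant-beta forecast is hard to beat.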

2.
A variety of recent studies provide a skeptical view on the predictability of stock returns. Empirical evidence shows that most prediction models suffer from a loss of information, model uncertainty, and structural instability by relying on low-dimensional information sets. In this study, we evaluate the predictive ability of various recently refined forecasting strategies, which handle these issues by incorporating information from many potential predictor variables simultaneously. We investigate whether forecasting strategies that (i) combine information and (ii) combine individual forecasts are useful for predicting US stock returns, namely the market excess return and the size, value, and momentum premia. Our results show that methods combining information have remarkable in-sample predictive ability. However, their out-of-sample performance suffers from highly volatile forecast errors. Forecast combinations face a better bias–efficiency trade-off, yielding a consistently superior forecast performance for the market excess return and the size premium even after the 1970s.
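The intuition behind strategy (ii) can be illustrated with a toy example: averaging several unbiased but noisy forecasts diversifies away their idiosyncratic errors, which is exactly the bias–efficiency trade-off the abstract refers to. All data below are simulated and illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
target = rng.normal(0.0, 1.0, T)       # series to be forecast

# Three hypothetical individual forecasts: unbiased but each quite noisy
forecasts = np.stack([target + rng.normal(0.0, 1.0, T) for _ in range(3)])

combined = forecasts.mean(axis=0)      # equal-weight forecast combination

mse_single = np.mean((forecasts[0] - target) ** 2)   # ~1.0
mse_combo = np.mean((combined - target) ** 2)        # ~1/3 with independent errors
```

With k forecasts carrying independent errors of equal variance, the combination's error variance shrinks by a factor of k, which is why combinations are hard to beat out of sample.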

3.
In this paper, we forecast stock returns using time-varying parameter (TVP) models with parameters driven by economic conditions. An in-sample specification test shows significant variation in the parameters. Out-of-sample results suggest that the TVP models outperform their constant coefficient counterparts. We also find significant return predictability from both statistical and economic perspectives with the application of TVP models. The out-of-sample R2 of an equal-weighted combination of TVP models is as high as 2.672%, and the gains in the certainty equivalent return are 214.7 basis points. Further analysis indicates that the improvement in predictability comes from the use of information on economic conditions rather than simply from allowing the coefficients to vary with time.
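The out-of-sample R2 quoted here is conventionally computed as one minus the ratio of the model's forecast MSE to that of a benchmark, typically the historical-mean forecast; positive values indicate predictability. A minimal sketch (the numerical inputs are illustrative):

```python
import numpy as np

def oos_r2(actual, model_fcst, benchmark_fcst):
    """Out-of-sample R^2: 1 - MSE(model) / MSE(benchmark).
    Positive values mean the model beats the benchmark forecast."""
    actual = np.asarray(actual, float)
    mse_model = np.mean((actual - np.asarray(model_fcst, float)) ** 2)
    mse_bench = np.mean((actual - np.asarray(benchmark_fcst, float)) ** 2)
    return 1.0 - mse_model / mse_bench

# Illustrative numbers: model errors of 0.1 vs benchmark errors of 0.2
actual = np.array([1.0, 2.0, 3.0, 4.0])
r2 = oos_r2(actual, actual + 0.1, actual + 0.2)   # -> 0.75
```

Even a seemingly small value like the 2.672% reported above is economically meaningful for monthly stock returns, since the benchmark mean forecast is so hard to improve upon.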

4.
In this paper we propose Granger (non-)causality tests based on a VAR model allowing for time-varying coefficients. The functional form of the time-varying coefficients is a logistic smooth transition autoregressive (LSTAR) model using time as the transition variable. The model allows for testing Granger non-causality when the VAR is subject to a smooth break in the coefficients of the Granger causal variables. The proposed test is then applied to the money–output relationship using quarterly US data for the period 1952:2–2002:4. We find that causality from money to output becomes stronger after 1978:4, and the model is shown to have good out-of-sample forecasting performance for output relative to a linear VAR model. Copyright © 2008 John Wiley & Sons, Ltd.
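The LSTAR ingredient is a logistic transition function of (scaled) time, which lets a Granger-causal coefficient move smoothly from one regime to another rather than jump at a break date. A sketch of the mechanism (all parameter values are mine for illustration):

```python
import numpy as np

def lstar_weight(t, gamma, c, T):
    """Logistic smooth-transition weight with scaled time t/T as the
    transition variable: G -> 0 before the break location c, G -> 1
    after it, with gamma controlling how abrupt the transition is."""
    return 1.0 / (1.0 + np.exp(-gamma * (t / T - c)))

T = 200                                        # hypothetical sample length
t = np.arange(1, T + 1)
G = lstar_weight(t, gamma=50.0, c=0.5, T=T)    # smooth break mid-sample

# Time-varying Granger-causal coefficient: phi(t) = phi0 + phi1 * G(t)
phi0, phi1 = 0.1, 0.4
phi_t = phi0 + phi1 * G
```

As gamma grows the transition approaches a discrete structural break, so the linear VAR and the abrupt-break model are both nested limiting cases.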

5.
The period of extraordinary volatility in euro area headline inflation starting in 2007 raised the question whether forecast combination methods can be used to hedge against bad forecast performance of single models during such periods and provide more robust forecasts. We investigate this issue for forecasts from a range of short-term forecasting models. Our analysis shows that there is considerable variation in the relative performance of the different models over time. To take that into account we suggest employing performance-based forecast combination methods, in particular one that puts more weight on recent forecast performance. We compare such an approach with equal-weight forecast combination, which has been found to outperform more sophisticated combination methods in the past, and investigate whether it can improve forecast accuracy over the single best model. The time-varying weights also indicate how much each model, and hence each economic interpretation of the forecast, contributes over time. We also include a number of benchmark models in our analysis. The combination methods are evaluated for HICP headline inflation and HICP excluding food and energy. We investigate how forecast accuracy of the combination methods differs between pre-crisis times, the period after the global financial crisis, and the full evaluation period, including the global financial crisis with its extraordinary volatility in inflation. Overall, we find that forecast combination helps hedge against bad forecast performance and that performance-based weighting outperforms simple averaging. Copyright © 2017 John Wiley & Sons, Ltd.
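Performance-based weighting of the kind favored here is often implemented with inverse discounted-MSE weights, where a discount factor below one emphasizes recent forecast errors. A sketch (the function name, discount factor, and error matrix are illustrative, not the paper's exact specification):

```python
import numpy as np

def discounted_mse_weights(errors, delta=0.9):
    """Inverse discounted-MSE combination weights.  errors is a (T, k)
    matrix of past forecast errors of k models; delta < 1 puts more
    weight on recent performance, delta = 1 gives plain inverse-MSE."""
    T, k = errors.shape
    discounts = delta ** np.arange(T - 1, -1, -1)   # most recent period weighted 1
    dmse = (discounts[:, None] * errors ** 2).sum(axis=0)
    w = 1.0 / dmse
    return w / w.sum()

# Hypothetical errors: model 0 was bad early on but is accurate recently
errors = np.array([[2.0, 0.5],
                   [2.0, 0.5],
                   [0.1, 0.5],
                   [0.1, 0.5]])
w = discounted_mse_weights(errors, delta=0.1)   # heavy discounting favors model 0
```

With delta = 1 the early large errors of model 0 would dominate and model 1 would get the larger weight; strong discounting reverses this, which is exactly the robustness-to-regime-change property the paper exploits.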

6.
This paper models bond term premia empirically in terms of the maturity composition of the federal debt and other observable economic variables in a time-varying framework with potential regime shifts. We present regression and out-of-sample forecasting results demonstrating that information on the age composition of the federal debt is useful for forecasting term premia. We show that the multiprocess mixture model, a multi-state time-varying parameter model, outperforms the commonly used GARCH model in out-of-sample forecasts of term premia. The results underscore the importance of modelling term premia as a function of economic variables, rather than just as a function of asset covariances as in conditional heteroscedasticity models. Copyright © 2001 John Wiley & Sons, Ltd.

7.
A long-standing puzzle to financial economists is the difficulty of outperforming the benchmark random walk model in out-of-sample contests. Using data from the USA over the period of 1872–2007, this paper re-examines the out-of-sample predictability of real stock prices based on price–dividend (PD) ratios. The current research focuses on the significance of the time-varying mean and nonlinear dynamics of PD ratios in the empirical analysis. Empirical results support the proposed nonlinear model of the PD ratio and the stationarity of the trend-adjusted PD ratio. Furthermore, this paper rejects the non-predictability hypothesis of stock prices statistically based on in- and out-of-sample tests and economically based on the criteria of expected real return per unit of risk. Copyright © 2011 John Wiley & Sons, Ltd.

8.
Value-at-risk (VaR) is a standard measure of market risk in financial markets. This paper proposes a novel, adaptive and efficient method to forecast both volatility and VaR. Extending existing exponential smoothing as well as GARCH formulations, the method is motivated from an asymmetric Laplace distribution, where skewness and heavy tails in return distributions, and their potentially time-varying nature, are taken into account. The proposed volatility equation also involves novel time-varying dynamics. Back-testing results illustrate that the proposed method offers a viable and more accurate, though conservative, improvement in forecasting VaR compared to a range of popular alternatives. Copyright © 2013 John Wiley & Sons, Ltd.
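As a point of reference, the classical exponential-smoothing recursion that this paper extends can be sketched as follows. Note this RiskMetrics-style version with a normal quantile is a simplification for illustration only, not the paper's asymmetric-Laplace method, and all parameter values are assumptions:

```python
import math

def ewma_var_forecast(returns, lam=0.94, z=-2.326):
    """One-step-ahead 1% VaR from exponentially smoothed variance
    (RiskMetrics-style).  z is the 1% quantile of a standard normal;
    the paper instead uses a skewed, heavy-tailed asymmetric Laplace
    distribution with time-varying dynamics."""
    s2 = returns[0] ** 2                      # initialize the variance estimate
    for r in returns[1:]:
        s2 = lam * s2 + (1 - lam) * r ** 2    # exponential smoothing update
    return z * math.sqrt(s2)                  # VaR as a (negative) return quantile

var_1pct = ewma_var_forecast([0.01] * 50)     # constant 1% daily moves -> ~ -2.33%
```

Replacing the normal quantile with one from a heavier-tailed distribution makes the VaR forecast more conservative, consistent with the back-testing results described above.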

9.
We study intraday return volatility dynamics using a time-varying components approach, applied to IBM intraday returns. Empirical evidence indicates that three additive components—a time-varying mean of absolute returns and two cosine components with time-varying amplitudes—together capture very well the pronounced periodicity and persistence exhibited in the empirical autocorrelation pattern of IBM returns. We find that the long-run volatility persistence is driven predominantly by daily level shifts in mean absolute returns. After adjusting for these intradaily components, the filtered returns behave much like Gaussian noise, suggesting that the three-component structure is adequately specified. Furthermore, a new volatility measure (TCV) can be constructed from these components. Results from extensive out-of-sample rolling forecast experiments suggest that TCV fares well in predicting future volatility against alternative methods, including a GARCH model, realized volatility and realized absolute value. Copyright © 2009 John Wiley & Sons, Ltd.
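The deterministic part of the three-component structure is easy to write down explicitly. A sketch for one trading day follows; all parameter values are mine for illustration, and in the paper the level and the amplitudes are themselves time varying:

```python
import numpy as np

# Additive components for mean absolute returns over one trading day
# with 78 five-minute bars (a 6.5-hour session):
n = 78
tau = np.arange(n) / n                        # position within the day in [0, 1)

mu = 0.002                                    # daily level of mean absolute return
a1, a2 = 0.001, 0.0004                        # cosine amplitudes (illustrative)
component = (mu
             + a1 * np.cos(2 * np.pi * tau)   # U-shape: high open/close, low midday
             + a2 * np.cos(4 * np.pi * tau))  # secondary harmonic
```

The first cosine generates the familiar intraday U-shape in volatility; the daily level mu shifting from day to day is what the abstract identifies as the source of long-run persistence.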

10.
A large number of models have been developed in the literature to analyze and forecast changes in output dynamics. The objective of this paper is to compare the predictive ability of univariate and bivariate models in forecasting US gross national product (GNP) growth at different forecasting horizons, with the bivariate models containing information on a measure of economic uncertainty. Based on point and density forecast accuracy measures, as well as on equal predictive ability (EPA) and superior predictive ability (SPA) tests, we evaluate the relative forecasting performance of different model specifications over the quarterly period 1919:Q2–2014:Q4. We find that the economic policy uncertainty (EPU) index improves the accuracy of US GNP growth forecasts in bivariate models. We also find that the EPU exhibits similar forecasting ability to the term spread and outperforms other uncertainty measures, such as the volatility index and geopolitical risk, in predicting US recessions. While the Markov switching time-varying parameter vector autoregressive model yields the lowest values for the root mean squared error in most cases, we observe relatively low values for the log predictive density score when using the Bayesian vector autoregression model with stochastic volatility. More importantly, our results highlight the importance of uncertainty in forecasting US GNP growth rates.

11.
We evaluate forecasting models of US business fixed investment spending growth over the recent 1995:1–2004:2 out-of-sample period. The forecasting models are based on the conventional Accelerator, Neoclassical, Average Q, and Cash-Flow models of investment spending, as well as real stock prices and excess stock return predictors. The real stock price model typically generates the most accurate forecasts, and forecast-encompassing tests indicate that this model contains most of the information useful for forecasting investment spending growth relative to the other models at longer horizons. In a robustness check, we also evaluate the forecasting performance of the models over two alternative out-of-sample periods: 1975:1–1984:4 and 1985:1–1994:4. A number of different models produce the most accurate forecasts over these alternative out-of-sample periods, indicating that while the real stock price model appears particularly useful for forecasting the recent behavior of investment spending growth, it may not continue to perform well in future periods. Copyright © 2007 John Wiley & Sons, Ltd.

12.
We propose a method for improving the predictive ability of standard forecasting models used in financial economics. Our approach is based on the functional partial least squares (FPLS) model, which is capable of avoiding multicollinearity in regression by efficiently extracting information from high-dimensional market data. This ability also lets us incorporate auxiliary variables that improve predictive accuracy. We provide an empirical application of our proposed methodology in terms of its ability to predict the conditional average log return and the volatility of crude oil prices via exponential smoothing, Bayesian stochastic volatility, and GARCH (generalized autoregressive conditional heteroskedasticity) models, respectively. In particular, what we call functional data analysis (FDA) traces in this article are obtained via the FPLS regression from both the crude oil returns and auxiliary variables of the exchange rates of major currencies. For forecast performance evaluation, we compare the out-of-sample forecasting accuracy of the standard models with FDA traces to the accuracy of the same forecasting models with the observed crude oil returns, principal component regression (PCR), and least absolute shrinkage and selection operator (LASSO) models. We find evidence that the standard models with FDA traces significantly outperform our competing models. Finally, the models are also compared using the test for superior predictive ability and the reality check for data snooping. Our empirical results show that our new methodology significantly improves the predictive ability of standard models in forecasting the latent average log return and the volatility of financial time series.

13.
In this paper, I investigate the effects of cross-border capital flows induced by the rate of risk-adjusted excess returns (Sharpe ratio) on the transitional dynamics of the nominal exchange rate's deviation from its fundamental value. For this purpose, a two-state time-varying transition probability Markov regime-switching process is added to the sticky price exchange rate model with shares. I estimated this model using quarterly data on the four most active floating rate currencies for the years 1973–2009: the Australian dollar, Canadian dollar, Japanese yen and the British pound. The results provide evidence that the Sharpe ratios of debt and equity investments influence the evolution of transitional dynamics of the currencies' deviation from their fundamental values. In addition, I found that the relationship between economic fundamentals and the nominal exchange rates varies depending on the overvaluation or undervaluation of the currencies. Copyright © 2011 John Wiley & Sons, Ltd.

14.
Micro-founded dynamic stochastic general equilibrium (DSGE) models appear to be particularly suited to evaluating the consequences of alternative macroeconomic policies. Recently, increasing efforts have been undertaken by policymakers to use these models for forecasting, although this has proved to be problematic due to estimation and identification issues. Hybrid DSGE models have become popular for dealing with some of the model misspecifications and the trade-off between theoretical coherence and empirical fit, thus allowing them to compete in terms of predictability with VAR models. However, DSGE and VAR models are still linear and they do not consider time variation in parameters that could account for inherent nonlinearities and capture the adaptive underlying structure of the economy in a robust manner. This study conducts a comparative evaluation of the out-of-sample predictive performance of many different specifications of DSGE models and various classes of VAR models, using datasets for the real GDP, the harmonized CPI and the nominal short-term interest rate series in the euro area. Simple and hybrid DSGE models were implemented, including DSGE-VAR and factor-augmented DSGE, and tested against standard, Bayesian and factor-augmented VARs. Moreover, a new state-space time-varying VAR model is presented. The total period spanned from 1970:Q1 to 2010:Q4, with an out-of-sample testing period of 2006:Q1–2010:Q4, which covers the global financial crisis and the EU debt crisis. The results of this study can be useful in conducting monetary policy analysis and macro-forecasting in the euro area. Copyright © 2016 John Wiley & Sons, Ltd.

15.
In this paper, we forecast EU area inflation with many predictors using time-varying parameter models. The facts that time-varying parameter models are parameter-rich and the time span of our data is relatively short motivate a desire for shrinkage. In constant coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows the coefficient on each predictor to be: (i) time varying; (ii) constant over time; or (iii) shrunk to zero. The econometric methodology decides automatically to which category each coefficient belongs. Our empirical results indicate the benefits of such an approach. Copyright © 2013 John Wiley & Sons, Ltd.

16.
The difficulty of modelling inflation and the importance of discovering its underlying data-generating process are reflected in an extensive literature on inflation forecasting. In this paper we evaluate nonlinear machine learning and econometric methodologies in forecasting US inflation based on autoregressive and structural models of the term structure. We employ two nonlinear methodologies: the econometric least absolute shrinkage and selection operator (LASSO) and the machine-learning support vector regression (SVR) method. SVR has not previously been used in inflation forecasting with the term spread as a regressor. In doing so, we use a long monthly dataset spanning the period 1871:1–2015:3 that covers the entire history of inflation in the US economy. For comparison purposes we also use ordinary least squares regression models as a benchmark. In order to evaluate the contribution of the term spread to inflation forecasting in different time periods, we measure the out-of-sample forecasting performance of all models using rolling window regressions. Considering various forecasting horizons, the empirical evidence suggests that the structural models do not outperform the autoregressive ones, regardless of the estimation method. Thus we conclude that the term spread models are not more accurate than autoregressive models in inflation forecasting. Copyright © 2016 John Wiley & Sons, Ltd.
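The rolling-window design used for the autoregressive benchmark can be sketched as follows; the window length and simulated data are illustrative (the paper's structural models additionally include the term spread as a regressor):

```python
import numpy as np

def rolling_ar1_forecasts(y, window):
    """One-step-ahead out-of-sample forecasts from an AR(1) re-estimated
    by OLS on a rolling window of fixed length."""
    fcst = []
    for t in range(window, len(y)):
        x = y[t - window:t - 1]              # lagged values within the window
        z = y[t - window + 1:t]              # corresponding next-period values
        slope, intercept = np.polyfit(x, z, 1)
        fcst.append(intercept + slope * y[t - 1])
    return np.array(fcst)

# Persistent AR(1) data, so the AR forecast should beat the sample mean
rng = np.random.default_rng(2)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.9 * y[t - 1] + rng.normal()

f = rolling_ar1_forecasts(y, window=100)
mse_ar = np.mean((y[100:] - f) ** 2)
mse_mean = np.mean((y[100:] - y.mean()) ** 2)
```

Re-estimating on a rolling window, rather than an expanding one, lets the coefficient estimates adapt when the inflation process shifts across the long 1871–2015 sample.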

17.
This paper applies the GARCH-MIDAS (mixed data sampling) model to examine whether information contained in macroeconomic variables can help to predict short-term and long-term components of the return variance. A principal component analysis is used to incorporate the information contained in different variables. Our results show that including low-frequency macroeconomic information in the GARCH-MIDAS model improves the prediction ability of the model, particularly for the long-term variance component. Moreover, the GARCH-MIDAS model augmented with the first principal component outperforms all other specifications, indicating that the constructed principal component can be considered as a good proxy of the business cycle. Copyright © 2013 John Wiley & Sons, Ltd.
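The principal component step can be sketched with plain linear algebra; the simulated macro panel below is illustrative (in the paper the first component of observed macro series proxies the business cycle):

```python
import numpy as np

def first_principal_component(X):
    """Score of each observation on the first principal component of a
    (T, k) panel of variables, via SVD of the standardized data."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ Vt[0]

# A common factor plus idiosyncratic noise: PC1 should recover the factor
rng = np.random.default_rng(3)
factor = rng.normal(size=200)
X = factor[:, None] + 0.3 * rng.normal(size=(200, 4))
pc1 = first_principal_component(X)
corr = abs(np.corrcoef(pc1, factor)[0, 1])
```

The sign of a principal component is arbitrary, hence the absolute correlation; in the GARCH-MIDAS application the extracted series then drives the low-frequency variance component.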

18.
We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. We use profit maximization measures based on directional accuracy and trading strategies in addition to standard loss minimization measures. When comparing predictive accuracy and profit measures, data snooping bias free tests are used. The results indicate that forecast combinations, in particular those based on principal components of forecasts, help to improve over benchmark trading strategies, although the excess return per unit of deviation is limited. Copyright © 2016 John Wiley & Sons, Ltd.
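The profit-based evaluation criteria can be made concrete with two small functions; the function names and the numerical example are mine, not from the paper:

```python
import numpy as np

def directional_accuracy(actual, forecast):
    """Share of periods in which the forecast has the same sign as the
    realized exchange rate change."""
    return np.mean(np.sign(actual) == np.sign(forecast))

def strategy_return(actual, forecast):
    """Mean return of the simplest directional trading rule: go long
    when the forecast change is positive, short when it is negative."""
    return np.mean(np.sign(forecast) * actual)

# Illustrative changes: three of four signs are forecast correctly
actual = np.array([0.01, -0.02, 0.03, 0.01])
forecast = np.array([0.02, -0.01, -0.01, 0.005])
da = directional_accuracy(actual, forecast)   # -> 0.75
sr = strategy_return(actual, forecast)        # -> 0.0025
```

A model can have mediocre squared-error loss yet still be profitable if it gets the sign right when changes are large, which is why directional measures are reported alongside the standard ones.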

19.
In this paper, we provide a novel way to estimate the out-of-sample predictive ability of a trading rule. Usually, this ability is estimated using a sample-splitting scheme, true out-of-sample data being rarely available. We argue that this method makes poor use of the available data and creates data-mining possibilities. Instead, we introduce an alternative .632 bootstrap approach. This method enables building in-sample and out-of-sample bootstrap datasets that do not overlap but exhibit the same time dependencies. We show in a simulation study that this technique drastically reduces the mean squared error of the estimated predictive ability. We illustrate our methodology on IBM, MSFT and DJIA stock prices, where we compare 11 trading rule specifications. For the considered datasets, two different filter rule specifications have the highest out-of-sample mean excess returns. However, all tested rules cannot beat a simple buy-and-hold strategy when trading at a daily frequency. Copyright © 2015 John Wiley & Sons, Ltd.
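For reference, the classic iid .632 estimator blends the optimistic in-sample (apparent) error with the average error on out-of-bag observations. The sketch below uses the sample mean as a toy predictor; the paper's contribution is a dependent-data variant whose bootstrap sets preserve time dependencies, which this iid version does not:

```python
import numpy as np

def err632(y, n_boot=200, seed=0):
    """Textbook .632 bootstrap estimate of squared prediction error for
    the sample mean as a (toy) predictor:
    0.368 * apparent error + 0.632 * average out-of-bag error."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, float)
    n = len(y)
    apparent = np.mean((y - y.mean()) ** 2)        # in-sample (optimistic) error
    oob_errs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                # bootstrap resample with replacement
        oob = np.setdiff1d(np.arange(n), idx)      # out-of-bag observations
        oob_errs.append(np.mean((y[oob] - y[idx].mean()) ** 2))
    return 0.368 * apparent + 0.632 * np.mean(oob_errs)

rng = np.random.default_rng(5)
e632 = err632(rng.normal(size=200))    # should be close to the true variance of 1
```

The weight 0.632 is the limiting probability that a given observation appears in a bootstrap sample, i.e. 1 - 1/e; the blend offsets the optimism of the apparent error against the pessimism of the out-of-bag error.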

20.
The directional news impact curve (DNIC) is a relationship between returns and the probability of next period's return exceeding a certain threshold—zero in particular. Using long series of S&P500 index returns and a number of parametric models suggested in the literature, as well as flexible semiparametric models, we investigate the shape of the DNIC and the forecasting abilities of these models. The semiparametric approach reveals that the DNIC has complicated shapes characterized by asymmetry with respect to past returns and their signs, heterogeneity across the thresholds, and changes over time. Simple parametric models often miss some important features of the DNIC, but some nevertheless exhibit superior out-of-sample performance.
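A crude binned stand-in for the semiparametric estimator makes the object concrete: estimate the probability that tomorrow's return is positive as a function of which quantile bin today's return falls in. The function name and bin count are illustrative, not the paper's method:

```python
import numpy as np

def dnic_binned(returns, n_bins=5):
    """Binned nonparametric estimate of the directional news impact
    curve at threshold zero: P(r_{t+1} > 0) by quantile bin of r_t."""
    r_today, r_next = returns[:-1], returns[1:]
    edges = np.quantile(r_today, np.linspace(0, 1, n_bins + 1))
    bins = np.clip(np.digitize(r_today, edges[1:-1]), 0, n_bins - 1)
    return np.array([np.mean(r_next[bins == b] > 0) for b in range(n_bins)])

# For iid symmetric returns the curve should be flat near 0.5;
# asymmetries and sign effects in real index returns bend it away from that
rng = np.random.default_rng(4)
curve = dnic_binned(rng.normal(size=2000))
```

Deviations of the estimated curve from a flat 0.5 line are exactly the nonlinear shape features—asymmetry in past returns and their signs—that the semiparametric models are designed to pick up.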
