Similar Literature
Found 20 similar articles (search time: 31 ms)
1.
This article discusses the use of Bayesian methods for inference and forecasting in dynamic term structure models through integrated nested Laplace approximations (INLA). This method of analytical approximation allows accurate inference for latent factors, parameters and forecasts in dynamic models at reduced computational cost. In the estimation of dynamic term structure models it also avoids some simplifications in the inference procedures, such as the inefficient two-step ordinary least squares (OLS) estimation. The results obtained in the estimation of the dynamic Nelson–Siegel model indicate that this method produces more accurate out-of-sample forecasts than both two-step OLS estimation and Bayesian estimation using Markov chain Monte Carlo (MCMC). These analytical approximations also allow efficient calculation of model selection measures such as generalized cross-validation and marginal likelihood, which may be computationally prohibitive in MCMC estimations. Copyright © 2014 John Wiley & Sons, Ltd.
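For reference, the dynamic Nelson–Siegel model discussed above represents the yield curve through three factor loadings (level, slope and curvature). A minimal pure-Python sketch of those loadings follows; the decay parameter and beta values below are illustrative placeholders, not taken from the article:

```python
import math

def ns_loadings(lam, tau):
    """Nelson-Siegel factor loadings (level, slope, curvature) at maturity tau.

    lam is the decay parameter; tau is the maturity (e.g. in months).
    """
    slope = (1.0 - math.exp(-lam * tau)) / (lam * tau)
    curvature = slope - math.exp(-lam * tau)
    return 1.0, slope, curvature

def ns_yield(beta, lam, tau):
    """Fitted yield at maturity tau given factor values beta = (level, slope, curvature)."""
    l1, l2, l3 = ns_loadings(lam, tau)
    return beta[0] * l1 + beta[1] * l2 + beta[2] * l3
```

In the dynamic version of the model, the three betas evolve over time as latent factors; estimating those latent paths is exactly what the INLA and MCMC machinery compared in the article is for.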

2.
This paper proposes a parsimonious threshold stochastic volatility (SV) model for financial asset returns. Instead of imposing a threshold value on the dynamics of the latent volatility process of the SV model, we assume that the innovation of the mean equation follows a threshold distribution in which the mean innovation switches between two regimes. In our model, the threshold is treated as an unknown parameter. We show that the proposed threshold SV model can not only capture the time-varying volatility of returns, but can also accommodate the asymmetric shape of the conditional distribution of the returns. Parameter estimation is carried out using Markov chain Monte Carlo methods. For model selection and volatility forecasting, an auxiliary particle filter technique is employed to approximate the filter and prediction distributions of the returns. Several experiments are conducted to assess the robustness of the proposed model and estimation methods. In the empirical study, we apply our threshold SV model to three return time series. The empirical results show that the threshold parameter has a non-zero value and that the mean innovations belong to two distinct regimes. We also find that the model with an unknown threshold parameter value consistently outperforms the model with a known threshold parameter value. Copyright © 2016 John Wiley & Sons, Ltd.
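As a rough illustration of the data-generating process described above (not the authors' estimation code), the sketch below simulates returns whose mean innovation switches regime around a threshold while log-volatility follows an AR(1) process; all parameter values and the regime rule are hypothetical:

```python
import math
import random

def simulate_threshold_sv(n, mu_h=-1.0, phi=0.95, sigma_h=0.2,
                          mu_low=-0.05, mu_high=0.05, threshold=0.0, seed=42):
    """Simulate a threshold SV process.

    The mean of the return innovation switches between mu_low and mu_high
    depending on whether the previous return is below or above `threshold`;
    log-volatility h follows an AR(1) with persistence phi.
    """
    rng = random.Random(seed)
    h = mu_h                      # start log-volatility at its unconditional mean
    r_prev = 0.0
    returns = []
    for _ in range(n):
        mean = mu_low if r_prev < threshold else mu_high
        eps = rng.gauss(mean, 1.0)          # threshold-distributed mean innovation
        r = eps * math.exp(h / 2.0)
        returns.append(r)
        r_prev = r
        h = mu_h + phi * (h - mu_h) + sigma_h * rng.gauss(0.0, 1.0)
    return returns
```

In the paper the threshold itself is an unknown parameter estimated by MCMC; here it is fixed only to make the mechanism visible.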

3.
We propose a method for improving the predictive ability of standard forecasting models used in financial economics. Our approach is based on the functional partial least squares (FPLS) model, which avoids multicollinearity in regression by efficiently extracting information from high-dimensional market data, allowing us to incorporate auxiliary variables that improve predictive accuracy. We provide an empirical application of the proposed methodology to predicting the conditional average log return and the volatility of crude oil prices via exponential smoothing, Bayesian stochastic volatility, and GARCH (generalized autoregressive conditional heteroskedasticity) models, respectively. In particular, what we call functional data analysis (FDA) traces in this article are obtained via FPLS regression from both the crude oil returns and auxiliary variables, namely the exchange rates of major currencies. To evaluate forecast performance, we compare the out-of-sample forecasting accuracy of the standard models with FDA traces to that of the same forecasting models using the observed crude oil returns, principal component regression (PCR), and least absolute shrinkage and selection operator (LASSO) models. We find evidence that the standard models with FDA traces significantly outperform the competing models. The models are also compared using the test for superior predictive ability and the reality check for data snooping. Our empirical results show that the new methodology significantly improves the predictive ability of standard models in forecasting the latent average log return and the volatility of financial time series.

4.
This paper develops a New-Keynesian Dynamic Stochastic General Equilibrium (NKDSGE) model for forecasting the growth rate of output, inflation, and the nominal short-term interest rate (91-day Treasury Bill rate) for the South African economy. The model is estimated via maximum likelihood on quarterly data over the period 1970:1–2000:4. Based on recursive estimation using the Kalman filter algorithm, out-of-sample forecasts from the NKDSGE model are compared with forecasts generated from classical and Bayesian variants of vector autoregression (VAR) models for the period 2001:1–2006:4. The results indicate that, in terms of out-of-sample forecasting, the NKDSGE model outperforms both the classical and Bayesian VARs for inflation, but not for output growth and the nominal short-term interest rate. However, differences in RMSEs are not significant across the models. Copyright © 2008 John Wiley & Sons, Ltd.
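The out-of-sample comparison above rests on root mean squared error. For completeness, the standard definition in a minimal sketch:

```python
import math

def rmse(actual, forecast):
    """Root mean squared forecast error over paired observations."""
    n = len(actual)
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / n)
```

Comparing models then amounts to computing `rmse` for each model's forecast series over the same evaluation window; as the abstract notes, differences can be small enough that formal significance tests are needed.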

5.
We propose a new methodology for filtering and forecasting the latent variance in a two-factor diffusion process with jumps from a continuous-time perspective. For this purpose we use a continuous-time Markov chain approximation with a finite state space. Essentially, we extend Markov chain filters to processes of higher dimensions. We assess the forecastability of the models under consideration by measuring the forecast error of model expected realized variance, trading in variance swap contracts, producing value-at-risk estimates, and examining sign forecastability. We provide empirical evidence using two sources, the S&P 500 index values and its corresponding cumulative risk-neutral expected variance (namely the VIX index). Joint estimation reveals the market prices of equity and variance risk implied by the two probability measures. A further simulation study shows that the proposed methodology can filter the variance of virtually any type of diffusion process (coupled with a jump process) with a non-analytical density function. Copyright © 2015 John Wiley & Sons, Ltd.

6.
We investigate realized volatility forecasts of stock indices under structural breaks. We utilize a pure multiple mean break model to identify the possibility of structural breaks in the daily realized volatility series, employing intraday high-frequency data of the Shanghai Stock Exchange Composite Index and five sectoral stock indices in Chinese stock markets for the period 4 January 2000 to 30 December 2011. We then conduct both in-sample tests and out-of-sample forecasts to examine the effects of structural breaks on the performance of ARFIMAX-FIGARCH models for realized volatility forecasting, utilizing a variety of estimation window sizes designed to accommodate potential structural breaks. The results of the in-sample tests show that there are multiple breaks in all realized volatility series. The results of the out-of-sample point forecasts indicate that combination forecasts with time-varying weights across individual forecast models estimated with different estimation windows perform well. In particular, nonlinear combination forecasts with weights chosen by non-parametric kernel regression, and linear combination forecasts with weights chosen by non-negative restricted least squares and the Schwarz information criterion, appear to be the most accurate methods for point forecasting of realized volatility under structural breaks. We also conduct interval forecasts of realized volatility for the combination approaches, and find that the interval forecast of the nonlinear combination approach with weights chosen by non-parametric kernel regression performs best among the competing models. Copyright © 2014 John Wiley & Sons, Ltd.
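For two competing forecasts, the restricted-least-squares combination mentioned above reduces to a one-dimensional problem: minimize the squared error of w·f1 + (1−w)·f2 subject to 0 ≤ w ≤ 1. A minimal sketch of that special case (the general multi-model case requires a proper non-negative least squares solver, which this is not):

```python
def combo_weight(y, f1, f2):
    """Weight on forecast f1 (remainder on f2), restricted to [0, 1].

    Solves min_w sum_t (y_t - w*f1_t - (1-w)*f2_t)^2, then clips w to [0, 1].
    Closed form: w* = sum (y - f2)(f1 - f2) / sum (f1 - f2)^2.
    """
    num = sum((yi - b) * (a - b) for yi, a, b in zip(y, f1, f2))
    den = sum((a - b) ** 2 for a, b in zip(f1, f2))
    w = num / den if den > 0 else 0.5   # forecasts identical: split evenly
    return min(1.0, max(0.0, w))
```

Re-estimating the weight over a rolling window yields the time-varying combination weights that the abstract describes.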

7.
The first purpose of this paper is to assess the short‐run forecasting capabilities of two competing financial duration models. The forecast performance of the Autoregressive Conditional Multinomial–Autoregressive Conditional Duration (ACM‐ACD) model is better than the Asymmetric Autoregressive Conditional Duration (AACD) model. However, the ACM‐ACD model is more complex in terms of the computational setting and is more sensitive to starting values. The second purpose is to examine the effects of market microstructure on the forecasting performance of the two models. The results indicate that the forecast performance of the models generally decreases as the liquidity of the stock increases, with the exception of the most liquid stocks. Furthermore, a simple filter of the raw data improves the performance of both models. Finally, the results suggest that both models capture the characteristics of the micro data very well with a minimum sample length of 20 days. Copyright © 2008 John Wiley & Sons, Ltd.

8.
We utilize mixed-frequency factor-MIDAS models for the purpose of carrying out backcasting, nowcasting, and forecasting experiments using real-time data. We also introduce a new real-time Korean GDP dataset, which is the focus of our experiments. The methodology that we utilize involves first estimating common latent factors (i.e., diffusion indices) from 190 monthly macroeconomic and financial series using various estimation strategies. These factors are then included, along with standard variables measured at multiple different frequencies, in various factor-MIDAS prediction models. Our key empirical findings are as follows. (i) When using real-time data, factor-MIDAS prediction models outperform various linear benchmark models. Interestingly, the "MSFE-best" MIDAS models contain no autoregressive (AR) lag terms when backcasting and nowcasting. AR terms only begin to play a role in "true" forecasting contexts. (ii) Models that utilize only one or two factors are "MSFE-best" at all forecasting horizons, but not at any backcasting and nowcasting horizons. In these latter contexts, much more heavily parametrized models with many factors are preferred. (iii) Real-time data are crucial for forecasting Korean gross domestic product, and the use of "first available" versus "most recent" data "strongly" affects model selection and performance. (iv) Recursively estimated models are almost always "MSFE-best," and models estimated using autoregressive interpolation dominate those estimated using other interpolation methods. (v) Factors estimated using recursive principal component estimation methods have more predictive content than those estimated using a variety of other (more sophisticated) approaches. This result is particularly prevalent for our "MSFE-best" factor-MIDAS models, across virtually all forecast horizons, estimation schemes, and data vintages that are analyzed.

9.
Several studies have tested for long‐range dependence in macroeconomic and financial time series but very few have assessed the usefulness of long‐memory models as forecast‐generating mechanisms. This study tests for fractional differencing in the US monetary indices (simple sum and divisia) and compares the out‐of‐sample fractional forecasts to benchmark forecasts. The long‐memory parameter is estimated using Robinson's Gaussian semi‐parametric and multivariate log‐periodogram methods. The evidence amply suggests that the monetary series possess a fractional order between one and two. Fractional out‐of‐sample forecasts are consistently more accurate (with the exception of the M3 series) than benchmark autoregressive forecasts but the forecasting gains are not generally statistically significant. In terms of forecast encompassing, the fractional model encompasses the autoregressive model for the divisia series but neither model encompasses the other for the simple sum series. Copyright © 2006 John Wiley & Sons, Ltd.
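The fractional difference operator behind such models, (1 − L)^d, expands into an infinite set of binomial weights that can be computed recursively. A minimal sketch of the weights and a truncated filter (the truncation length here is arbitrary, not taken from the study):

```python
def fracdiff_weights(d, k_max):
    """Binomial expansion weights of (1 - L)^d, via pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, k_max + 1):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def fracdiff(x, d, k_max=100):
    """Apply a truncated fractional difference of order d to series x."""
    w = fracdiff_weights(d, min(k_max, len(x) - 1))
    return [sum(w[k] * x[t - k] for k in range(min(t + 1, len(w))))
            for t in range(len(x))]
```

With d = 1 the filter reduces to ordinary first differences, which is why a fractional order between one and two (as found above) corresponds to first-differencing plus a residual fractional filter of order d − 1.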

10.
The objectives of this paper are: first, to show empirically the relevance of using adaptive estimation techniques over more traditional estimation approaches when economic systems are believed to be structurally unstable over time; and secondly, to compare in an empirical framework two adaptive estimation techniques: Kalman filtering and the Carbone–Longini filter. For that purpose, an econometric model for the U.S. pulp and paper market is examined under the assumption of structural instability and, hence, constitutes the basis for comparing the forecasting performance and estimation accuracy achieved by each technique. A version of Kalman filtering, modified in line with the basic 'tracking' idea characterizing the Carbone–Longini filter, is also presented and applied. The analysis of the results shows that it may be worth using adaptive estimation methods to estimate structurally unstable models, even if there is no prior knowledge about the patterns of variation of the parameters. It also shows that the Carbone–Longini filter and Kalman filtering are complementary estimation techniques. An estimation/forecasting methodology involving a sequential application of these two techniques is suggested.

11.
This paper considers the problem of forecasting high‐dimensional time series. It employs a robust clustering approach to perform classification of the component series. Each series within a cluster is assumed to follow the same model and the data are then pooled for estimation. The classification is model‐based and robust to outlier contamination. The robustness is achieved by using the intrinsic mode functions of the Hilbert–Huang transform at lower frequencies. These functions are found to be robust to outlier contamination. The paper also compares out‐of‐sample forecast performance of the proposed method with several methods available in the literature. The other forecasting methods considered include vector autoregressive models with/without LASSO, group LASSO, principal component regression, and partial least squares. The proposed method is found to perform well in out‐of‐sample forecasting of the monthly unemployment rates of 50 US states. Copyright © 2013 John Wiley & Sons, Ltd.

12.
This paper applies a triple-choice ordered probit model, corrected for nonstationarity, to forecast monetary policy decisions of the Reserve Bank of Australia. The forecast models incorporate a mix of monthly and quarterly macroeconomic time series. Forecast combination is used as an alternative to a single multivariate model to improve the accuracy of out-of-sample forecasts. This accuracy is evaluated with scoring functions, which are also used to construct adaptive weights for combining probability forecasts. The paper finds that combined forecasts outperform the multivariate model. These results are robust to different sample sizes and estimation windows. Copyright © 2011 John Wiley & Sons, Ltd.

13.
In this paper we compare the in‐sample fit and out‐of‐sample forecasting performance of no‐arbitrage quadratic, essentially affine and dynamic Nelson–Siegel term structure models. In total, 11 model variants are evaluated, comprising five quadratic, four affine and two Nelson–Siegel models. Recursive re‐estimation and out‐of‐sample 1‐, 6‐ and 12‐month‐ahead forecasts are generated and evaluated using monthly US data for yields observed at maturities of 1, 6, 12, 24, 60 and 120 months. Our results indicate that quadratic models provide the best in‐sample fit, while the best out‐of‐sample performance is generated by three‐factor affine models and the dynamic Nelson–Siegel model variants. Statistical tests fail to identify one single best forecasting model class. Copyright © 2011 John Wiley & Sons, Ltd.

14.
Since volatility is perceived as an explicit measure of risk, financial economists have long been concerned with accurate measures and forecasts of future volatility and, undoubtedly, the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model has been widely used for doing so. It appears, however, from some empirical studies that the GARCH model tends to provide poor volatility forecasts in the presence of additive outliers. To overcome this forecasting limitation, this paper proposes a robust GARCH model (RGARCH) using least absolute deviation estimation and introduces an estimation method that is valuable from a practical point of view. Extensive Monte Carlo experiments substantiate our conjectures. As the magnitude of the outliers increases, the one-step-ahead forecasting performance of the RGARCH model improves more markedly, on two forecast evaluation criteria, than that of both the standard GARCH and random walk models. An empirical application provides strong evidence in favour of the RGARCH model over competing models: using two daily exchange rate series, we find that the out-of-sample volatility forecasts of the RGARCH model are clearly superior to those of the competing models. Copyright © 2002 John Wiley & Sons, Ltd.
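The GARCH(1,1) variance recursion underlying both the standard and robust variants is simple to state. The sketch below pairs it with a least-absolute-deviation style criterion in the spirit of the robust estimation described above; this is an illustration of the idea, not the paper's exact RGARCH estimator:

```python
import math

def garch_filter(returns, omega, alpha, beta):
    """One-step-ahead conditional variances from the GARCH(1,1) recursion.

    h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
    initialized at the sample variance of the returns.
    """
    var0 = sum(r * r for r in returns) / len(returns)
    h = [var0]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

def lad_loss(returns, h):
    """Least-absolute-deviation style criterion on |r_t| versus sqrt(h_t).

    Absolute (rather than squared) deviations downweight additive outliers,
    which is the motivation for robust GARCH estimation.
    """
    return sum(abs(abs(r) - math.sqrt(ht)) for r, ht in zip(returns, h))
```

Minimizing `lad_loss` over (omega, alpha, beta) with a numerical optimizer would give a robust-flavoured fit; the paper's actual LAD estimator is defined on the model's likelihood structure, not on this simplified criterion.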

15.
This paper examines the importance of forecasting higher moments for optimal hedge ratio estimation. To this end, autoregressive conditional density (ARCD) models are employed which allow for time variation in variance, skewness and kurtosis. The performance of ARCD models is evaluated against that of GARCH and of other conventional hedge ratio estimation methodologies based on exponentially weighted moving averages, ordinary least squares and error correction, respectively. An empirical application using spot and futures data on the DJI, FTSE and DAX equity indices compares the in‐sample and out‐of‐sample hedging effectiveness of each approach in terms of risk minimization. The results show that the ARCD approach has the best performance, thus suggesting that forecasting higher moments is of practical importance for futures hedging. Copyright © 2012 John Wiley & Sons, Ltd.
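The conventional benchmark against which these conditional approaches are judged is the static minimum-variance hedge ratio, the ratio of the spot-futures return covariance to the futures return variance. A minimal sketch (the ARCD and GARCH approaches in the paper make this ratio time-varying by using conditional moments):

```python
def min_variance_hedge_ratio(spot, fut):
    """Static minimum-variance hedge ratio: Cov(spot, futures) / Var(futures).

    Inputs are paired return series; this is also the OLS slope of spot on futures.
    """
    n = len(spot)
    ms = sum(spot) / n
    mf = sum(fut) / n
    cov = sum((s - ms) * (f - mf) for s, f in zip(spot, fut)) / n
    var = sum((f - mf) ** 2 for f in fut) / n
    return cov / var
```

A ratio of, say, 0.9 means shorting 0.9 futures contracts per unit of spot exposure to minimize the variance of the hedged position.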

16.
This paper presents a simple empirical approach to modeling and forecasting market option prices using localized option regressions (LOR). LOR projects market option prices over localized regions of their state space and is robust to assumptions regarding the underlying asset dynamics (e.g. log‐normality) and volatility structure. Our empirical study using 3 years of daily S&P500 options shows that LOR yields smaller out‐of‐sample pricing errors (e.g. 32% 1‐day‐out) relative to an efficient benchmark from the literature and produces option prices free of the volatility smile. In addition to being an efficient and robust option‐modeling and valuation tool for large option books, LOR provides a simple‐to‐implement empirical benchmark for evaluating more complex risk‐neutral models. Copyright © 2007 John Wiley & Sons, Ltd.

17.
A family of finite end filters is constructed using a minimum revisions criterion and based on a local dynamic model operating within the span of a given finite central filter. These end filters are equivalent to evaluating the central filter with unavailable future observations replaced by constrained optimal linear predictions. Two prediction methods are considered: best linear unbiased prediction and best linear biased prediction where the bias is time invariant. The properties of these end filters are determined. In particular, they are compared to X‐11 end filters and to the case where the central filter is evaluated with unavailable future observations predicted by global ARIMA models as in X‐11‐ARIMA or X‐12‐ARIMA. Copyright © 2002 John Wiley & Sons, Ltd.

18.
A non-linear dynamic model is introduced for multiplicative seasonal time series; it follows and extends the X-11 paradigm, in which the observed time series is a product of trend, seasonal and irregular factors. A selection of standard seasonal and trend component models used in additive dynamic time series models is adapted to the multiplicative framework, and a non-linear filtering procedure is proposed. The results are illustrated and compared to X-11 and log-additive models using real data. In particular, it is shown that the new procedures do not suffer from the trend bias present in log-additive models. Copyright © 2002 John Wiley & Sons, Ltd.

19.
To forecast realized volatility, this paper introduces a multiplicative error model that incorporates heterogeneous components: weekly and monthly realized volatility measures. While the model captures the long‐memory property, estimation simply proceeds using quasi‐maximum likelihood estimation. This paper investigates its forecasting ability using the realized kernels of 34 different assets provided by the Oxford‐Man Institute's Realized Library. The model outperforms benchmark models such as ARFIMA, HAR, Log‐HAR and HEAVY‐RM in within‐sample fitting and out‐of‐sample (1‐, 10‐ and 22‐step) forecasts. It performed best in both pointwise and cumulative comparisons of multi‐step‐ahead forecasts, regardless of loss function (QLIKE or MSE). Copyright © 2015 John Wiley & Sons, Ltd.
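The heterogeneous components mentioned above are the daily, weekly and monthly averages familiar from HAR-type models. A minimal sketch of how those regressors are built from a realized volatility series (window lengths of 5 and 22 trading days are the usual convention, assumed here rather than quoted from the paper):

```python
def har_regressors(rv, t):
    """Daily, weekly and monthly realized-volatility components at day t.

    rv is a list of daily realized volatility; requires t >= 21 so that
    the 5-day and 22-day trailing windows are fully available.
    """
    daily = rv[t]
    weekly = sum(rv[t - 4:t + 1]) / 5      # trailing 5 trading days
    monthly = sum(rv[t - 21:t + 1]) / 22   # trailing 22 trading days
    return daily, weekly, monthly
```

In a HAR regression these three components predict next-day realized volatility; the multiplicative error model in the paper uses the same components inside a multiplicative rather than additive specification.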

20.
This paper studies the performance of the GARCH model and its modifications, using the rates of return from the daily stock market indices of the Kuala Lumpur Stock Exchange (KLSE), including the Composite Index, Tins Index, Plantations Index, Properties Index, and Finance Index. The models are stationary GARCH, unconstrained GARCH, non-negative GARCH, GARCH-M, exponential GARCH and integrated GARCH. The parameters of these models and the variance processes are estimated jointly using the maximum likelihood method. The within-sample estimation is diagnosed using several goodness-of-fit statistics. We observe that, although exponential GARCH is not the best model by goodness-of-fit statistics, it performs best in describing the often-observed skewness of stock market indices and in out-of-sample (one-step-ahead) forecasting. Integrated GARCH, on the other hand, is the poorest model in both respects. Copyright © 1999 John Wiley & Sons, Ltd.
