Similar Documents (20 results)
1.
This paper examines interest rate forecasts made for the period 1982–90 and addresses three issues: (1) Is there general agreement among analysts about the level of interest rates six months in the future? (2) Are all the forecasters equally good? (3) Are the forecasts valuable to prospective users? We use distributions of the cross-sections of forecasts, Friedman's statistic for analysis of variance by rank, and tests of independence between forecasts and outcomes to examine these questions. We conclude that there usually was a consensus among analysts, that there was no significant difference in the ability to forecast short-term rates but there was a difference with respect to the long-term predictions, and that these forecasts were not significantly better than random walk forecasts.
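A minimal sketch of the kind of rank-based comparison this entry describes, using SciPy's Friedman test to ask whether several forecasters perform equally well. The data, window sizes, and variable names below are illustrative assumptions, not the paper's data.

```python
# Friedman rank test: do several forecasters have equal predictive ability?
# Rows are forecast periods, columns are forecasters; data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_periods, n_forecasters = 36, 5

# Absolute forecast errors for each forecaster in each period (simulated).
errors = np.abs(rng.normal(0.0, 1.0, size=(n_periods, n_forecasters)))

# Friedman's statistic ranks the forecasters within each period and tests
# whether the average ranks differ by more than chance would allow.
stat, p_value = stats.friedmanchisquare(*[errors[:, j] for j in range(n_forecasters)])
print(f"Friedman chi-square = {stat:.2f}, p-value = {p_value:.3f}")
```

A random-walk benchmark of the kind used in the comparison would simply take today's rate as the forecast of the rate six months ahead and compute its errors on the same periods.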

2.
Density forecasts for weather variables are useful for the many industries exposed to weather risk. Weather ensemble predictions are generated from atmospheric models and consist of multiple future scenarios for a weather variable. The distribution of the scenarios can be used as a density forecast, which is needed for pricing weather derivatives. We consider one- to 10-day-ahead density forecasts provided by temperature ensemble predictions. More specifically, we evaluate forecasts of the mean and quantiles of the density. The mean of the ensemble scenarios is the most accurate forecast for the mean of the density. We use quantile regression to debias the quantiles of the distribution of the ensemble scenarios. The resultant quantile forecasts compare favourably with those from a GARCH model. These results indicate the strong potential for the use of ensemble prediction in temperature density forecasting. Copyright © 2004 John Wiley & Sons, Ltd.
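A minimal sketch of quantile-regression debiasing of an ensemble quantile, in the spirit of this entry, using statsmodels' quantile regression. The data, the choice of the 90th percentile, and the variable names are assumptions for illustration.

```python
# Debias the 90th percentile of ensemble temperature scenarios by regressing
# the observation on the raw ensemble quantile at tau = 0.9 and using the
# fitted values as the corrected quantile forecast. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
raw_q90 = rng.normal(20, 5, n)                    # raw 90th percentile of the scenarios
observed = raw_q90 - 1.5 + rng.normal(0, 2, n)    # observed temperature (ensemble is biased)
df = pd.DataFrame({"observed": observed, "raw_q90": raw_q90})

fit = smf.quantreg("observed ~ raw_q90", df).fit(q=0.9)
df["debiased_q90"] = fit.predict(df)              # debiased quantile forecast
print(fit.params)
```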

3.
We extend the analysis of Christoffersen and Diebold (1998) on long-run forecasting in cointegrated systems to multicointegrated systems. For the forecast evaluation we consider several loss functions, each of which has a particular interpretation in the context of stock-flow models where multicointegration typically occurs. A loss function based on a standard mean square forecast error (MSFE) criterion focuses on the forecast errors of the flow variables alone. Likewise, a loss function based on the triangular representation of cointegrated systems (suggested by Christoffersen and Diebold) considers forecast errors associated with changes in both stock (modelled through the cointegrating restrictions) and flow variables. We suggest a new loss function based on the triangular representation of multicointegrated systems which further penalizes deviations from the long-run relationship between the levels of stock and flow variables as well as changes in the flow variables. Among other things, we show that if one is concerned with all possible long-run relations between stock and flow variables, this new loss function entails high and increasing forecasting gains compared to both the standard MSFE criterion and Christoffersen and Diebold's criterion. This paper demonstrates the importance of carefully selecting loss functions in forecast evaluation of models involving stock and flow variables. Copyright © 2004 John Wiley & Sons, Ltd.

4.
We propose an ensemble of long short-term memory (LSTM) neural networks for intraday stock predictions, using a large variety of technical analysis indicators as network inputs. The proposed ensemble operates in an online way, weighting the individual models proportionally to their recent performance, which allows us to deal with possible nonstationarities in an innovative way. The performance of the models is measured by the area under the receiver operating characteristic curve. We evaluate the predictive power of our model on several US large-cap stocks and benchmark it against lasso and ridge logistic classifiers. The proposed model is found to perform better than the benchmark models or equally weighted ensembles.
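A sketch of the online, performance-proportional weighting idea described here: each model's weight is tied to its recent ROC AUC on a trailing window. The component models are stand-ins (random probabilities), and the window length, flooring rule, and names are assumptions rather than the paper's specification.

```python
# Weight ensemble members by their recent AUC and combine next-step probabilities.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
window = 50
y_recent = rng.integers(0, 2, window)                        # recent realized up/down labels
recent_preds = [rng.uniform(size=window) for _ in range(3)]  # one probability array per model

# Floor each AUC at 0.5 so models no better than chance get no extra weight.
aucs = np.array([max(roc_auc_score(y_recent, p), 0.5) for p in recent_preds])
weights = (aucs - 0.5) / (aucs - 0.5).sum() if (aucs > 0.5).any() else np.full(3, 1 / 3)

new_preds = np.array([0.62, 0.48, 0.55])                     # next-step probabilities per model
ensemble_prob = float(weights @ new_preds)
print(weights, ensemble_prob)
```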

5.
Many applications in science involve finding estimates of unobserved variables from observed data, by combining model predictions with observations. The sequential Monte Carlo (SMC) is a well-established technique for estimating the distribution of unobserved variables conditional on current observations. While the SMC is very successful at estimating the first central moments, estimating the extreme quantiles of a distribution via current SMC methods is computationally very expensive. The purpose of this paper is to develop a new framework using probability distortion. We use an SMC with distorted weights in order to make computationally efficient inferences about tail probabilities of future interest rates using the Cox-Ingersoll-Ross (CIR) model, as well as with an observed yield curve. We show that the proposed method yields acceptable estimates of tail quantiles at a fraction of the computational cost of the full Monte Carlo.
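To make the target of the method concrete, here is a plain Monte Carlo sketch of the quantity being estimated: the probability that the short rate exceeds a high threshold under Cox-Ingersoll-Ross dynamics. This is not the paper's distorted-weight SMC, and the parameter values are assumed for illustration; the expense of this brute-force approach is what the proposed method aims to avoid.

```python
# Plain Monte Carlo estimate of a CIR tail probability (Euler discretization).
import numpy as np

rng = np.random.default_rng(3)
kappa, theta, sigma = 0.5, 0.04, 0.1      # mean reversion, long-run level, volatility (assumed)
r0, dt, horizon, n_paths = 0.03, 1 / 252, 252, 100_000

r = np.full(n_paths, r0)
for _ in range(horizon):
    # Euler step, flooring the rate at zero so the square root stays defined.
    shock = sigma * np.sqrt(np.maximum(r, 0.0) * dt) * rng.standard_normal(n_paths)
    r = np.maximum(r + kappa * (theta - r) * dt + shock, 0.0)

threshold = 0.08
print("P(r_T >", threshold, ") ~", (r > threshold).mean())
```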

6.
This study compares the forecasting performance of a structural exchange rate model that combines the purchasing power parity condition with the interest rate differential in the long run, with some alternative exchange rate models. The analysis is applied to the Norwegian exchange rate. The long-run equilibrium relationship is embedded in a parsimonious representation for the exchange rate. The structural exchange rate representation is stable over the sample and outperforms a random walk in an out-of-sample forecasting exercise at one to four horizons. Ignoring the interest rate differential in the long run, however, the structural model no longer outperforms a random walk. Copyright © 2006 John Wiley & Sons, Ltd.

7.
We examine consistency properties of the exchange rate expectation formation process for short-run and long-run forecasts in the dollar/euro and yen/dollar markets. Applying nonlinear consistency restrictions, we show that in a simple expectation formation structure short-run forecasts are indeed inconsistent with long-run predictions. Moreover, we establish a 'twist' in the dollar/euro expectation formation process, i.e. market participants expect bandwagon effects in the short run, while they have stabilizing expectations in their long-run forecasts. Applying a panel probit analysis, we find that this twisting behaviour is more likely to occur in periods of excess exchange rate volatility. Copyright © 2011 John Wiley & Sons, Ltd.

8.
This paper examines the relationship between stock prices and commodity prices and whether this can be used to forecast stock returns. As both prices are linked to expected future economic performance they should exhibit a long-run relationship. Moreover, changes in sentiment towards commodity investing may affect the nature of the response to disequilibrium. Results support cointegration between stock and commodity prices, while Bai-Perron tests identify breaks in the forecast regression. Forecasts are computed using a standard fixed (static) in-sample/out-of-sample approach and by both recursive and rolling regressions, which incorporate the effects of changing forecast parameter values. A range of model specifications and forecast metrics are used. The historical mean model outperforms the forecast models in both the static and recursive approaches. However, in the rolling forecasts, those models that incorporate information from the long-run stock price/commodity price relationship outperform both the historical mean and other forecast models. Of note, the historical mean still performs relatively well compared to standard forecast models that include the dividend yield and short-term interest rates but not the stock/commodity price ratio. Copyright © 2014 John Wiley & Sons, Ltd.
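A sketch of two of the ingredients described: an Engle-Granger cointegration test between (log) stock and commodity prices, and a rolling regression that uses the lagged price ratio to forecast returns. The data are simulated, the window length is assumed, and the structural-break (Bai-Perron) step is not reproduced here.

```python
# Cointegration test plus rolling one-step forecasts from the stock/commodity ratio.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(4)
n = 400
common = np.cumsum(rng.normal(0, 1, n))               # shared stochastic trend
log_stock = common + rng.normal(0, 0.5, n)
log_comm = 0.8 * common + rng.normal(0, 0.5, n)

print("Engle-Granger cointegration p-value:", coint(log_stock, log_comm)[1])

returns = np.diff(log_stock)                          # stock returns to be forecast
ratio = (log_stock - log_comm)[:-1]                   # lagged ratio, known before each return
window = 120
forecasts = []
for t in range(window, len(returns)):
    X = sm.add_constant(ratio[t - window:t])
    beta = sm.OLS(returns[t - window:t], X).fit().params
    forecasts.append(beta[0] + beta[1] * ratio[t])    # one-step-ahead rolling forecast
print("first rolling forecast:", forecasts[0])
```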

9.
This paper explores the ability of factor models to predict the dynamics of US and UK interest rate swap spreads within a linear and a non-linear framework. We reject linearity for the US and UK swap spreads in favour of a regime-switching smooth transition vector autoregressive (STVAR) model, where the switching between regimes is controlled by the slope of the US term structure of interest rates. We compare the ability of the STVAR model to predict swap spreads with that of a non-linear nearest-neighbours model as well as that of linear AR and VAR models. We find some evidence that the non-linear models predict better than the linear ones. At short horizons, the nearest-neighbours (NN) model predicts US swap spreads better than the STVAR model in periods of increasing risk conditions, and UK swap spreads better in periods of decreasing risk conditions. At long horizons, the STVAR model increases its forecasting ability over the linear models, whereas the NN model does not outperform the rest of the models. Copyright © 2007 John Wiley & Sons, Ltd.
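A minimal nearest-neighbours forecasting sketch: predict the next value of a series from the outcomes that followed its k most similar past patterns of lagged values. The paper's NN scheme and the STVAR model are considerably richer; the data, lag length, and k below are assumptions.

```python
# k-nearest-neighbours one-step-ahead forecast from lagged values of the series.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(10)
spread = np.cumsum(rng.normal(0, 0.05, 600)) + 0.8    # synthetic swap spread series

lags, k = 4, 10
X = np.column_stack([spread[i:len(spread) - lags + i] for i in range(lags)])  # lagged patterns
y = spread[lags:]                                      # value that followed each pattern

knn = KNeighborsRegressor(n_neighbors=k).fit(X[:-1], y[:-1])
x_last = spread[-lags:].reshape(1, -1)                 # the most recent pattern
print("one-step-ahead NN forecast:", float(knn.predict(x_last)[0]))
```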

10.
We develop a semi-structural model for forecasting inflation in the UK in which the New Keynesian Phillips curve (NKPC) is augmented with a time series model for marginal cost. By combining structural and time series elements we hope to reap the benefits of both approaches, namely the relatively better forecasting performance of time series models in the short run and a theory-consistent economic interpretation of the forecast coming from the structural model. In our model we consider the hybrid version of the NKPC and use an open-economy measure of marginal cost. The results suggest that our semi-structural model performs better than a random-walk forecast and most of the competing models (conventional time series models and strictly structural models) only in the short run (one quarter ahead) but it is outperformed by some of the competing models at medium and long forecast horizons (four and eight quarters ahead). In addition, the open-economy specification of our semi-structural model delivers more accurate forecasts than its closed-economy alternative at all horizons. Copyright © 2014 John Wiley & Sons, Ltd.

11.
This paper investigates the time-varying volatility patterns of some major commodities as well as the potential factors that drive their long-term volatility component. For this purpose, we make use of a recently proposed generalized autoregressive conditional heteroskedasticity-mixed data sampling (GARCH-MIDAS) approach, which allows us to examine the role of economic and financial variables sampled at different frequencies. Using commodity futures for Crude Oil (WTI and Brent), Gold, Silver and Platinum, as well as a commodity index, our results show the necessity of disentangling the short-term and long-term components in modeling and forecasting commodity volatility. They also indicate that the long-term volatility of most commodity futures is significantly driven by the level of global real economic activity as well as changes in consumer sentiment, industrial production, and economic policy uncertainty. However, the forecasting results are not alike across commodity futures, as no single model fits all commodities.

12.
In this paper, we put dynamic stochastic general equilibrium (DSGE) forecasts in competition with factor forecasts. We focus on these two models since they nicely represent the two opposing forecasting philosophies: the DSGE model on the one hand has a strong theoretical economic background; the factor model on the other hand is mainly data-driven. We show that incorporating a large information set using factor analysis can indeed improve the short-horizon predictive ability, as claimed by many researchers. The micro-founded DSGE model can provide reasonable forecasts for US inflation, especially with growing forecast horizons. To a certain extent, our results are consistent with the prevailing view that simple time series models should be used in short-horizon forecasting and structural models should be used in long-horizon forecasting. Our paper compares both state-of-the-art data-driven and theory-based modelling in a rigorous manner. Copyright © 2008 John Wiley & Sons, Ltd.
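A sketch of the data-driven side of this comparison: a diffusion-index style forecast in which principal components extracted from a large panel of indicators enter a simple forecasting regression. The panel is simulated, the number of factors is assumed, and the DSGE side is not reproduced here.

```python
# Factor (principal-component) forecast of inflation from a large simulated panel.
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
T, N = 200, 60
factors_true = rng.normal(size=(T, 2))
panel = factors_true @ rng.normal(size=(2, N)) + rng.normal(0, 1, (T, N))   # many indicators
inflation = 0.5 * factors_true[:, 0] + rng.normal(0, 0.5, T)

# Estimate factors as the first two principal components of the standardized panel.
f_hat = PCA(n_components=2).fit_transform((panel - panel.mean(0)) / panel.std(0))

# One-step-ahead forecast: regress inflation on lagged inflation and lagged factors.
X = sm.add_constant(np.column_stack([inflation[:-1], f_hat[:-1]]))
ols = sm.OLS(inflation[1:], X).fit()
x_next = np.r_[1.0, inflation[-1], f_hat[-1]]
print("one-step-ahead inflation forecast:", float(ols.params @ x_next))
```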

13.
Artificial neural network modelling has recently attracted much attention as a new technique for estimation and forecasting in economics and finance. The chief advantages of this approach are that such models can usually find a solution for very complex problems, and that they are free from the assumption of linearity that is often adopted to make the traditional methods tractable. In this paper we compare the performance of Back-Propagation Artificial Neural Network (BPN) models with the traditional econometric approaches to forecasting the inflation rate. Of the traditional econometric models we use a structural reduced-form model, an ARIMA model, a vector autoregressive model, and a Bayesian vector autoregression model. We compare each econometric model with a hybrid BPN model which uses the same set of variables. Dynamic forecasts are compared for three different horizons: one, three and twelve months ahead. Root mean squared errors and mean absolute errors are used to compare the quality of the forecasts. The results show that the hybrid BPN models are able to forecast as well as all the traditional econometric methods, and to outperform them in some cases. Copyright © 2000 John Wiley & Sons, Ltd.
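A sketch of the kind of comparison described: a small back-propagation network (scikit-learn's MLPRegressor) against an ARIMA model, judged by RMSE and MAE on one-step-ahead forecasts. The inflation series is simulated and the lag structure, ARIMA order, and holdout length are assumptions, not the paper's specifications.

```python
# Compare a small neural network with ARIMA on one-step-ahead forecasts (RMSE/MAE).
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
n = 300
infl = 2.0 + 0.1 * np.cumsum(rng.normal(0, 0.1, n)) + rng.normal(0, 0.2, n)  # synthetic inflation

lags = 3
X = np.column_stack([infl[i:n - lags + i] for i in range(lags)])   # three lagged values
y = infl[lags:]
split = len(y) - 24                                                 # hold out the last 24 months

mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
mlp.fit(X[:split], y[:split])
nn_pred = mlp.predict(X[split:])

arima_pred = []
for t in range(split, len(y)):                                      # refit and forecast one step
    fit = ARIMA(infl[:lags + t], order=(1, 1, 1)).fit()
    arima_pred.append(fit.forecast(1)[0])
arima_pred = np.array(arima_pred)

def rmse(a, b): return np.sqrt(np.mean((a - b) ** 2))
def mae(a, b): return np.mean(np.abs(a - b))
print("NN    RMSE/MAE:", rmse(y[split:], nn_pred), mae(y[split:], nn_pred))
print("ARIMA RMSE/MAE:", rmse(y[split:], arima_pred), mae(y[split:], arima_pred))
```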

14.
Do long-run equilibrium relations suggested by economic theory help to improve the forecasting performance of a cointegrated vector error correction model (VECM)? In this paper we try to answer this question in the context of a two-country model developed for the Canadian and US economies. We compare the forecasting performance of the exactly identified cointegrated VECMs to the performance of the over-identified VECMs with the long-run theory restrictions imposed. We allow for model uncertainty and conduct this comparison for every possible combination of the cointegration ranks of the Canadian and US models. We show that the over-identified structural cointegrated models generally outperform the exactly identified models in forecasting Canadian macroeconomic variables. We also show that the pooled forecasts generated from the over-identified models beat most of the individual exactly identified and over-identified models as well as the VARs in levels and in differences. Copyright © 2011 John Wiley & Sons, Ltd.

15.
Asymmetry has been well documented in the business cycle literature. The asymmetric business cycle suggests that major macroeconomic series, such as a country's unemployment rate, are non-linear and, therefore, the use of linear models to explain their behaviour and forecast their future values may not be appropriate. Many researchers have focused on providing evidence for the non-linearity in the unemployment series. Only recently have there been some developments in applying non-linear models to estimate and forecast unemployment rates. A major concern of non-linear modelling is the model specification problem; it is very hard to test all possible non-linear specifications, and to select the most appropriate specification for a particular model. Artificial neural network (ANN) models provide a solution to the difficulty of forecasting unemployment over the asymmetric business cycle. ANN models are non-linear, do not rely upon the classical regression assumptions, are capable of learning the structure of all kinds of patterns in a data set with a specified degree of accuracy, and can then use this structure to forecast future values of the data. In this paper, we apply two ANN models, a back-propagation model and a generalized regression neural network model, to estimate and forecast post-war aggregate unemployment rates in the USA, Canada, UK, France and Japan. We compare the out-of-sample forecast results obtained by the ANN models with those obtained by several linear and non-linear time series models currently used in the literature. It is shown that the artificial neural network models are able to forecast the unemployment series as well as, and in some cases better than, the other univariate econometric time series models in our test. Copyright © 2004 John Wiley & Sons, Ltd.

16.
This paper is concerned with one-day-ahead hourly predictions of electricity demand for Puget Power, a local electricity utility for the Seattle area. Standard modelling techniques, including neural networks, will fail when the assumptions of the model are violated. It is demonstrated that typical modelling assumptions such as no outliers or level shifts are incorrect for electric power demand time series. A filter which removes or lessens the significance of outliers and level shifts is demonstrated. This filter produces 'clean data' which is used as the basis for future robust predictions. The robust predictions are shown to be better than non-robust counterparts on electricity load data. The outliers identified by the filter are shown to correspond with suspicious data. Finally, the estimated level shifts are in agreement with the belief that load growth is taking place year to year.
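A very simple cleaning sketch in the spirit of this entry: flag observations far from a rolling median and replace them before fitting a forecaster. This is not the paper's filter (which also handles level shifts); the window length, threshold, and synthetic load series are assumptions.

```python
# Rolling-median outlier screen for an hourly load series (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
hours = np.arange(500)
load = pd.Series(1000 + 50 * np.sin(hours * 2 * np.pi / 24) + rng.normal(0, 10, 500))
load.iloc[[100, 250, 400]] += 400            # inject artificial outliers

med = load.rolling(25, center=True, min_periods=1).median()
mad = (load - med).abs().rolling(25, center=True, min_periods=1).median()
outlier = (load - med).abs() > 5 * 1.4826 * mad   # robust z-score style threshold

clean = load.where(~outlier, med)             # replace flagged points with the local median
print("flagged points:", list(outlier[outlier].index))
```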

17.
This paper investigates inference and volatility forecasting using a Markov switching heteroscedastic model with a fat-tailed error distribution to analyze asymmetric effects on both the conditional mean and conditional volatility of financial time series. The motivation for extending the Markov switching GARCH model, previously developed to capture mean asymmetry, is that the switching variable, assumed to be a first-order Markov process, is unobserved. The proposed model extends this work to incorporate Markov switching in the mean and variance simultaneously. Parameter estimation and inference are performed in a Bayesian framework via a Markov chain Monte Carlo scheme. We compare competing models using Bayesian forecasting in a comparative value-at-risk study. The proposed methods are illustrated using both simulations and eight international stock market return series. The results generally favor the proposed double Markov switching GARCH model with an exogenous variable. Copyright © 2008 John Wiley & Sons, Ltd.

18.
We show that the effects of overfitting and underfitting a vector autoregressive (VAR) model are strongly asymmetric for VAR summary statistics involving higher-order dynamics (such as impulse response functions, variance decompositions, or long-run forecasts). Underfit models often underestimate the true dynamics of the population process and may result in spuriously tight confidence intervals. These insights are important for applied work, regardless of how the lag order is determined. In addition, they provide a new perspective on the trade-offs between alternative lag order selection criteria. We provide evidence that, contrary to conventional wisdom, for many statistics of interest to VAR users the point and interval estimates based on the AIC compare favourably to those based on the more parsimonious Schwarz Information Criterion and Hannan-Quinn Criterion. Copyright © 2001 John Wiley & Sons, Ltd.
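A sketch of how the competing lag-order criteria (AIC, BIC/SIC, HQ) are obtained for a VAR, using statsmodels' order-selection utility on simulated data. The true VAR(1) coefficients below are assumed; which criterion serves impulse responses or long-run forecasts best is the paper's question, not something this snippet answers.

```python
# Lag-order selection for a simulated bivariate VAR using several criteria.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(8)
T = 400
A1 = np.array([[0.5, 0.1], [0.2, 0.4]])      # true VAR(1) coefficient matrix (assumed)
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A1 @ y[t - 1] + rng.normal(0, 1, 2)

sel = VAR(y).select_order(maxlags=8)
print(sel.summary())
print("selected orders:", sel.selected_orders)   # orders chosen by AIC, BIC, HQIC, FPE
```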

19.
This paper deals with the economic interpretation of the unobserved components model in the light of the apparent problem posed by previous work, namely that several commonly applied methodologies seem to lead to very different models of certain economic variables. A detailed empirical analysis is carried out to show how the failure to obtain quasi-orthogonal components can seriously bias the interpretation of some decomposition procedures. Finally, the forecasting performance (in both the short and long run) of these decomposition models is analyzed in comparison with other alternatives.

20.
This paper proposes a parsimonious threshold stochastic volatility (SV) model for financial asset returns. Instead of imposing a threshold value on the dynamics of the latent volatility process of the SV model, we assume that the innovation of the mean equation follows a threshold distribution in which the mean innovation switches between two regimes. In our model, the threshold is treated as an unknown parameter. We show that the proposed threshold SV model can not only capture the time-varying volatility of returns, but can also accommodate the asymmetric shape of the conditional distribution of the returns. Parameter estimation is carried out using Markov chain Monte Carlo methods. For model selection and volatility forecasting, an auxiliary particle filter technique is employed to approximate the filtering and prediction distributions of the returns. Several experiments are conducted to assess the robustness of the proposed model and estimation methods. In the empirical study, we apply our threshold SV model to three return time series. The empirical results show that the threshold parameter has a non-zero value and that the mean innovations belong to two distinct regimes. We also find that the model with an unknown threshold parameter value consistently outperforms the model with a known threshold parameter value. Copyright © 2016 John Wiley & Sons, Ltd.
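A bootstrap particle filter sketch for a basic stochastic volatility model, illustrating the filtering step this entry relies on. The paper's model adds a threshold in the mean and uses an auxiliary (not bootstrap) particle filter with estimated parameters; the parameter values here are assumed and the data are simulated.

```python
# Bootstrap particle filter for y_t ~ N(0, exp(h_t)), h_t an AR(1) in logs.
import numpy as np

rng = np.random.default_rng(9)
T, N = 500, 2000
mu, phi, sigma_eta = -1.0, 0.95, 0.2          # assumed SV parameters

# Simulate returns from the SV model.
h = np.empty(T); h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
y = np.exp(h / 2) * rng.standard_normal(T)

# Filter: propagate particles through the AR(1), weight by the observation
# density N(y_t; 0, exp(h)), then resample.
particles = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal(N)
vol_filtered = np.empty(T)
for t in range(T):
    particles = mu + phi * (particles - mu) + sigma_eta * rng.standard_normal(N)
    logw = -0.5 * (particles + y[t] ** 2 * np.exp(-particles))   # log-likelihood up to a constant
    w = np.exp(logw - logw.max()); w /= w.sum()
    vol_filtered[t] = np.exp(w @ particles / 2)                  # filtered volatility estimate
    particles = particles[rng.choice(N, size=N, p=w)]            # multinomial resampling
print("last filtered volatility:", vol_filtered[-1], "true:", np.exp(h[-1] / 2))
```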
