Similar documents
20 similar documents found
1.
Density forecasts for weather variables are useful for the many industries exposed to weather risk. Weather ensemble predictions are generated from atmospheric models and consist of multiple future scenarios for a weather variable. The distribution of the scenarios can be used as a density forecast, which is needed for pricing weather derivatives. We consider one- to ten-day-ahead density forecasts provided by temperature ensemble predictions. More specifically, we evaluate forecasts of the mean and quantiles of the density. The mean of the ensemble scenarios is the most accurate forecast for the mean of the density. We use quantile regression to debias the quantiles of the distribution of the ensemble scenarios. The resultant quantile forecasts compare favourably with those from a GARCH model. These results indicate the strong potential for the use of ensemble prediction in temperature density forecasting. Copyright © 2004 John Wiley & Sons, Ltd.
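The evaluation idea can be sketched in a few lines. This is a minimal illustration with hypothetical scenario values: the ensemble mean serves as the point forecast, an empirical quantile of the scenarios serves as a raw quantile forecast, and the pinball (quantile) loss scores it. The paper's debiasing step via quantile regression is only noted in a comment, not implemented.

```python
import statistics

def empirical_quantile(sample, tau):
    """Linear-interpolation quantile of a sample (inclusive method)."""
    s = sorted(sample)
    pos = tau * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

def pinball_loss(y, q_forecast, tau):
    """Quantile (pinball) loss: minimized in expectation by the true tau-quantile."""
    diff = y - q_forecast
    return tau * diff if diff >= 0 else (tau - 1.0) * diff

# Hypothetical five-member temperature ensemble (degrees C) for one day ahead.
ensemble = [14.2, 15.1, 15.8, 16.4, 17.9]

# The ensemble mean is used as the forecast of the density mean.
mean_forecast = statistics.mean(ensemble)

# Raw 90% quantile of the scenario distribution; in the paper such quantiles
# are then debiased by quantile regression before comparison with GARCH.
q90 = empirical_quantile(ensemble, 0.9)

observed = 16.0
loss = pinball_loss(observed, q90, 0.9)
```

Averaging the pinball loss over many forecast days, one value of tau at a time, gives the quantile-forecast comparison the abstract describes.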

2.
We extend the analysis of Christoffersen and Diebold (1998) on long-run forecasting in cointegrated systems to multicointegrated systems. For the forecast evaluation we consider several loss functions, each of which has a particular interpretation in the context of stock-flow models where multicointegration typically occurs. A loss function based on a standard mean square forecast error (MSFE) criterion focuses on the forecast errors of the flow variables alone. Likewise, a loss function based on the triangular representation of cointegrated systems (suggested by Christoffersen and Diebold) considers forecast errors associated with changes in both stock (modelled through the cointegrating restrictions) and flow variables. We suggest a new loss function based on the triangular representation of multicointegrated systems which further penalizes deviations from the long-run relationship between the levels of stock and flow variables as well as changes in the flow variables. Among other things, we show that if one is concerned with all possible long-run relations between stock and flow variables, this new loss function entails high and increasing forecasting gains compared to both the standard MSFE criterion and Christoffersen and Diebold's criterion. This paper demonstrates the importance of carefully selecting loss functions in forecast evaluation of models involving stock and flow variables. Copyright © 2004 John Wiley & Sons, Ltd.

3.
We propose an ensemble of long–short-term memory (LSTM) neural networks for intraday stock predictions, using a large variety of technical analysis indicators as network inputs. The proposed ensemble operates in an online way, weighting the individual models proportionally to their recent performance, which allows us to deal with possible nonstationarities in an innovative way. The performance of the models is measured by area under the curve of the receiver operating characteristic. We evaluate the predictive power of our model on several US large-cap stocks and benchmark it against lasso and ridge logistic classifiers. The proposed model is found to perform better than the benchmark models or equally weighted ensembles.
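The online weighting scheme can be sketched as follows. The member forecasters here are placeholder probability outputs (the paper uses LSTMs scored by ROC AUC), and a rolling hit rate stands in for "recent performance"; class and parameter names are hypothetical.

```python
from collections import deque

class OnlineWeightedEnsemble:
    """Combine member probability forecasts with weights proportional to
    each member's rolling accuracy over the most recent `window` steps."""

    def __init__(self, n_members, window=50):
        self.hits = [deque(maxlen=window) for _ in range(n_members)]

    def weights(self):
        scores = [sum(h) / len(h) if h else 1.0 for h in self.hits]
        total = sum(scores)
        if total == 0.0:                      # no member has scored yet: equal weights
            return [1.0 / len(scores)] * len(scores)
        return [s / total for s in scores]

    def predict(self, member_probs):
        return sum(w * p for w, p in zip(self.weights(), member_probs))

    def update(self, member_probs, outcome):
        """Record whether each member's thresholded call matched the realized outcome."""
        for h, p in zip(self.hits, member_probs):
            h.append(1.0 if (p >= 0.5) == bool(outcome) else 0.0)

ens = OnlineWeightedEnsemble(n_members=2, window=3)
ens.update([0.9, 0.2], outcome=1)   # member 0 correct, member 1 wrong
ens.update([0.8, 0.3], outcome=1)   # again
combined = ens.predict([0.7, 0.6])  # weight has shifted toward member 0
```

Because the weights are recomputed from a short rolling window at every step, a member that deteriorates under a regime change loses influence quickly, which is how the scheme copes with nonstationarity.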

4.
Many applications in science involve finding estimates of unobserved variables from observed data, by combining model predictions with observations. The sequential Monte Carlo (SMC) is a well-established technique for estimating the distribution of unobserved variables that are conditional on current observations. While the SMC is very successful at estimating the first central moments, estimating the extreme quantiles of a distribution via the current SMC methods is computationally very expensive. The purpose of this paper is to develop a new framework using probability distortion. We use an SMC with distorted weights in order to make computationally efficient inferences about tail probabilities of future interest rates using the Cox–Ingersoll–Ross (CIR) model, as well as with an observed yield curve. We show that the proposed method yields acceptable estimates about tail quantiles at a fraction of the computational cost of the full Monte Carlo.
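For context, the plain Monte Carlo baseline that the distorted-weight SMC is designed to beat looks like this: simulate many CIR paths and count tail exceedances. Parameter values are illustrative, the discretization is a standard full-truncation Euler scheme, and the paper's weight-distortion step is not shown.

```python
import math
import random

def simulate_cir(r0, kappa, theta, sigma, dt, n_steps, rng):
    """Full-truncation Euler scheme for the CIR short-rate model:
    dr = kappa*(theta - r) dt + sigma*sqrt(r) dW."""
    r, path = r0, [r0]
    for _ in range(n_steps):
        r_pos = max(r, 0.0)               # truncation keeps sqrt() well defined
        r += kappa * (theta - r_pos) * dt \
             + sigma * math.sqrt(r_pos * dt) * rng.gauss(0.0, 1.0)
        path.append(r)
    return path

rng = random.Random(42)
terminal = [simulate_cir(0.03, 0.5, 0.04, 0.1, 1 / 252, 252, rng)[-1]
            for _ in range(2000)]

# Crude tail estimate: probability that the short rate exceeds 6% in one year.
tail_prob = sum(r > 0.06 for r in terminal) / len(terminal)
```

The inefficiency the abstract refers to is visible here: only the small fraction of paths landing in the tail informs `tail_prob`, so sharp tail estimates need very many simulations, which is what the distorted weights avoid.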

5.
This study compares the forecasting performance of a structural exchange rate model that combines the purchasing power parity condition with the interest rate differential in the long run, with some alternative exchange rate models. The analysis is applied to the Norwegian exchange rate. The long-run equilibrium relationship is embedded in a parsimonious representation for the exchange rate. The structural exchange rate representation is stable over the sample and outperforms a random walk in an out-of-sample forecasting exercise at one to four horizons. When the interest rate differential is excluded from the long-run relationship, however, the structural model no longer outperforms a random walk. Copyright © 2006 John Wiley & Sons, Ltd.

6.
We examine consistency properties of the exchange rate expectation formation process of short-run and long-run forecasts in the dollar/euro and yen/dollar markets. Applying nonlinear consistency restrictions, we show that in a simple expectation formation structure short-run forecasts are indeed inconsistent with long-run predictions. Moreover, we establish a 'twist' in the dollar/euro expectation formation process: market participants expect bandwagon effects in the short run, while they have stabilizing expectations in their long-run forecasts. Applying a panel probit analysis, we find that this twisting behavior is more likely to occur in periods of excess exchange rate volatility. Copyright © 2011 John Wiley & Sons, Ltd.

7.
This paper examines the relationship between stock prices and commodity prices and whether this can be used to forecast stock returns. As both prices are linked to expected future economic performance they should exhibit a long-run relationship. Moreover, changes in sentiment towards commodity investing may affect the nature of the response to disequilibrium. Results support cointegration between stock and commodity prices, while Bai–Perron tests identify breaks in the forecast regression. Forecasts are computed using a standard fixed (static) in-sample/out-of-sample approach and by both recursive and rolling regressions, which incorporate the effects of changing forecast parameter values. A range of model specifications and forecast metrics are used. The historical mean model outperforms the forecast models in both the static and recursive approaches. However, in the rolling forecasts, those models that incorporate information from the long-run stock price/commodity price relationship outperform both the historical mean and other forecast models. Of note, the historical mean still performs relatively well compared to standard forecast models that include the dividend yield and short-term interest rates but not the stock/commodity price ratio. Copyright © 2014 John Wiley & Sons, Ltd.
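The three forecasting schemes named above (fixed, recursive, rolling) differ only in how the estimation sample evolves as the forecast origin moves forward. A hypothetical helper makes the distinction concrete; each function returns (estimation-index-range, forecast-index) pairs.

```python
def fixed_windows(n_obs, split):
    """Static scheme: estimate once on the first `split` observations,
    then forecast every later point with those fixed parameters."""
    return [(range(0, split), t) for t in range(split, n_obs)]

def recursive_windows(n_obs, split):
    """Recursive (expanding) scheme: re-estimate on all data up to t-1,
    so the sample grows with each forecast."""
    return [(range(0, t), t) for t in range(split, n_obs)]

def rolling_windows(n_obs, width):
    """Rolling scheme: fixed-width window, so the oldest observations
    drop out as new ones arrive -- parameters can drift with breaks."""
    return [(range(t - width, t), t) for t in range(width, n_obs)]

# With 10 observations and a split/width of 6, all schemes produce 4 forecasts,
# but their final estimation samples differ in size and starting point.
fx = fixed_windows(10, 6)
rc = recursive_windows(10, 6)
rl = rolling_windows(10, 6)
```

The rolling scheme is the one that lets parameter estimates adapt after the Bai–Perron breaks, which is consistent with the abstract's finding that only the rolling forecasts favour the stock/commodity-ratio models.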

8.
This paper explores the ability of factor models to predict the dynamics of US and UK interest rate swap spreads within a linear and a non-linear framework. We reject linearity for the US and UK swap spreads in favour of a regime-switching smooth transition vector autoregressive (STVAR) model, where the switching between regimes is controlled by the slope of the US term structure of interest rates. We compare the ability of the STVAR model to predict swap spreads with that of a non-linear nearest-neighbours model as well as that of linear AR and VAR models. We find some evidence that the non-linear models predict better than the linear ones. At short horizons, the nearest-neighbours (NN) model predicts better than the STVAR model for US swap spreads in periods of increasing risk and for UK swap spreads in periods of decreasing risk. At long horizons, the STVAR model increases its forecasting ability over the linear models, whereas the NN model does not outperform the rest of the models. Copyright © 2007 John Wiley & Sons, Ltd.
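The smooth-transition mechanism is compact enough to write down. Here G is the standard logistic transition function, with the slope of the term structure as the transition variable s; function and parameter names are generic, not the authors' notation.

```python
import math

def transition_weight(s, gamma, c):
    """Logistic transition G(s) in (0, 1). G near 0 or 1 recovers one of the
    two limiting linear regimes; gamma sets how sharply the model switches
    around the threshold c."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def stvar_fitted(s, regime1_pred, regime2_pred, gamma=5.0, c=0.0):
    """One-step STVAR fitted value: a G-weighted mix of the predictions
    from the two linear VAR regimes."""
    g = transition_weight(s, gamma, c)
    return (1.0 - g) * regime1_pred + g * regime2_pred
```

When the transition variable sits far below the threshold the fitted value collapses to the first regime's prediction, and far above it to the second; in between, the model blends the two smoothly rather than switching abruptly.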

9.
We develop a semi-structural model for forecasting inflation in the UK in which the New Keynesian Phillips curve (NKPC) is augmented with a time series model for marginal cost. By combining structural and time series elements we hope to reap the benefits of both approaches, namely the relatively better forecasting performance of time series models in the short run and a theory-consistent economic interpretation of the forecast coming from the structural model. In our model we consider the hybrid version of the NKPC and use an open-economy measure of marginal cost. The results suggest that our semi-structural model performs better than a random-walk forecast and most of the competing models (conventional time series models and strictly structural models) only in the short run (one quarter ahead) but it is outperformed by some of the competing models at medium and long forecast horizons (four and eight quarters ahead). In addition, the open-economy specification of our semi-structural model delivers more accurate forecasts than its closed-economy alternative at all horizons. Copyright © 2014 John Wiley & Sons, Ltd.

10.
This paper investigates the time-varying volatility patterns of some major commodities, as well as the potential factors that drive their long-term volatility component. For this purpose, we make use of a recently proposed generalized autoregressive conditional heteroskedasticity–mixed data sampling (GARCH-MIDAS) approach, which allows us to examine the role of economic and financial variables observed at different frequencies. Using commodity futures for crude oil (WTI and Brent), gold, silver and platinum, as well as a commodity index, our results show the necessity of disentangling the short-term and long-term components in modeling and forecasting commodity volatility. They also indicate that the long-term volatility of most commodity futures is significantly driven by the level of global real economic activity, as well as changes in consumer sentiment, industrial production, and economic policy uncertainty. However, the forecasting results are not alike across commodity futures, as no single model fits all commodities.
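The two-component structure of GARCH-MIDAS can be sketched as follows. This is not the authors' estimation code: parameter values, the single low-frequency driver, and the beta-lag weight form are all illustrative. The conditional variance is the product of a slow long-run component tau (driven by MIDAS-weighted low-frequency data) and a unit-mean short-run GARCH component g.

```python
import math

def beta_lag_weights(n_lags, w2=5.0):
    """Decaying MIDAS beta-lag weights, normalized to sum to one."""
    raw = [(1.0 - k / n_lags) ** (w2 - 1.0) for k in range(n_lags)]
    total = sum(raw)
    return [r / total for r in raw]

def garch_midas_path(returns, low_freq_x, m=-9.0, theta=0.3,
                     alpha=0.06, beta=0.91, n_lags=3):
    """Variance path h_t = tau * g_t: tau is set by the n_lags most recent
    low-frequency observations, g_t follows GARCH(1,1) dynamics with a
    unit unconditional mean (intercept 1 - alpha - beta)."""
    w = beta_lag_weights(n_lags)
    tau = math.exp(m + theta * sum(wi * x for wi, x in
                                   zip(w, reversed(low_freq_x[-n_lags:]))))
    g, path = 1.0, []
    for r in returns:
        path.append(tau * g)
        g = (1.0 - alpha - beta) + alpha * r * r / tau + beta * g
    return path

weights = beta_lag_weights(3)
variances = garch_midas_path([0.01, -0.02, 0.005], [1.2, 1.1, 0.9])
```

Because tau is held fixed within each low-frequency period while g updates every day, the model lets monthly or quarterly drivers (sentiment, industrial production, policy uncertainty) move the level around which daily volatility clusters.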

11.
In this paper, we put dynamic stochastic general equilibrium (DSGE) forecasts in competition with factor forecasts. We focus on these two models since they nicely represent the two opposing forecasting philosophies: the DSGE model has a strong theoretical economic background, while the factor model is mainly data-driven. We show that incorporating a large information set using factor analysis can indeed improve short-horizon predictive ability, as claimed by many researchers. The micro-founded DSGE model can provide reasonable forecasts for US inflation, especially at growing forecast horizons. To a certain extent, our results are consistent with the prevailing view that simple time series models should be used in short-horizon forecasting and structural models in long-horizon forecasting. Our paper compares state-of-the-art data-driven and theory-based modelling in a rigorous manner. Copyright © 2008 John Wiley & Sons, Ltd.

12.
Do long-run equilibrium relations suggested by economic theory help to improve the forecasting performance of a cointegrated vector error correction model (VECM)? In this paper we try to answer this question in the context of a two-country model developed for the Canadian and US economies. We compare the forecasting performance of the exactly identified cointegrated VECMs to the performance of the over-identified VECMs with the long-run theory restrictions imposed. We allow for model uncertainty and conduct this comparison for every possible combination of the cointegration ranks of the Canadian and US models. We show that the over-identified structural cointegrated models generally outperform the exactly identified models in forecasting Canadian macroeconomic variables. We also show that the pooled forecasts generated from the over-identified models beat most of the individual exactly identified and over-identified models as well as the VARs in levels and in differences. Copyright © 2011 John Wiley & Sons, Ltd.

13.
Asymmetry has been well documented in the business cycle literature. The asymmetric business cycle suggests that major macroeconomic series, such as a country's unemployment rate, are non-linear and, therefore, the use of linear models to explain their behaviour and forecast their future values may not be appropriate. Many researchers have focused on providing evidence for the non-linearity in the unemployment series. Only recently have there been some developments in applying non-linear models to estimate and forecast unemployment rates. A major concern of non-linear modelling is the model specification problem; it is very hard to test all possible non-linear specifications, and to select the most appropriate specification for a particular model. Artificial neural network (ANN) models provide a solution to the difficulty of forecasting unemployment over the asymmetric business cycle. ANN models are non-linear, do not rely upon the classical regression assumptions, are capable of learning the structure of all kinds of patterns in a data set with a specified degree of accuracy, and can then use this structure to forecast future values of the data. In this paper, we apply two ANN models, a back-propagation model and a generalized regression neural network model, to estimate and forecast post-war aggregate unemployment rates in the USA, Canada, UK, France and Japan. We compare the out-of-sample forecast results obtained by the ANN models with those obtained by several linear and non-linear time series models currently used in the literature. It is shown that the artificial neural network models are able to forecast the unemployment series as well as, and in some cases better than, the other univariate econometric time series models in our test. Copyright © 2004 John Wiley & Sons, Ltd.
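Of the two network types, the generalized regression neural network admits a particularly compact sketch: its prediction is a Gaussian-kernel (Nadaraya–Watson) weighted average of the training targets. The scalar input and the data below are hypothetical stand-ins for the paper's lagged unemployment regressors.

```python
import math

def grnn_predict(x_train, y_train, x, sigma=0.5):
    """GRNN prediction for a scalar input: Gaussian-kernel weighted average
    of the training targets; sigma is the smoothing bandwidth."""
    weights = [math.exp(-((x - xi) ** 2) / (2.0 * sigma ** 2)) for xi in x_train]
    return sum(w * y for w, y in zip(weights, y_train)) / sum(weights)

# Hypothetical pairs: current unemployment rate -> next-period rate.
x_train = [5.0, 5.5, 6.0, 6.5]
y_train = [5.2, 5.8, 6.1, 6.4]
forecast = grnn_predict(x_train, y_train, 5.75)
```

Because the fit is a smooth local average rather than a fixed functional form, the GRNN sidesteps the specification problem the abstract highlights: no particular non-linear parametric shape has to be chosen in advance.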

14.
This paper investigates inference and volatility forecasting using a Markov switching heteroscedastic model with a fat-tailed error distribution to analyze asymmetric effects on both the conditional mean and conditional volatility of financial time series. The motivation for extending the Markov switching GARCH model, previously developed to capture mean asymmetry, is that the switching variable, assumed to be a first-order Markov process, is unobserved. The proposed model extends this work to incorporate Markov switching in the mean and variance simultaneously. Parameter estimation and inference are performed in a Bayesian framework via a Markov chain Monte Carlo scheme. We compare competing models using Bayesian forecasting in a comparative value-at-risk study. The proposed methods are illustrated using both simulations and eight international stock market return series. The results generally favor the proposed double Markov switching GARCH model with an exogenous variable. Copyright © 2008 John Wiley & Sons, Ltd.
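The kind of data-generating process such models target can be simulated in a few lines. This sketch uses illustrative parameters, Gaussian rather than fat-tailed innovations, and lets only the variance intercept switch; the paper's model also switches the mean and is estimated by MCMC rather than simulated forward.

```python
import random

def simulate_ms_garch(n, rng, p_stay=(0.98, 0.95),
                      omega=(0.05, 0.40), alpha=0.08, beta=0.90):
    """Simulate returns from a GARCH(1,1) whose intercept omega[s] follows
    an unobserved two-state first-order Markov chain with staying
    probabilities p_stay; state 1 is the high-volatility regime."""
    s = 0
    h = omega[0] / (1.0 - alpha - beta)   # start at regime-0 stationary variance
    eps = 0.0
    states, returns = [], []
    for _ in range(n):
        if rng.random() > p_stay[s]:      # Markov switch between regimes
            s = 1 - s
        h = omega[s] + alpha * eps * eps + beta * h
        eps = (h ** 0.5) * rng.gauss(0.0, 1.0)
        states.append(s)
        returns.append(eps)
    return states, returns

rng = random.Random(7)
states, returns = simulate_ms_garch(500, rng)
```

The inference problem the abstract describes is the reverse direction: given only `returns`, recover the parameters and the hidden `states`, which is what the Bayesian MCMC scheme does.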

15.
This paper deals with the economic interpretation of the unobserved components model, in the light of the apparent problem, posed by previous work, that several commonly practiced methodologies seem to lead to very different models of certain economic variables. A detailed empirical analysis is carried out to show how failure to obtain quasi-orthogonal components can seriously bias the interpretation of some decomposition procedures. Finally, the forecasting performance (in both the short and long run) of these decomposition models is analyzed in comparison with other alternatives.

16.
This paper proposes a parsimonious threshold stochastic volatility (SV) model for financial asset returns. Instead of imposing a threshold value on the dynamics of the latent volatility process of the SV model, we assume that the innovation of the mean equation follows a threshold distribution in which the mean innovation switches between two regimes. In our model, the threshold is treated as an unknown parameter. We show that the proposed threshold SV model can not only capture the time-varying volatility of returns, but can also accommodate the asymmetric shape of the conditional distribution of the returns. Parameter estimation is carried out by using Markov chain Monte Carlo methods. For model selection and volatility forecasting, an auxiliary particle filter technique is employed to approximate the filter and prediction distributions of the returns. Several experiments are conducted to assess the robustness of the proposed model and estimation methods. In the empirical study, we apply our threshold SV model to three return time series. The empirical analysis results show that the threshold parameter has a non-zero value and the mean innovations belong to two distinct regimes. We also find that the model with an unknown threshold parameter value consistently outperforms the model with a known threshold parameter value. Copyright © 2016 John Wiley & Sons, Ltd.

17.
An Erratum has been published for this article in Journal of Forecasting 23(6): 461 (2004). This paper examines the problem of intrusion in computer systems, which causes major breaches or allows unauthorized information manipulation. A new intrusion-detection system using Bayesian multivariate regression is proposed to predict such unauthorized invasions before they occur and to take further action. We develop and use a multivariate dynamic linear model based on a unique approach that leaves the distribution of the unknown observational variance matrix unspecified. The result is simultaneous forecasting, free of the Wishart limitations, that proves faster and more reliable. Our proposed system uses software agent technology. The distributed software agent environment places an agent in each of the computer system's workstations and creates a profile for each user. Every user's profile is monitored by the agent system, and prediction is possible according to our statistical model. Implementation aspects are discussed using real data, and an assessment of the model is provided. Copyright © 2002 John Wiley & Sons, Ltd.

18.
We introduce an approximate dynamic factor model for modeling and forecasting large panels of realized volatilities. Since the model is estimated by means of principal components and low-dimensional maximum likelihood, it does not suffer from the curse of dimensionality. We apply the model to a panel of 90 daily realized volatilities pertaining to the S&P 100 from January 2001 to December 2008. Results show that our model is able to capture the stylized facts of panels of volatilities (comovements, clustering, long memory, dynamic volatility, skewness and heavy tails), and that it performs fairly well in forecasting, in particular in periods of turmoil, in which it outperforms standard univariate benchmarks. Copyright © 2015 John Wiley & Sons, Ltd.

19.
Most pricing and hedging models rely on the long-run temporal stability of a sample covariance matrix. Using a large dataset of equity prices from four countries—the USA, UK, Japan and Germany—we test the stability of realized sample covariance matrices using two complementary approaches: a standard covariance equality test and a novel matrix loss function approach. Our results present a pessimistic outlook for equilibrium models that require the covariance of assets returns to mean revert in the long run. We find that, while a daily first-order Wishart autoregression is the best covariance matrix-generating candidate, this non-mean-reverting process cannot capture all of the time series variation in the covariance-generating process. Copyright © 2014 John Wiley & Sons, Ltd.

20.
This paper develops and estimates a dynamic factor model in which estimates for unobserved monthly US Gross Domestic Product (GDP) are consistent with observed quarterly data. In contrast to existing approaches, the quarterly averages of our monthly estimates are exactly equal to the Bureau of Economic Analysis (BEA) quarterly estimates. The relationship between our monthly estimates and the quarterly data is therefore the same as the relationship between quarterly and annual data. The study makes use of Bayesian Markov chain Monte Carlo and data augmentation techniques to simulate values for the logarithms on monthly US GDP. The imposition of the exact linear quarterly constraint produces a non-standard distribution, necessitating the implementation of a Metropolis simulation step in the estimation. Our methodology can be easily generalized to cases where the variable of interest is monthly GDP and in such a way that the final results incorporate the statistical uncertainty associated with the monthly GDP estimates. We provide an example by incorporating our monthly estimates into a Markov switching model of the US business cycle. Copyright © 2016 John Wiley & Sons, Ltd.
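The quarterly constraint can be illustrated with a simple proportional benchmarking step: rescale each quarter's three monthly estimates so that their average exactly equals the observed quarterly figure. This deterministic helper is only a stand-in for the paper's approach, which enforces the exact linear constraint inside a Bayesian sampler so that uncertainty is carried through.

```python
def benchmark_to_quarterly(monthly, quarterly):
    """Rescale each quarter's three monthly estimates so that their average
    exactly equals the observed quarterly value."""
    if len(monthly) != 3 * len(quarterly):
        raise ValueError("need exactly three monthly estimates per quarter")
    out = []
    for q, target in enumerate(quarterly):
        block = monthly[3 * q:3 * q + 3]
        factor = target / (sum(block) / 3.0)   # ratio of target to current average
        out.extend(m * factor for m in block)
    return out

# Hypothetical monthly GDP estimates and the quarterly figure they must average to.
adjusted = benchmark_to_quarterly([100.0, 102.0, 104.0], [103.0])
```

After the adjustment the within-quarter profile of the monthly path is preserved (each month is scaled by the same factor), while the quarterly average matches the published number exactly, which is the consistency property the abstract emphasizes.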
