Similar Documents
20 similar documents found
1.
A unified method to detect and handle innovational and additive outliers, and permanent and transient level changes has been presented by R. S. Tsay. N. S. Balke has found that the presence of level changes may lead to misidentification and loss of test power, and suggests augmenting Tsay's procedure by conducting an additional disturbance search based on a white-noise model. While Tsay allows level changes to be either permanent or transient, Balke considers only the former type. Based on simulated series with transient level changes this paper investigates how Balke's white-noise model performs both when transient change is omitted from the model specification and when it is included. Our findings indicate that the alleged misidentification of permanent level changes may be influenced by the restrictions imposed by Balke. But when these restrictions are removed, Balke's procedure outperforms Tsay's in detecting changes in the data-generating process. Copyright © 2000 John Wiley & Sons, Ltd.

2.
We consider the linear time-series model y_t = d_t + u_t (t = 1, ..., n), where d_t is the deterministic trend and u_t the stochastic term, which follows an AR(1) process u_t = θu_{t−1} + ε_t with normal innovations ε_t. Various assumptions about the start-up will be made. Our main interest lies in the behaviour of the l-period-ahead forecast y_{n+l} near θ = 1. Unlike in other studies of the AR(1) unit root process, we do not wish to ask whether θ = 1, but are concerned with the behaviour of the forecast estimate near and at θ = 1. For this purpose we define the sth-order (s = 1, 2) sensitivity measure λ_l(s) of the forecast y_{n+l} near θ = 1. This measures the sensitivity of the forecast at the unit root. In this study we consider two deterministic trends: the constant trend d_t = α and the linear trend d_t = α + βt. The forecast is the best linear unbiased forecast. We show that, when d_t = α, the number of observations has no effect on forecast sensitivity. When the deterministic trend is linear, the sensitivity is zero. We also develop a large-sample procedure to measure the forecast sensitivity when we are uncertain whether to include the linear trend. Our analysis suggests that, depending on the initial conditions, it is better to include a linear trend for reduced sensitivity of the medium-term forecast. Copyright © 2001 John Wiley & Sons, Ltd.
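Conditional on known parameters, the l-step-ahead forecast of a trend-plus-AR(1) model carries the last stochastic deviation forward damped by θ^l. A minimal Python sketch of this rule, with the first-order sensitivity at θ = 1 approximated by a finite difference (all function names and numbers below are illustrative, not the paper's):

```python
def ar1_forecast(y, d, theta, l):
    """l-step-ahead forecast of y_t = d_t + u_t with AR(1) noise u_t.

    y: observed series of length n; d: deterministic trend for t = 1..n+l
    (d[n-1] is the trend at the last observation, d[n+l-1] at the horizon).
    """
    n = len(y)
    u_n = y[-1] - d[n - 1]               # last stochastic deviation
    return d[n + l - 1] + theta**l * u_n

# constant trend d_t = 2.0; made-up observations
y = [2.0, 2.5, 2.2, 3.0]
d = [2.0] * 6                            # trend values for t = 1..6 (n = 4, l up to 2)
f_unit = ar1_forecast(y, d, theta=1.0, l=2)   # unit root: u_n carried forward undamped
f_stat = ar1_forecast(y, d, theta=0.5, l=2)   # stationary: u_n shrunk by theta**l

# first-order sensitivity of the 2-step forecast at theta = 1 (central difference)
h = 1e-5
sens = (ar1_forecast(y, d, 1 + h, 2) - ar1_forecast(y, d, 1 - h, 2)) / (2 * h)
```

At θ = 1 the deviation u_n never dies out, so the forecast's derivative in θ (here l·u_n = 2·1.0) grows with the horizon, which is what the sensitivity measure captures.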

3.
It often occurs that no model may be exactly right, and that different portions of the data may favour different models. The purpose of this paper is to propose a new procedure for the detection of regime switches between stationary and nonstationary processes in economic time series and to show its usefulness in economic forecasting. In the proposed procedure, time series observations are divided into several segments, and a stationary or nonstationary autoregressive model is fitted to each segment. The goodness of fit of the global model composed of these local models is evaluated using the corresponding information criterion, and the division which minimizes the information criterion defines the best model. Simulation and forecasting results show the efficacy and limitations of the proposed procedure. Copyright © 2005 John Wiley & Sons, Ltd.
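The segment-and-score idea can be sketched as follows: fit an AR(1) to each candidate segment by least squares and choose the division that minimizes a Gaussian AIC summed over segments. This is a toy version only (the paper also distinguishes stationary from nonstationary local models and searches divisions systematically); all names and data below are illustrative:

```python
import math

def ar1_fit_rss(seg):
    """OLS fit of x_t = phi * x_{t-1} + e_t on one segment; returns (phi, rss)."""
    x, y = seg[:-1], seg[1:]
    phi = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
    rss = sum((b - phi * a) ** 2 for a, b in zip(x, y))
    return phi, rss

def aic_of_division(series, breakpoints):
    """Gaussian AIC of a piecewise-AR(1) model for a given set of breakpoints."""
    cuts = [0] + breakpoints + [len(series)]
    aic = 0.0
    for lo, hi in zip(cuts, cuts[1:]):
        seg = series[lo:hi]
        m = len(seg) - 1                       # number of one-step residuals
        _, rss = ar1_fit_rss(seg)
        aic += m * math.log(rss / m) + 2 * 2   # 2 params per segment: phi, sigma^2
    return aic

# toy series: near-stationary first half, drifting (near-unit-root) second half
series = [0.1, -0.2, 0.15, -0.1, 0.05, 1.0, 2.1, 3.05, 4.2, 5.1]
best = min([[], [5]], key=lambda bp: aic_of_division(series, bp))
```

Here the division at t = 5 beats the single global model because each local AR(1) fits its own regime far better than one model fits both.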

4.
Exponential smoothing methods do not adapt well to a major level or slope change. In this paper, Bayesian statistical theory is applied to the dynamic linear model, altered by inclusion of dummy variables, and statistics are derived to detect such changes and to estimate both the change-point and the size. The paper also gives test statistics for such problems related to exponential smoothing. The statistics are simple functions of exponentially weighted moving averages of the forecast errors, using the same discount factor used in the exponential smoothing. Gardner has derived an approximate test statistic to detect a mean change in the constant mean model. When the present results are applied to this model they give the exact statistic.
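A monitoring statistic built from exponentially weighted moving averages of forecast errors, in the spirit described above, can be sketched as follows. This is a Trigg-style tracking signal using the same discount factor as the smoothing, not the paper's exact derived statistics:

```python
def ses_with_monitor(series, alpha=0.3):
    """Simple exponential smoothing plus an EWMA monitor of one-step errors.

    Returns the final level and the tracking signal |smoothed error| / MAD,
    which jumps toward 1 after a sustained level change (illustrative
    Trigg-style signal; the paper's exact statistics differ but are built
    from the same exponentially weighted error averages).
    """
    level = series[0]
    e_bar = 0.0   # EWMA of signed one-step errors
    mad = 1e-8    # EWMA of absolute one-step errors (tiny start avoids 0/0)
    signals = []
    for y in series[1:]:
        err = y - level
        e_bar = alpha * err + (1 - alpha) * e_bar
        mad = alpha * abs(err) + (1 - alpha) * mad
        signals.append(abs(e_bar) / mad)
        level += alpha * err
    return level, signals

# flat series, then an abrupt level change at t = 10
data = [10.0] * 10 + [15.0] * 10
final_level, sig = ses_with_monitor(data)
```

Before the change the signal stays at zero; at the change-point the signed and absolute error averages coincide and the signal spikes to roughly 1, flagging the shift.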

5.
The bootstrap is not straightforward to implement in time series models, because the observations are not independent. One alternative is to bootstrap the residuals in order to obtain bootstrap series and use these for inference. This work deals with the problem of assessing the accuracy of hyperparameter estimates in structural models. We study the simplest case, the local level model, where the hyperparameters are the variances of the disturbance terms. As their distribution is not known, we employ the bootstrap to approximate the true distribution, using both parametric and non-parametric approaches. Bootstrap standard deviations are computed and their performance compared to the asymptotic and empirical standard errors, calculated using a Monte Carlo simulation. We also build confidence intervals for the hyperparameters using four bootstrap methods, and the results are compared by means of the length, shape and coverage probabilities of the intervals. Copyright © 2002 John Wiley & Sons, Ltd.
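The residual-bootstrap recipe (fit, resample or redraw residuals, rebuild the series, re-estimate) is illustrated below on a deliberately simpler target, the standard error of an AR(1) coefficient rather than the local level model's variance hyperparameters, since the latter would require a Kalman filter. Both the non-parametric and parametric variants follow the same pattern; all names and seeds are illustrative:

```python
import random
import statistics

def fit_ar1(x):
    """OLS estimate of phi in x_t = phi * x_{t-1} + e_t, plus residuals."""
    lag, cur = x[:-1], x[1:]
    phi = sum(a * b for a, b in zip(lag, cur)) / sum(a * a for a in lag)
    resid = [b - phi * a for a, b in zip(lag, cur)]
    return phi, resid

def bootstrap_se(x, B=500, parametric=False, seed=0):
    """Residual-bootstrap standard error of the AR(1) coefficient.

    Non-parametric: resample the fitted residuals with replacement.
    Parametric: draw innovations from N(0, sigma_hat^2).
    """
    rng = random.Random(seed)
    phi, resid = fit_ar1(x)
    sigma = statistics.pstdev(resid)
    phis = []
    for _ in range(B):
        xb = [x[0]]
        for _t in range(1, len(x)):
            eps = rng.gauss(0, sigma) if parametric else rng.choice(resid)
            xb.append(phi * xb[-1] + eps)
        phis.append(fit_ar1(xb)[0])
    return statistics.pstdev(phis)

# simulate an AR(1) with phi = 0.6, then bootstrap the coefficient's SE
rng = random.Random(42)
x = [0.0]
for _ in range(199):
    x.append(0.6 * x[-1] + rng.gauss(0, 1))
se_np = bootstrap_se(x, parametric=False)
se_p = bootstrap_se(x, parametric=True)
```

With n = 200 both variants should land near the asymptotic value sqrt((1 − 0.36)/200) ≈ 0.057, and comparing them against asymptotic and Monte Carlo standard errors is exactly the kind of check the paper performs for the local level hyperparameters.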

6.
This paper considers the problems of statistically analysing the levels of financial time series rather than their differences, which are often equivalent to returns and which are traditionally analysed in econometric modelling. This focus on differences is a consequence of the inherent nonstationarity of the levels, and hence analysing the latter requires introducing an alternative framework for modelling nonstationary behaviour. We do this by considering randomized unit root processes, arguing that these can have a natural interpretation in the financial context. The paper thus develops methods for testing for randomized unit roots and for modelling such processes. It then applies these techniques to various financial time series, so as to ascertain their potential usefulness, particularly for forecasting.

7.
There is considerable interest in the index of industrial production (IIP) as an indicator of the state of the UK's industrial base and, more generally, as a leading economic indicator. However, this index, in common with a number of key macroeconomic time series, is subject to revision as more information becomes available. This raises the problem of forecasting the final vintage of data on IIP. We construct a state space model to solve this problem which incorporates bias adjustments, a model of the measurement error process, and a dynamic model for the final vintage of IIP. Application of the Kalman filter produces an optimal forecast of the final vintage of data.
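The core filtering step, estimating an underlying "final" level from noisy preliminary measurements, can be sketched with a scalar Kalman filter for a local level model. This is a generic illustration, not the paper's full state space model with bias adjustments and a measurement error process; the variances and data are made up:

```python
def kalman_local_level(obs, q, r, a0=0.0, p0=1e6):
    """Scalar Kalman filter for a local level model:
        state:       mu_t = mu_{t-1} + w_t,   Var(w_t) = q
        measurement: y_t  = mu_t + v_t,       Var(v_t) = r
    Starts from a diffuse prior (large p0) and returns the filtered state
    estimates, i.e. the optimal estimate of the true level given the noisy
    measurements seen so far.
    """
    a, p = a0, p0
    filtered = []
    for y in obs:
        p = p + q                 # predict: state variance grows by q
        k = p / (p + r)           # Kalman gain
        a = a + k * (y - a)       # update with the new measurement
        p = (1 - k) * p
        filtered.append(a)
    return filtered

# preliminary measurements of a true level near 5.0, each with error
obs = [5.3, 4.8, 5.1, 4.9, 5.2, 5.0]
est = kalman_local_level(obs, q=0.01, r=0.25)
```

With the diffuse prior the first estimate essentially equals the first observation; thereafter the gain settles down and the filter averages the measurement noise away.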

8.
A physically based model for ground-level ozone forecasting is evaluated for Santiago, Chile. The model predicts the daily peak ozone concentration, with the daily rise of air temperature as input variable; weekends and rainy days appear as interventions. This model was used to analyse historical data, using the Linear Transfer Function/Finite Impulse Response (LTF/FIR) formalism; the Simultaneous Transfer Function (STF) method was used to analyse several monitoring stations together. Model evaluation showed good forecasting performance across stations, for both low and high ozone impacts, with power of detection (POD) values between 70% and 100%, Heidke skill scores between 40% and 70%, and low false alarm rates (FAR). The model consistently outperforms a pure persistence forecast, and its performance was not sensitive to different implementation options. Performance degrades for two- and three-day-ahead forecasts, but remains acceptable for the purpose of developing an environmental warning system for Santiago. Copyright © 2002 John Wiley & Sons, Ltd.

9.
The paper forecasts consumer price inflation in the euro area (EA) and in the USA between 1980:Q1 and 2012:Q4 based on a large set of predictors, using dynamic model averaging (DMA) and dynamic model selection (DMS). DMA/DMS allows not only the coefficients but also the entire forecasting model to change over time. DMA/DMS provides on average the best inflation forecasts relative to alternative approaches (such as the random walk), and DMS outperforms DMA. These results are robust across sample periods and forecast horizons. The paper highlights common features between the USA and the EA. First, two groups of predictors forecast inflation: temporary fundamentals, which have a frequent impact on inflation but only for short periods; and persistent fundamentals, whose switches are less frequent over time. Second, the importance of some variables (particularly international food commodity prices, house prices and oil prices) as predictors of consumer price index inflation increases when those variables experience large shocks. The paper also shows that significant differences prevail between the forecasting models for the USA and the EA; such differences can be explained by the structure of the respective economies. Copyright © 2015 John Wiley & Sons, Ltd.

10.
This article introduces a new model to capture simultaneously the mean and variance asymmetries in time series. Threshold non-linearity is incorporated into the mean and variance specifications of a stochastic volatility model. Bayesian methods are adopted for parameter estimation. Forecasts of volatility and Value-at-Risk can also be obtained by sampling from suitable predictive distributions. Simulations demonstrate that the apparent variance asymmetry documented in the literature can be due to the neglect of mean asymmetry. Strong evidence of the mean and variance asymmetries was detected in US and Hong Kong data. Asymmetry in the variance persistence was also discovered in the Hong Kong stock market. Copyright © 2002 John Wiley & Sons, Ltd.

11.
Simultaneous prediction intervals, holding with a specified joint probability, are derived for L (L ≥ 1) unknown future observations from time series models. Our simultaneous intervals are based on two types of probability inequality, namely the Bonferroni and product types. They differ from the marginal intervals in that they take into account the correlation structure between the forecast errors. For the forecasting methods commonly used with seasonal time series data, we show how to construct the forecast error correlations and evaluate, using an example, the simultaneous and marginal prediction intervals. For all the methods, the simultaneous intervals are accurate, with the accuracy increasing with the use of higher-order probability inequalities, whereas the marginal intervals are far too short in every case. Also, when L is greater than the seasonal period, the simultaneous intervals based on improved probability inequalities are the most accurate.
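The Bonferroni construction mentioned above is easy to sketch: to guarantee joint coverage of at least 1 − α across L horizons, widen each marginal interval to level 1 − α/L. A stdlib-only illustration under Gaussian forecast errors (the normal quantile is computed by bisection; the forecast values and standard errors are made up):

```python
import math

def normal_quantile(p):
    """Inverse standard normal CDF via bisection on math.erf (stdlib only)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def prediction_intervals(forecasts, ses, alpha=0.05, simultaneous=True):
    """Marginal vs Bonferroni simultaneous intervals for L forecasts.

    Bonferroni guarantees joint coverage >= 1 - alpha by splitting alpha
    across the L horizons, so each interval is built at level alpha / L.
    """
    L = len(forecasts)
    a = alpha / L if simultaneous else alpha
    z = normal_quantile(1 - a / 2)
    return [(f - z * s, f + z * s) for f, s in zip(forecasts, ses)]

f, s = [10.0, 10.5, 11.0], [1.0, 1.3, 1.5]
marg = prediction_intervals(f, s, simultaneous=False)   # each at 95%
simu = prediction_intervals(f, s, simultaneous=True)    # jointly >= 95%
```

The simultaneous intervals are strictly wider than the marginal ones, which is exactly the abstract's point: marginal intervals understate the joint uncertainty across horizons.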

12.
We use state space methods to estimate a large dynamic factor model for the Norwegian economy involving 93 variables for 1978Q2–2005Q4. The model is used to obtain forecasts for 22 key variables that can be derived from the original variables by aggregation. To investigate the potential gain in using such a large information set, we compare the forecasting properties of the dynamic factor model with those of univariate benchmark models. We find that there is an overall gain in using the dynamic factor model, but that the gain is notable only for a few of the key variables. Copyright © 2009 John Wiley & Sons, Ltd.

13.
This paper makes use of simple graphical techniques, a seasonal unit root test and a structural time-series model to obtain information on the time series properties of UK crude steel consumption. It shows that steel consumption has, after the removal of some quite substantial outliers, a fairly constant seasonal pattern, and a well-defined but stochastic business cycle. The long-run movement in steel consumption also appears to be stochastic in nature. These characteristics were used to identify a structural time-series model and the ex-post forecasts obtained from it performed reasonably well. Finally, this paper presents some ex-ante quarterly forecasts for crude steel consumption to the year 1999. © 1997 by John Wiley & Sons, Ltd.

14.
Outliers, level shifts, and variance changes are commonplace in applied time series analysis. However, their existence is often ignored and their impact overlooked, owing to the lack of simple and useful methods for detecting and handling such extraordinary events. This paper considers the problem of detecting outliers, level shifts, and variance changes in a univariate time series. The methods employed are extremely simple yet useful: only least squares techniques and residual variance ratios are used. The effectiveness of these simple methods is demonstrated by analysing three real data sets.
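In the spirit of the "least squares and residual variance ratios" toolkit, two toy detectors are sketched below: a residual-based additive outlier flag and a before/after residual variance ratio for a candidate variance change point. These are illustrations with made-up data, not the paper's exact statistics:

```python
import statistics

def detect_additive_outlier(x, threshold=3.0):
    """Flag points whose deviation from the series median exceeds `threshold`
    robust standard deviations (median absolute deviation scaled by 1.4826)."""
    med = statistics.median(x)
    mad = statistics.median(abs(v - med) for v in x) * 1.4826
    return [i for i, v in enumerate(x) if abs(v - med) / mad > threshold]

def variance_change_ratio(x, t):
    """Ratio of residual variances after vs before a candidate change point t,
    with residuals taken as deviations from each segment's own mean; a ratio
    far from 1 suggests a variance change at t."""
    before, after = x[:t], x[t:]
    return statistics.pvariance(after) / statistics.pvariance(before)

# one additive outlier at index 4
series = [0.0, 0.1, -0.1, 0.05, 8.0, -0.05, 0.1, 0.0, -0.1, 0.05]
outliers = detect_additive_outlier(series)

# volatility jumps at index 5
noisy = [0.1, -0.1, 0.05, -0.05, 0.1, 2.0, -1.5, 1.8, -2.2, 1.6]
ratio = variance_change_ratio(noisy, 5)
```

Both checks use nothing beyond least-squares-style residuals and a variance ratio, which is the paper's point: no heavy machinery is needed to screen for these events.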

15.
A combination of VAR estimation and state space model reduction techniques is examined by Monte Carlo methods in order to find good, simple-to-use procedures for determining models with reasonable prediction properties. The presentation is largely graphical. This helps focus attention on the aspects of the model determination problem which are relatively important for forecasting. One surprising result is that, for prediction purposes, knowledge of the true structure of the model generating the data is not particularly useful unless the parameter values are also known. This is because the difficulty of estimating the parameters of the true model causes more prediction error than results from a more parsimonious approximate model.

16.
Structural change and the combination of forecasts
Forecasters are generally concerned about the properties of model-based predictions in the presence of structural change. In this paper, it is argued that forecast errors can under those conditions be greatly reduced through systematic combination of forecasts. We propose various extensions of the standard regression-based theory of forecast combination. Rolling weighted least squares and time-varying parameter techniques are shown to be useful generalizations of the basic framework. Numerical examples, based on various types of structural change in the constituent forecasts, indicate that the potential reduction in forecast error variance through these methods is very significant. The adaptive nature of these updating procedures greatly enhances the effect of risk-spreading embodied in standard combination techniques.
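Rolling weighted least squares combination, one of the generalizations proposed here, can be sketched directly: re-estimate the combining weights on a moving window with geometrically decaying observation weights, so that the weights migrate after a structural change. A self-contained illustration with made-up forecasts (the truth tracks forecaster 1 early and forecaster 2 late):

```python
import random

def rolling_combination_weights(y, f1, f2, window=20, discount=0.95):
    """Weighted least squares combination y ~ w1*f1 + w2*f2, re-estimated on
    a rolling window with geometrically decaying weights so that recent
    observations dominate. Returns the (w1, w2) path from the first full
    window onward."""
    out = []
    for t in range(window, len(y) + 1):
        # 2x2 normal equations: sum(d * f f') w = sum(d * f y)
        s11 = s12 = s22 = b1 = b2 = 0.0
        for j, i in enumerate(range(t - window, t)):
            d = discount ** (window - 1 - j)    # newest observation gets weight 1
            s11 += d * f1[i] * f1[i]
            s12 += d * f1[i] * f2[i]
            s22 += d * f2[i] * f2[i]
            b1 += d * f1[i] * y[i]
            b2 += d * f2[i] * y[i]
        det = s11 * s22 - s12 * s12
        out.append(((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det))
    return out

rng = random.Random(1)
n = 60
f1 = [rng.uniform(1, 2) for _ in range(n)]
f2 = [rng.uniform(1, 2) for _ in range(n)]
# structural change at t = 30: the truth switches from f1 to f2
y = [f1[i] if i < 30 else f2[i] for i in range(n)]
w = rolling_combination_weights(y, f1, f2)
```

Once the window has rolled past the break, the estimated weights shift from (1, 0) to (0, 1); the discounting makes that migration faster than equal-weighted estimation would.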

17.
In this paper we propose and evaluate two new methods for the quantification of business surveys concerning the qualitative assessment of the state of the economy. The first is a nonparametric method based on the spectral envelope, originally proposed by Stoffer, Tyler and McDougall (Spectral analysis for categorical time series: scaling and the spectral envelope, Biometrika 80: 611–622), applied to the multivariate time series of the counts in each response category. Second, we fit by maximum likelihood a cumulative logit unobserved components model featuring a common cycle. The conditional mean of the cycle, which can be evaluated by importance sampling, provides the required quantification. We assess the validity of the two methods by comparing the results with a standard quantification based on the balance of opinions and with a quantitative economic indicator. Copyright © 2010 John Wiley & Sons, Ltd.

18.
We propose a framework to describe, analyze, and explain the conditions under which scientific communities organize themselves to do research, particularly within large-scale, multidisciplinary projects. The framework centers on the notion of a research repertoire, which encompasses well-aligned assemblages of the skills, behaviors, and material, social, and epistemic components that a group may use to practice certain kinds of science, and whose enactment affects the methods and results of research. This account provides an alternative to the idea of Kuhnian paradigms for understanding scientific change in the following ways: (1) it does not frame change as primarily generated and shaped by theoretical developments, but rather takes account of administrative, material, technological, and institutional innovations that contribute to change and explicitly questions whether and how such innovations accompany, underpin, and/or undercut theoretical shifts; (2) it thus allows for tracking of the organization, continuity, and coherence in research practices which Kuhn characterized as 'normal science' without relying on the occurrence of paradigmatic shifts and revolutions to be able to identify relevant components; and (3) it requires particular attention be paid to the performative aspects of science, whose study Kuhn pioneered but which he did not extensively conceptualize. We provide a detailed characterization of repertoires and discuss their relationship with communities, disciplines, and other forms of collaborative activities within science, building on an analysis of historical episodes and contemporary developments in the life sciences, as well as cases drawn from social and historical studies of physics, psychology, and medicine.

19.
In combining economic forecasts a problem often faced is that the individual forecasts display some degree of dependence. We discuss latent root regression for combining collinear GNP forecasts. Our results indicate that latent root regression produces more efficient combining weight estimates (regression parameter estimates) than ordinary least squares estimation (OLS), although out-of-sample forecasting performance is comparable to OLS.

20.
Numerical state space models are efficiently implemented for the estimation of the underlying level and trend of a time series. The model specification is chosen so that the estimation is insensitive to outliers yet adapts rapidly to step changes in level. An example illustrates, by means of projection plots, how at times of uncertainty in the evolution of the series the inferred distribution of level and trend may be multi-modal.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)