20 similar articles found.
1.
The state space model is widely used to handle time series data driven by related latent processes in many fields. In this article, we suggest a framework to examine the relationship between state space models and autoregressive integrated moving average (ARIMA) models by examining the existence and positive‐definiteness conditions implied by auto‐covariance structures. This study covers broad types of state space models frequently used in previous studies. We also suggest a simple statistical test to check whether a certain state space model is appropriate for the specific data. For illustration, we apply the suggested procedure in the analysis of the United States real gross domestic product data. Copyright © 2011 John Wiley & Sons, Ltd.
2.
Simultaneous prediction intervals for forecasts from time series models that contain L (L ≥ 1) unknown future observations with a specified probability are derived. Our simultaneous intervals are based on two types of probability inequalities, i.e. the Bonferroni- and product-types. These differ from the marginal intervals in that they take into account the correlation structure between the forecast errors. For the forecasting methods commonly used with seasonal time series data, we show how to construct forecast error correlations and evaluate, using an example, the simultaneous and marginal prediction intervals. For all the methods, the simultaneous intervals are accurate with the accuracy increasing with the use of higher-order probability inequalities, whereas the marginal intervals are far too short in every case. Also, when L is greater than the seasonal period, the simultaneous intervals based on improved probability inequalities will be most accurate.
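The Bonferroni-type construction mentioned above can be sketched in a few lines: split the error probability α across the L forecast horizons so that all L intervals cover jointly with probability at least 1 − α. This is a minimal illustration assuming Gaussian forecast errors with known standard errors, not the paper's exact procedure (which additionally exploits the forecast-error correlation structure):

```python
from statistics import NormalDist

def marginal_interval(forecast, se, alpha=0.05):
    # Marginal (1 - alpha) prediction interval for a single horizon.
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return forecast - z * se, forecast + z * se

def bonferroni_intervals(forecasts, ses, alpha=0.05):
    # Bonferroni-type simultaneous intervals: split alpha across the
    # L horizons so the L intervals cover jointly with prob >= 1 - alpha.
    L = len(forecasts)
    z = NormalDist().inv_cdf(1 - alpha / (2 * L))
    return [(f - z * s, f + z * s) for f, s in zip(forecasts, ses)]
```

The simultaneous intervals are necessarily wider than the marginal ones; the point of the higher-order inequalities studied in the paper is to tighten this conservatism.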
3.
Prasad V Bidarkota 《Journal of forecasting》2001,20(1):21-35
US inflation appears to undergo shifts in its mean level and variability. We evaluate the performance of three useful models for capturing such shifts. The models studied are the Markov switching models, state space models with heavy‐tailed errors, and state space models with compound error distributions. Our study shows that all three models have very similar performance when evaluated in terms of the mean squared or mean absolute forecast errors. However, the latter two models are considerably more parsimonious, and easily beat the more profligately parameterized Markov switching models in terms of model selection criteria, such as the AIC or the SBC. Thus, these may serve as useful continuous alternatives to the popular discrete Markov switching models for capturing shifts in time series. Copyright © 2001 John Wiley & Sons, Ltd.
4.
In this paper we present results of a simulation study to assess and compare the accuracy of forecasting techniques for long‐memory processes in small sample sizes. We analyse differences between adaptive ARMA(1,1) L‐step forecasts, where the parameters are estimated by minimizing the sum of squares of L‐step forecast errors, and forecasts obtained by using long‐memory models. We compare widths of the forecast intervals for both methods, and discuss some computational issues associated with the ARMA(1,1) method. Our results illustrate the importance and usefulness of long‐memory models for multi‐step forecasting. Copyright © 1999 John Wiley & Sons, Ltd.
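As background to the comparison above, the L-step forecast recursion for a fitted ARMA(1,1) is short enough to sketch. The parameter values in the test are illustrative, not estimates from the study:

```python
def arma11_forecast(y_last, e_last, phi, theta, steps):
    # L-step-ahead forecasts from an ARMA(1,1):
    #   y_t = phi * y_{t-1} + e_t + theta * e_{t-1}
    # The one-step forecast uses the last observed shock; beyond one
    # step the shock expectation is zero, so forecasts decay
    # geometrically in phi (the source of poor long-memory multi-step
    # behaviour the paper documents).
    forecasts = []
    f = phi * y_last + theta * e_last
    for _ in range(steps):
        forecasts.append(f)
        f = phi * f
    return forecasts
```

A long-memory (e.g. ARFIMA) forecast function would instead decay hyperbolically, which is why the two approaches diverge at long horizons.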
5.
In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies in the same sense, which evaluate original series for each stock, we evaluate synthetic series created on the basis of linear models of stocks. Following the approach of Burgess (Computational Finance, 1999; 99, 297–312), we use the ‘stepwise regression’ model for the formation of models of each stock. We then use the variance ratio profile together with a Monte Carlo simulation for the selection of models with potential predictability using data from 1 April 1999 to 30 December 2003. Unlike the approach of Burgess, we carry out White's Reality Check (Econometrica, 2000; 68, 1097–1126) in order to verify the existence of positive returns for the period outside the sample from 2 January 2004 to 28 August 2007. We use the strategies proposed by Sullivan, Timmermann and White (Journal of Finance, 1999; 54, 1647–1691) and Hsu and Kuan (Journal of Financial Econometrics, 2005; 3, 606–628) amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology, with 1000 simulations, we find strong evidence of predictability in the models, including transaction costs. Copyright © 2015 John Wiley & Sons, Ltd.
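A minimal version of the variance ratio statistic used in the model-selection step above: under a random walk, the variance of q-period returns is q times the variance of 1-period returns, so VR(q) ≈ 1, and systematic deviations indicate predictability. The paper builds a full variance ratio profile with Monte Carlo bands; this sketch computes only the point statistic:

```python
def variance_ratio(returns, q):
    # VR(q) = Var(q-period returns) / (q * Var(1-period returns)).
    # VR < 1 suggests mean reversion, VR > 1 suggests persistence.
    n = len(returns)
    mu = sum(returns) / n
    var1 = sum((r - mu) ** 2 for r in returns) / (n - 1)
    # Overlapping q-period sums.
    q_sums = [sum(returns[i:i + q]) for i in range(n - q + 1)]
    mq = sum(q_sums) / len(q_sums)
    varq = sum((s - mq) ** 2 for s in q_sums) / (len(q_sums) - 1)
    return varq / (q * var1)
```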
6.
K. D. Patterson 《Journal of forecasting》1995,14(4):337-350
There is considerable interest in the index of industrial production (IIP) as an indicator of the state of the UK's industrial base and, more generally, as a leading economic indicator. However, this index, in common with a number of key macroeconomic time series, is subject to revision as more information becomes available. This raises the problem of forecasting the final vintage of data on IIP. We construct a state space model to solve this problem which incorporates bias adjustments, a model of the measurement error process, and a dynamic model for the final vintage of IIP. Application of the Kalman filter produces an optimal forecast of the final vintage of data.
7.
Hans C. Ohanian 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2009,40(2):167-173
Although Einstein's name is closely linked with the celebrated relation E=mc2 between mass and energy, a critical examination of the more than half dozen “proofs” of this relation that Einstein produced over a span of forty years reveals that all these proofs suffer from mistakes. Einstein introduced unjustified assumptions, committed fatal errors in logic, or adopted low-speed, restrictive approximations. He never succeeded in producing a valid general proof applicable to a realistic system with arbitrarily large internal speeds. The first such general proof was produced by Max Laue in 1911 (for “closed” systems with a time-independent energy–momentum tensor) and it was generalized by Felix Klein in 1918 (for arbitrary time-dependent “closed” systems).
8.
9.
Survey‐based indicators are widely seen as leading indicators for economic activity. As such, consumer confidence might be informative for the future path of private consumption. Although the indicators receive high attention in the media, their forecasting power often appears to be very limited. This paper takes a fresh look at the data that serve as a basis for the consumer confidence indicator (CCI) reported by the EU Commission for the euro area. Different pooling methods are applied to exploit the survey information. Forecasts are based on mixed data sampling (MIDAS) and bridge equations. While the CCI does not outperform the autoregressive benchmark, the new indicators are able to raise forecasting performance. The best performing indicator should be built upon pre‐selection methods. Data‐driven aggregation methods should be preferred to determine the weights of the individual ingredients. Copyright © 2011 John Wiley & Sons, Ltd.
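Mixed-frequency MIDAS regressions like those mentioned above typically aggregate high-frequency lags into a single regressor with a parametric weight function. A common choice (assumed here for illustration; the paper may use a different scheme) is the exponential Almon lag polynomial:

```python
import math

def almon_weights(theta1, theta2, K):
    # Exponential Almon lag polynomial: w_k proportional to
    # exp(theta1 * k + theta2 * k^2), normalized to sum to one.
    # Two parameters govern the whole K-lag weight profile, which is
    # what keeps MIDAS regressions parsimonious.
    raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(1, K + 1)]
    total = sum(raw)
    return [w / total for w in raw]
```

With a negative theta2 the weights decline smoothly in the lag index, so recent monthly survey readings dominate the quarterly forecast.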
10.
G. Rünstler K. Barhoumi S. Benk R. Cristadoro A. Den Reijer A. Jakaitiene P. Jelonek A. Rua K. Ruth C. Van Nieuwenhuyze 《Journal of forecasting》2009,28(7):595-611
This paper performs a large‐scale forecast evaluation exercise to assess the performance of different models for the short‐term forecasting of GDP, resorting to large datasets from ten European countries. Several versions of factor models are considered and cross‐country evidence is provided. The forecasting exercise is performed in a simulated real‐time context, which takes account of publication lags in the individual series. In general, we find that factor models perform best and models that exploit monthly information outperform models that use purely quarterly data. However, the improvement over the simpler, quarterly models remains contained. Copyright © 2009 John Wiley & Sons, Ltd.
11.
Bert Schroer 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2010,41(4):293-308
The main topics of this second part of a two-part essay are some consequences of the phenomenon of vacuum polarization as the most important physical manifestation of modular localization. Besides philosophically unexpected consequences, it has led to a new constructive “outside-inwards approach” in which the pointlike fields and the compactly localized operator algebras which they generate only appear from intersecting much simpler algebras localized in noncompact wedge regions whose generators have extremely mild almost free field behavior. Another consequence of vacuum polarization presented in this essay is the localization entropy near a causal horizon which follows a logarithmically modified area law in which a dimensionless area (the area divided by the square of dR, where dR is the thickness of a light-sheet) appears. There are arguments that this logarithmically modified area law corresponds to the volume law of the standard heat bath thermal behavior. We also explain the symmetry enhancing effect of holographic projections onto the causal horizon of a region and show that the resulting infinite dimensional symmetry groups contain the Bondi–Metzner–Sachs group. This essay is the second part of a partitioned longer paper.
12.
Testing the existence of unit root and/or level change is necessary in order to understand the underlying processes of time series. In many studies carried out so far, the focus was only on a single aspect of unit root and level change, therefore limiting a full assessment of the given problems. Our study aims to find a solution to the given problems by testing the two hypotheses simultaneously. We derive the likelihood ratio test statistic based on the state space model, and their distributions are created by the simulation method. The performance of the proposed method is validated by simulated time series and also applied to two Korean macroeconomic time series to confirm its practical application. This analysis can provide a solution to determine the underlying structure of arguable time series. Copyright © 2009 John Wiley & Sons, Ltd.
13.
We use state space methods to estimate a large dynamic factor model for the Norwegian economy involving 93 variables for 1978Q2–2005Q4. The model is used to obtain forecasts for 22 key variables that can be derived from the original variables by aggregation. To investigate the potential gain in using such a large information set, we compare the forecasting properties of the dynamic factor model with those of univariate benchmark models. We find that there is an overall gain in using the dynamic factor model, but that the gain is notable only for a few of the key variables. Copyright © 2009 John Wiley & Sons, Ltd.
14.
张彦宇 《世界科技研究与发展》2012,(4):594-595,616
After a detailed study of the differing time- and frequency-domain characteristics of signal and noise, an improved VAD algorithm is proposed. The original VAD algorithm is a static speech-pause detection algorithm whose detection performance deteriorates at very low signal-to-noise ratios. The improved VAD algorithm dynamically tracks the short-time power envelope of the signal in the full band, the low band and the high band, and after multiple threshold comparisons makes a frame-based decision when a speech pause occurs. To verify the applicability of the improved VAD algorithm in the NC (noise cancellation) system of TD_LTE handsets, the NC system metrics were tested under different noise types and signal-to-noise ratios and compared with the conventional NC system. Evaluation shows that the improved NC algorithm incorporating the new VAD raises the signal-to-noise ratio and helps TD_LTE mobile terminals pass network-access testing.
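The frame-based pause detection described in this abstract can be sketched as short-time power tracking against a dynamic threshold. This single-band toy version (the paper's algorithm tracks full, low and high bands with multiple threshold comparisons) only illustrates the idea; the frame length and threshold ratio are illustrative assumptions:

```python
def frame_power(signal, frame_len):
    # Short-time power per non-overlapping frame.
    return [sum(s * s for s in signal[i:i + frame_len]) / frame_len
            for i in range(0, len(signal) - frame_len + 1, frame_len)]

def vad_decisions(signal, frame_len, ratio=0.1):
    # Flag a frame as a speech pause when its power falls below a
    # dynamic threshold derived from the observed power range
    # (noise floor plus a fraction of the power spread).
    powers = frame_power(signal, frame_len)
    noise_floor = min(powers)
    threshold = noise_floor + ratio * (max(powers) - noise_floor)
    return [p <= threshold for p in powers]
```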
15.
Several authors (King and Rebelo, 1993; Cogley and Nason, 1995) have questioned the use of exponentially weighted moving average filters such as the Hodrick–Prescott filter in decomposing a series into a trend and cycle, claiming that they lead to the observation of spurious or induced cycles and to misinterpretation of stylized facts. However, little has been done to propose different methods of estimation or other ways of defining trend extraction. This paper has two main contributions. First, we suggest that the decomposition between the trend and cycle has not been done in an appropriate way. Second, we argue for a general-to-specific approach based on a more general filter, the stochastic trend model, that allows us to estimate all the parameters of the model rather than fixing them arbitrarily, as is done with most of the commonly used filters. We illustrate the properties of the proposed technique relative to the conventional ones by employing a Monte Carlo study. Copyright © 1999 John Wiley & Sons, Ltd.
16.
Everette S. Gardner 《Journal of forecasting》1985,4(1):1-28
This paper is a critical review of exponential smoothing since the original work by Brown and Holt in the 1950s. Exponential smoothing is based on a pragmatic approach to forecasting which is shared in this review. The aim is to develop state-of-the-art guidelines for application of the exponential smoothing methodology. The first part of the paper discusses the class of relatively simple models which rely on the Holt-Winters procedure for seasonal adjustment of the data. Next, we review general exponential smoothing (GES), which uses Fourier functions of time to model seasonality. The research is reviewed according to the following questions. What are the useful properties of these models? What parameters should be used? How should the models be initialized? After the review of model-building, we turn to problems in the maintenance of forecasting systems based on exponential smoothing. Topics in the maintenance area include the use of quality control models to detect bias in the forecast errors, adaptive parameters to improve the response to structural changes in the time series, and two-stage forecasting, whereby we use a model of the errors or some other model of the data to improve our initial forecasts. Some of the major conclusions: the parameter ranges and starting values typically used in practice are arbitrary and may detract from accuracy. The empirical evidence favours Holt's model for trends over that of Brown. A linear trend should be damped at long horizons. The empirical evidence favours the Holt-Winters approach to seasonal data over GES. It is difficult to justify GES in standard form: the equivalent ARIMA model is simpler and more efficient. The cumulative sum of the errors appears to be the most practical forecast monitoring device. There is no evidence that adaptive parameters improve forecast accuracy. In fact, the reverse may be true.
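The damped-trend recommendation above can be made concrete: Holt's linear method with a damping parameter phi shrinks the trend contribution at long horizons, and phi = 1 recovers the undamped method. A minimal sketch with crude initialization, not tied to any particular software implementation:

```python
def damped_holt(y, alpha, beta, phi, h):
    # Holt's linear trend method with damping parameter phi in (0, 1].
    # Crude initialization: first observation and first difference.
    level, trend = y[0], y[1] - y[0]
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (prev_level + phi * trend)
        trend = beta * (level - prev_level) + (1 - beta) * phi * trend
    # h-step forecast: level + (phi + phi^2 + ... + phi^h) * trend,
    # which flattens out as h grows when phi < 1.
    return level + sum(phi ** i for i in range(1, h + 1)) * trend
```

On a perfectly linear series, phi = 1 extrapolates the line; any phi < 1 gives a strictly smaller long-horizon forecast, which is exactly the damping effect the review advocates.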
17.
A. C. Harvey 《Journal of forecasting》1984,3(3):245-275
A large number of statistical forecasting procedures for univariate time series have been proposed in the literature. These range from simple methods, such as the exponentially weighted moving average, to more complex procedures such as Box–Jenkins ARIMA modelling and Harrison–Stevens Bayesian forecasting. This paper sets out to show the relationship between these various procedures by adopting a framework in which a time series model is viewed in terms of trend, seasonal and irregular components. The framework is then extended to cover models with explanatory variables. From the technical point of view the Kalman filter plays an important role in allowing an integrated treatment of these topics.
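The Kalman filter's unifying role in such structural models is easiest to see in the simplest case, the local level (random walk plus noise) model, whose filtered estimate reduces to an exponentially weighted moving average in steady state. A minimal filtering sketch with assumed variance parameters and a diffuse initial state:

```python
def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    # Kalman filter for the local level model:
    #   y_t = mu_t + eps_t,      eps_t ~ N(0, sigma_eps2)
    #   mu_{t+1} = mu_t + eta_t, eta_t ~ N(0, sigma_eta2)
    # p0 is large to approximate a diffuse prior on the initial level.
    a, p = a0, p0
    filtered = []
    for obs in y:
        f = p + sigma_eps2            # innovation variance
        k = p / f                     # Kalman gain
        a = a + k * (obs - a)         # filtered state estimate
        filtered.append(a)
        p = p * (1 - k) + sigma_eta2  # predicted state variance
    return filtered
```

The trend, seasonal and regression components in the paper's framework enter the same recursions with a larger state vector; the filter's structure is unchanged.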
18.
Mortality models used for forecasting are predominantly based on the statistical properties of time series and do not generally incorporate an understanding of the forces driving secular trends. This paper addresses three research questions: Can the factors found in stochastic mortality‐forecasting models be associated with real‐world trends in health‐related variables? Does inclusion of health‐related factors in models improve forecasts? Do resulting models give better forecasts than existing stochastic mortality models? We consider whether the space spanned by the latent factor structure in mortality data can be adequately described by developments in gross domestic product, health expenditure and lifestyle‐related risk factors using statistical techniques developed in macroeconomics and finance. These covariates are then shown to improve forecasts when incorporated into a Bayesian hierarchical model. Results are comparable or better than benchmark stochastic mortality models. Copyright © 2014 John Wiley & Sons, Ltd.
19.
Daniel Parker 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2003,34(4):607-620
Lewis (Br. J. Philos. Sci. 48 (1997) 313) has recently presented an argument claiming that, under the Ghirardi–Rimini–Weber (GRW) theory of quantum mechanics, arithmetic does not apply to ordinary macroscopic objects such as marbles (known as the Counting Anomaly). In this paper, I disentangle two different lines of Lewis's argument, one devoted to what I call the standard GRW interpretation and the other to the mass density interpretation (MDI). I present both strains of Lewis's argument, and move on to criticise Lewis's position, focusing on his argument with respect to MDI. I analyse the structure of his argument, and follow this with a novel refutation of Lewis's argument, drawing on the original presentation of MDI as developed by Ghirardi et al. (Found. Phys. 25 (1995) 5). I briefly consider the debate that ensued between Bassi and Ghirardi and Clifton and Monton and interpret it within the context of my analysis. I conclude that Lewis's Counting Anomaly fails to generate a genuine problem.
20.
Luis C. Nunes 《Journal of forecasting》2005,24(8):575-592
This paper presents an extension of the Stock and Watson coincident indicator model that allows one to include variables available at different frequencies while taking care of missing observations at any time period. The proposed procedure provides estimates of the unobserved common coincident component, of the unobserved monthly series underlying any included quarterly indicator, and of any missing values in the series. An application to a coincident indicator model for the Portuguese economy is presented. We use monthly indicators from business surveys whose results are published with a very short delay. By using the available data for the monthly indicators and for quarterly real GDP, it becomes possible to produce simultaneously a monthly composite index of coincident indicators and an estimate of the latest quarter real GDP growth well ahead of the release of the first official figures. Copyright © 2005 John Wiley & Sons, Ltd.