Related articles (20 results)
1.
Improving the prediction accuracy of agricultural product futures prices is important for investors, agricultural producers, and policymakers: accurate forecasts help investors evade risks and enable government departments to formulate appropriate agricultural regulations and policies. This study employs the ensemble empirical mode decomposition (EEMD) technique to decompose six different categories of agricultural futures prices. Three models, namely support vector machine (SVM), neural network (NN), and autoregressive integrated moving average (ARIMA), are then used to predict the decomposed components, and the final hybrid model is constructed by comparing the models' prediction performance on each component. The forecasting performance of the combined model is then compared with that of the benchmark individual models: SVM, NN, and ARIMA. Our main interest is in short-term forecasting, so we consider only 1-day and 3-day forecast horizons. The results indicate that the EEMD combined model predicts better than the individual models, especially at the 3-day horizon. The study also concludes that machine learning methods outperform statistical methods in forecasting the high-frequency volatile components, whereas there is no obvious difference between the individual models in predicting the low-frequency components.
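As a rough illustration of the decompose-forecast-recombine idea described above, the pure-Python sketch below splits a price series into a smooth component and a residual. A centered moving average is a deliberately simple stand-in for EEMD, and naive extrapolation rules stand in for the paper's SVM/NN/ARIMA component models; all names and parameter values are illustrative.

```python
def moving_average(x, w):
    """Centered moving average with edge padding; same length as x."""
    half = w // 2
    padded = [x[0]] * half + list(x) + [x[-1]] * half
    return [sum(padded[i:i + w]) / w for i in range(len(x))]

def decompose(x, w=5):
    """Split x into a smooth 'trend' and a residual that sum back to x."""
    trend = moving_average(x, w)
    residual = [xi - ti for xi, ti in zip(x, trend)]
    return trend, residual

def drift_forecast(component):
    # Low-frequency part: extrapolate the most recent one-step drift.
    return component[-1] + (component[-1] - component[-2])

def mean_forecast(component):
    # High-frequency part: forecast its recent mean (near zero by construction).
    return sum(component[-5:]) / 5

def hybrid_forecast(prices):
    """Forecast each component separately, then sum the component forecasts."""
    trend, residual = decompose(prices)
    return drift_forecast(trend) + mean_forecast(residual)

prices = [100, 101, 103, 102, 104, 106, 105, 107, 109, 108]
one_step_ahead = hybrid_forecast(prices)
```

The key property a real EEMD-based pipeline shares with this toy version is that the components reconstruct the original series exactly, so forecasting them separately loses no information.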

2.
In their seminal book Time Series Analysis: Forecasting and Control, Box and Jenkins (1976) introduce the Airline model, which is still routinely used for the modelling of economic seasonal time series. The Airline model is for a differenced time series (in levels and seasons) and constitutes a linear moving average of lagged Gaussian disturbances which depends on two coefficients and a fixed variance. In this paper a novel approach to seasonal adjustment is developed that is based on the Airline model and that accounts for outliers and breaks in time series. For this purpose we consider the canonical representation of the Airline model. It takes the model as a sum of trend, seasonal and irregular (unobserved) components which are uniquely identified as a result of the canonical decomposition. The resulting unobserved components time series model is extended by components that allow for outliers and breaks. When all components depend on Gaussian disturbances, the model can be cast in state space form and the Kalman filter can compute the exact log-likelihood function. Related filtering and smoothing algorithms can be used to compute minimum mean squared error estimates of the unobserved components. However, the outlier and break components typically rely on heavy-tailed densities such as the t or the mixture of normals. For this class of non-Gaussian models, Monte Carlo simulation techniques will be used for estimation, signal extraction and seasonal adjustment. This robust approach to seasonal adjustment allows outliers to be accounted for, while keeping the underlying structures that are currently used to aid reporting of economic time series data. Copyright © 2006 John Wiley & Sons, Ltd.
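The Airline model's defining equation is (1-B)(1-B^s) y_t = (1-θB)(1-ΘB^s) ε_t, with backshift operator B, seasonal period s, and the two MA coefficients θ and Θ mentioned above. The sketch below simulates from that equation in pure Python; the parameter values are illustrative, not taken from the paper.

```python
import random

def simulate_airline(n, theta=0.4, Theta=0.6, s=12, sigma=1.0, seed=0):
    """Simulate y_t satisfying (1-B)(1-B^s) y_t = (1-theta*B)(1-Theta*B^s) e_t."""
    rng = random.Random(seed)
    eps = [rng.gauss(0.0, sigma) for _ in range(n + s + 1)]
    # Expanded MA part: w_t = e_t - theta*e_{t-1} - Theta*e_{t-s} + theta*Theta*e_{t-s-1}
    w = [eps[t] - theta * eps[t - 1] - Theta * eps[t - s]
         + theta * Theta * eps[t - s - 1]
         for t in range(s + 1, n + s + 1)]
    # Invert the double difference: y_t = y_{t-1} + y_{t-s} - y_{t-s-1} + w_t,
    # starting from s+1 zero initial values.
    y = [0.0] * (s + 1)
    for wt in w:
        y.append(y[-1] + y[-s] - y[-s - 1] + wt)
    return y[s + 1:]

series = simulate_airline(48)
```

Simulated paths like this show the model's characteristic slowly evolving trend and seasonal pattern, which is what the canonical decomposition splits into unobserved components.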

3.
A univariate structural time series model based on the traditional decomposition into trend, seasonal and irregular components is defined. A number of methods of computing maximum likelihood estimators are then considered, including direct maximization of various time-domain likelihood functions. The asymptotic properties of the estimators are given, and the various methods are compared in terms of computational efficiency and accuracy. The methods are then extended to models with explanatory variables.

4.
Artificial neural networks (ANNs) combined with signal decomposition methods are effective for long-term streamflow time series forecasting. The ANN is a machine learning method widely used for streamflow time series; it forecasts nonstationary series well without requiring physical analysis of complex, dynamic hydrological processes. Most studies use multiple determinants of streamflow, such as rainfall, as inputs. In this study, a long-term streamflow forecasting model that depends only on historical streamflow data is proposed. Various preprocessing techniques, including empirical mode decomposition (EMD), ensemble empirical mode decomposition (EEMD) and discrete wavelet transform (DWT), are first used to decompose the streamflow time series into simple components with different timescale characteristics, and the relation between these components and the original streamflow at the next time step is modeled by an ANN. The hybrid models EMD-ANN, EEMD-ANN and DWT-ANN are developed for long-term daily streamflow forecasting, and the performance measures root mean square error (RMSE), mean absolute percentage error (MAPE) and Nash-Sutcliffe efficiency (NSE) indicate that the proposed EEMD-ANN method performs better than the EMD-ANN and DWT-ANN models, especially in high-flow forecasting.
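The three performance measures named in this abstract have standard definitions, sketched below in pure Python. RMSE and MAPE measure average error; NSE equals 1 for a perfect forecast and 0 for a forecast no better than the observed mean.

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mape(obs, sim):
    """Mean absolute percentage error (observations must be nonzero)."""
    return 100.0 * sum(abs((o - s) / o) for o, s in zip(obs, sim)) / len(obs)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean of obs."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst
```

NSE is the usual headline metric in hydrology because, unlike RMSE, it is scale-free and directly penalizes doing worse than the climatological mean.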

5.
In this paper we propose and test a forecasting model on monthly and daily spot prices of five selected exchange rates. In doing so, we combine a novel smoothing technique (initially applied in signal processing) with a variable selection methodology and two regression estimation methodologies from the field of machine learning (ML). After the decomposition of the original exchange rate series using an ensemble empirical mode decomposition (EEMD) method into a smoothed and a fluctuation component, multivariate adaptive regression splines (MARS) are used to select the most appropriate variable set from a large set of explanatory variables that we collected. The selected variables are then fed into two distinct support vector regression (SVR) models that produce one-period-ahead forecasts for the two components. Neural networks (NN) are also considered as an alternative to SVR. The sum of the two forecast components is the final forecast of the proposed scheme. We show that the above implementation exhibits a superior in-sample and out-of-sample forecasting ability when compared to alternative forecasting models. The empirical results provide evidence against the efficient market hypothesis for the selected foreign exchange markets. Copyright © 2015 John Wiley & Sons, Ltd.

6.
A reliable and efficient forecasting system can be used to warn the general public against the increasing PM2.5 concentration. This paper proposes a novel AdaBoost-ensemble technique based on a hybrid data preprocessing-analysis strategy, with the following contributions: (i) a new decomposition strategy is proposed based on the hybrid data preprocessing-analysis strategy, which combines the merits of two popular decomposition algorithms and has been proven to be a promising decomposition strategy; (ii) the long short-term memory (LSTM), as a powerful deep learning forecasting algorithm, is applied to individually forecast the decomposed components, and can effectively capture the long- and short-term patterns of complex time series; and (iii) a novel AdaBoost-LSTM ensemble technique is then developed to integrate the individual forecasting results into the final forecast, which significantly improves forecasting performance. To evaluate the proposed model, a comprehensive and scientific assessment system with several evaluation criteria, comparison models, and experiments is designed. The experimental results indicate that the developed hybrid model considerably surpasses the compared models in terms of forecasting precision and statistical testing, and that its excellent forecasting performance can guide the development of effective control measures to decrease environmental contamination and prevent the health issues caused by a high PM2.5 concentration.
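The ensemble step in pipelines like this weights each component forecaster by its past accuracy. The sketch below uses a simple inverse-error weighting as a stand-in for the paper's AdaBoost-LSTM combination (whose exact update rule is not given in the abstract); names and numbers are illustrative.

```python
def combine_forecasts(forecasts, errors):
    """Inverse-error weighted combination: forecasters with smaller past
    error receive larger weights; weights are normalized to sum to one.
    A simple stand-in for an AdaBoost-style ensemble combiner."""
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return sum(f * w / total for f, w in zip(forecasts, inv))

# Two hypothetical component forecasts with historical errors 1.0 and 3.0:
# the more accurate model gets weight 0.75, the other 0.25.
combined = combine_forecasts([10.0, 20.0], [1.0, 3.0])
```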

7.
We decompose economic uncertainty into "good" and "bad" components according to the sign of innovations. Our results indicate that bad uncertainty provides stronger predictive content regarding future market volatility than good uncertainty. The asymmetric models with good and bad uncertainties forecast market volatility better than the symmetric models with overall uncertainty. The combination of asymmetric uncertainty models significantly outperforms the autoregressive benchmark, as well as the combination of symmetric models. The revealed volatility predictability is further demonstrated to be economically significant in the framework of portfolio allocation.
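A minimal sketch of the sign-based split described above: squared innovations are assigned to a "good" (positive-shock) or "bad" (negative-shock) component, in the spirit of realized semivariances. The shock values are illustrative.

```python
def semivariances(innovations):
    """Split squared innovations by sign: 'bad' uncertainty accumulates
    from negative shocks, 'good' from positive ones."""
    good = sum(e ** 2 for e in innovations if e > 0)
    bad = sum(e ** 2 for e in innovations if e < 0)
    return good, bad

shocks = [0.5, -1.2, 0.3, -0.7, 0.9]
good, bad = semivariances(shocks)
```

By construction (when no shock is exactly zero) the two components add back to the total sum of squared innovations, so the asymmetric models nest the symmetric one.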

8.
Trend and seasonality are the most prominent features of economic time series that are observed at the subannual frequency. Modeling these components serves a variety of analytical purposes, including seasonal adjustment and forecasting. In this paper we introduce unobserved components models for which both the trend and seasonal components arise from systematically sampling a multivariate transition equation, according to which each season evolves as a random walk with a drift. By modeling the disturbance covariance matrix we can encompass traditional models for seasonal time series, like the basic structural model, and can formulate more elaborate ones, dealing with season specific features, such as seasonal heterogeneity and correlation, along with the different role of the nonstationary cycles defined at the fundamental and the harmonic frequencies in determining the shape of the seasonal pattern.

9.
This paper uses high-frequency continuous intraday electricity price data from the EPEX market to estimate and forecast realized volatility. Three different jump tests are used to break down the variation into jump and continuous components using quadratic variation theory. Several heterogeneous autoregressive models are then estimated for the logarithmic and standard deviation transformations. Generalized autoregressive conditional heteroskedasticity (GARCH) structures are included in the error terms of the models when evidence of conditional heteroskedasticity is found. Model selection is based on various out-of-sample criteria. Results show that decomposition of realized volatility is important for forecasting and that the decision whether to include GARCH-type innovations might depend on the transformation selected. Finally, results are sensitive to the jump test used in the case of the standard deviation transformation.
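One standard way to separate the jump and continuous parts of quadratic variation, as discussed above, compares realized variance with bipower variation, which is robust to jumps. The sketch below implements that comparison in pure Python; the specific jump tests used in the paper are not reproduced here.

```python
import math

def realized_volatility(returns):
    """Realized variance: sum of squared intraday returns."""
    return sum(r * r for r in returns)

def bipower_variation(returns):
    """Bipower variation, a jump-robust estimator of the continuous part:
    scaled sum of products of adjacent absolute returns."""
    mu1 = math.sqrt(2.0 / math.pi)  # E|Z| for a standard normal Z
    return (1.0 / mu1 ** 2) * sum(abs(returns[i]) * abs(returns[i - 1])
                                  for i in range(1, len(returns)))

def jump_component(returns):
    """Nonnegative jump part: excess of realized variance over bipower variation."""
    return max(realized_volatility(returns) - bipower_variation(returns), 0.0)
```

In a day with a large price jump, realized variance spikes while bipower variation stays close to the continuous variation, so the difference isolates the jump contribution.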

10.
We develop a semi-structural model for forecasting inflation in the UK in which the New Keynesian Phillips curve (NKPC) is augmented with a time series model for marginal cost. By combining structural and time series elements we hope to reap the benefits of both approaches, namely the relatively better forecasting performance of time series models in the short run and a theory-consistent economic interpretation of the forecast coming from the structural model. In our model we consider the hybrid version of the NKPC and use an open-economy measure of marginal cost. The results suggest that our semi-structural model performs better than a random-walk forecast and most of the competing models (conventional time series models and strictly structural models) only in the short run (one quarter ahead) but it is outperformed by some of the competing models at medium and long forecast horizons (four and eight quarters ahead). In addition, the open-economy specification of our semi-structural model delivers more accurate forecasts than its closed-economy alternative at all horizons. Copyright © 2014 John Wiley & Sons, Ltd.

11.
This paper examines the relative importance of allowing for time-varying volatility and country interactions in a forecast model of economic activity. Allowing for these issues is done by augmenting autoregressive models of growth with cross-country weighted averages of growth and the generalized autoregressive conditional heteroskedasticity framework. The forecasts are evaluated using statistical criteria through point and density forecasts, and an economic criterion based on forecasting recessions. The results show that, compared to an autoregressive model, both components improve forecast ability in terms of point and density forecasts, especially one-period-ahead forecasts, but that the forecast ability is not stable over time. The random walk model, however, still dominates in terms of forecasting recessions.
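The GARCH framework mentioned above models time-varying volatility through a simple variance recursion; the canonical GARCH(1,1) case is sketched below. The parameter values are illustrative, not estimates from the paper.

```python
def garch11_variances(returns, omega=1e-5, alpha=0.1, beta=0.85):
    """Conditional variance recursion of a GARCH(1,1):
    h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
    initialized at the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

variances = garch11_variances([0.01, -0.03, 0.02, 0.005, -0.01])
```

Because alpha + beta < 1 here, shocks raise the conditional variance temporarily and it then decays back toward its unconditional level, which is the volatility-clustering behaviour the forecast model exploits.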

12.
This paper evaluates multistep estimation for the purposes of signal extraction, and in particular the separation of the trend from the cycle in economic time series, and long-range forecasting, in the presence of a misspecified, but simply parameterized model. Our workhorse models are two popular unobserved components models, namely the local level and the local linear model. The paper introduces a metric for assessing the accuracy of the unobserved components estimates and concludes that multistep estimation can be valuable. However, its performance depends crucially on the properties of the series, and the paper explores the role of the order of integration and the relative size of the cyclical variation. By contrast, cross-validation is usually not suitable for the purposes considered. Copyright © 2005 John Wiley & Sons, Ltd.

13.
In the present study we examine the predictive power of disagreement amongst forecasters. In our empirical work, we find that in some situations this variable can signal upcoming structural and temporal changes in an economic process and in the predictive power of the survey forecasts. We examine a variety of macroeconomic variables, and we use different measurements for the degree of disagreement, together with measures for location of the survey data and autoregressive components. Forecasts from simple linear models and forecasts from Markov regime-switching models with constant and with time-varying transition probabilities are constructed in real time and compared on forecast accuracy. Copyright © 2015 John Wiley & Sons, Ltd.

14.
One method for judgemental forecasting involves the use of decomposition; i.e. estimating the conditional means of an unknown quantity of interest for a finite number of conditioning events, and weighting these estimated conditional means by the estimated marginal probabilities of the corresponding conditioning events. In this paper we investigate how the level of decomposition (i.e. the number of conditioning events) affects the precision of the resulting forecast. Previous analyses assume that key parameters (the informativeness of the decomposition, and the precision of estimation for the conditional means and the marginal probabilities) remain constant as the number of conditioning events increases. However, this assumption is unreasonable, and for some parameters mathematically impossible; the values of these parameters are likely to change significantly even for small numbers of conditioning events. Therefore, we introduce models for how these key parameters may depend on the level of decomposition. We then investigate the implications of these models for the precision of the resulting forecast. In particular, we identify cases in which decomposition is never desirable, always desirable, or desirable only near the optimal number of conditioning events. This second case was not observed previously. We focus throughout on the situation likely to be of greatest interest in practice; namely, the behaviour of decomposition for relatively small numbers of conditioning events. © 1997 John Wiley & Sons, Ltd.
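The decomposition method defined at the start of this abstract is the law of total expectation applied to a partition of conditioning events: the forecast is the probability-weighted sum of the conditional means. A minimal sketch, with purely illustrative scenario numbers:

```python
def decomposition_forecast(cond_means, probs):
    """Judgemental decomposition: E[X] = sum_i P(A_i) * E[X | A_i]
    for a partition of conditioning events A_i."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to one"
    return sum(p * m for p, m in zip(probs, cond_means))

# e.g. a sales forecast conditioned on three hypothetical market scenarios
# (boom, baseline, slump) with judgemental probabilities 0.3, 0.5, 0.2:
forecast = decomposition_forecast([120.0, 100.0, 70.0], [0.3, 0.5, 0.2])
```

The paper's question is how the precision of this sum behaves as the partition is refined, i.e. as the list of conditioning events grows.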

15.
The existing contradictory findings on the contribution of trading volume to volatility forecasting prompt us to seek new solutions to test the sequential information arrival hypothesis (SIAH). Departing from other empirical analyses that mainly focus on sophisticated testing methods, this research offers new insights into the volume-volatility nexus by decomposing and reconstructing the trading activity into short-run components that typically represent irregular information flow and long-run components that denote extreme information flow in the stock market. We are the first to incorporate an improved empirical mode decomposition (EMD) method to investigate the volatility forecasting ability of trading volume within the Heterogeneous Autoregressive (HAR) framework. Previous trading volume is used to obtain the decompositions that forecast future volatility, ensuring an ex ante forecast, and both the decomposition and forecasting processes are carried out with a rolling window scheme. Rather than trading volume by itself, the results show that the reconstructed components are able to significantly improve out-of-sample realized volatility (RV) forecasts. This finding is robust at both one-step-ahead and multiple-step-ahead forecasting horizons under different estimation windows. We thus fill a gap in the literature by (1) extending the work on the volume-volatility linkage to EMD-HAR analysis and (2) providing a clear view of how trading volume helps improve RV forecasting accuracy.
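The HAR model referenced above regresses next-period realized volatility on its daily, weekly and monthly averages. The sketch below builds those three regressors in pure Python (the paper additionally augments them with decomposed-volume terms, which are not reproduced here).

```python
def har_regressors(rv, t):
    """Daily, weekly and monthly averages of realized volatility used as
    regressors in the HAR model (Corsi, 2009); t indexes 'today' and must
    satisfy t >= 21 so a full 22-day window is available."""
    daily = rv[t]
    weekly = sum(rv[t - 4:t + 1]) / 5      # last 5 trading days
    monthly = sum(rv[t - 21:t + 1]) / 22   # last 22 trading days
    return daily, weekly, monthly
```

Stacking these triples over a rolling window and regressing rv[t+1] on them gives the HAR forecast; the paper's contribution is adding EMD-reconstructed volume components to that regressor set.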

16.
In this paper, we forecast stock returns using time-varying parameter (TVP) models with parameters driven by economic conditions. An in-sample specification test shows significant variation in the parameters. Out-of-sample results suggest that the TVP models outperform their constant coefficient counterparts. We also find significant return predictability from both statistical and economic perspectives with the application of TVP models. The out-of-sample R2 of an equal-weighted combination of TVP models is as high as 2.672%, and the gains in the certainty equivalent return are 214.7 basis points. Further analysis indicates that the improvement in predictability comes from the use of information on economic conditions rather than simply from allowing the coefficients to vary with time.
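The out-of-sample R2 cited above is conventionally computed relative to a benchmark forecast (typically the historical mean of returns): it is positive when the model's squared forecast errors are smaller than the benchmark's. A minimal sketch:

```python
def out_of_sample_r2(actual, model_fc, bench_fc):
    """Out-of-sample R^2 (Campbell-Thompson style): 1 minus the ratio of
    the model's sum of squared errors to the benchmark's. Positive values
    mean the model beats the benchmark."""
    sse_model = sum((a - f) ** 2 for a, f in zip(actual, model_fc))
    sse_bench = sum((a - f) ** 2 for a, f in zip(actual, bench_fc))
    return 1.0 - sse_model / sse_bench
```

Seemingly small values such as the 2.672% reported above are economically meaningful for monthly stock returns, because even slight predictability compounds into sizable certainty-equivalent gains.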

17.
Predicting the future evolution of GDP growth and inflation is a central concern in economics. Forecasts are typically produced either from economic theory-based models or from simple linear time series models. While a time series model can provide a reasonable benchmark to evaluate the value added of economic theory relative to the pure explanatory power of the past behavior of the variable, recent developments in time series analysis suggest that more sophisticated time series models could provide more serious benchmarks for economic models. In this paper we evaluate whether these complicated time series models can outperform standard linear models for forecasting GDP growth and inflation. We consider a large variety of models and evaluation criteria, using a bootstrap algorithm to evaluate the statistical significance of our results. Our main conclusion is that in general linear time series models can hardly be beaten if they are carefully specified. However, we also identify some important cases where the adoption of a more complicated benchmark can alter the conclusions of economic analyses about the driving forces of GDP growth and inflation. Copyright © 2008 John Wiley & Sons, Ltd.

18.
Behavioral economics is a field of study that is often thought of as interdisciplinary, insofar as it uses psychological insights to inform economic models. Yet the level of conceptual and methodological exchange between the two disciplines is disputed in the literature. On the one hand, behavioral economic models are often presented as psychologically informed models of individual decision-making (Camerer & Loewenstein, 2003). On the other hand, these models have often been criticized for being merely more elaborated "as if" economic models (Berg & Gigerenzer, 2010). The aim of this paper is to contribute to this debate by looking at a central topic in behavioral economics: the case of social preferences. Have findings or research methods been exchanged between psychology and economics in this research area? Have scientists with different backgrounds "travelled" across domains, thus transferring their expertise from one discipline to another? By addressing these and related questions, this paper will assess the level of knowledge transfer between psychology and economics in the study of social preferences.

19.
Forecasters are concerned with the accuracy of a forecast and whether the forecast can be modified to yield an improved performance. Theil has proposed statistics to measure forecast performance and to identify components of forecast error. However, the most commonly used of Theil's statistics have been shown to have serious shortcomings. This paper discusses Theil's decomposition of forecast error into bias, regression and disturbance proportions. Examples using price expectations and new housing starts data are given to show how decomposition suggests a linear correction procedure that may improve forecast accuracy.
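Theil's decomposition splits mean squared error as MSE = (P̄ − Ā)² + (s_P − r·s_A)² + (1 − r²)·s_A², where P̄, Ā are the means of predictions and actuals, s_P, s_A their (population) standard deviations, and r their correlation; dividing each term by MSE gives the bias, regression and disturbance proportions named above, which sum to one. A pure-Python sketch:

```python
import math

def theil_proportions(pred, actual):
    """Theil's decomposition of MSE into bias, regression and disturbance
    proportions (population moments; the three proportions sum to one)."""
    n = len(pred)
    mp = sum(pred) / n
    ma = sum(actual) / n
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred) / n)
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual) / n)
    r = sum((p - mp) * (a - ma) for p, a in zip(pred, actual)) / (n * sp * sa)
    mse = sum((p - a) ** 2 for p, a in zip(pred, actual)) / n
    bias = (mp - ma) ** 2 / mse
    regression = (sp - r * sa) ** 2 / mse
    disturbance = (1.0 - r * r) * sa * sa / mse
    return bias, regression, disturbance
```

A large bias or regression proportion signals a systematic error that a linear correction of the forecast can remove, which is the correction procedure the paper discusses; only the disturbance proportion is irreducible by linear adjustment.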

20.
With the development of artificial intelligence, deep learning is widely used for nonlinear time series forecasting, and deep learning models have been shown in practice to achieve higher forecasting accuracy than traditional linear econometric models and machine learning models. To further improve the forecasting accuracy of financial time series, we propose the WT-FCD-MLGRU model, which combines wavelet transform, filter cycle decomposition and multilag neural networks. Four major stock indices are chosen to compare the forecasting performance of traditional econometric, machine learning and deep learning models. The empirical analysis shows that deep learning models outperform a traditional econometric model, the autoregressive integrated moving average, and an improved machine learning model, SVR. Moreover, the proposed model achieves the minimum forecasting error in stock index prediction.
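As a minimal illustration of the wavelet-transform stage of such a pipeline, the sketch below performs one level of the Haar discrete wavelet transform in pure Python; the paper's WT-FCD-MLGRU model presumably uses a richer wavelet, so this is only an assumption-laden toy version of the decomposition idea.

```python
import math

def haar_step(x):
    """One level of the Haar DWT: pairwise averages (approximation) and
    pairwise differences (detail), each scaled by sqrt(2) so that total
    energy is preserved. Input length must be even."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

approx, detail = haar_step([4.0, 6.0, 10.0, 12.0])
```

The approximation coefficients carry the slow-moving part of the series and the detail coefficients the fast fluctuations; each can then be forecast by a separate network before recombining, as in the hybrid models surveyed on this page.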
