20 similar records found (search time: 15 ms)
1.
The existing contradictory findings on the contribution of trading volume to volatility forecasting prompt us to seek new ways to test the sequential information arrival hypothesis (SIAH). Departing from other empirical analyses that focus mainly on sophisticated testing methods, this research offers new insights into the volume-volatility nexus by decomposing and reconstructing trading activity into short-run components, which typically represent irregular information flow, and long-run components, which denote extreme information flow in the stock market. We are the first to incorporate an improved empirical mode decomposition (EMD) method, combined with the heterogeneous autoregressive (HAR) model, to investigate the volatility forecasting ability of trading volume. To ensure an ex ante forecast, past trading volume is used to obtain the decompositions, and both the decomposition and forecasting steps are carried out under a rolling-window scheme. The results show that the reconstructed components, rather than trading volume by itself, significantly improve out-of-sample realized volatility (RV) forecasts. This finding is robust at both one-step-ahead and multiple-step-ahead forecasting horizons under different estimation windows. We thus fill a gap in the literature by (1) extending work on the volume-volatility linkage to EMD-HAR analysis and (2) providing a clear view of how trading volume helps improve RV forecasting accuracy.
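The decompose-then-forecast idea can be sketched as follows. This is not the paper's exact method: the improved EMD is replaced by a simple moving-average split of trading volume into a long-run trend and a short-run remainder, and the HAR coefficients are purely illustrative.

```python
def decompose_volume(volume, window=22):
    """Split volume into a long-run moving-average component and a
    short-run residual; the two components reconstruct the series."""
    long_run = []
    for t in range(len(volume)):
        lo = max(0, t - window + 1)
        long_run.append(sum(volume[lo:t + 1]) / (t + 1 - lo))
    short_run = [v - l for v, l in zip(volume, long_run)]
    return short_run, long_run

def har_forecast(rv, vol_short, vol_long, coef):
    """One-step-ahead HAR-RV forecast augmented with the two volume
    components; coef = (b0, b_d, b_w, b_m, b_s, b_l) is hypothetical."""
    b0, bd, bw, bm, bs, bl = coef
    rv_d = rv[-1]                    # daily RV
    rv_w = sum(rv[-5:]) / 5          # weekly average RV
    rv_m = sum(rv[-22:]) / 22        # monthly average RV
    return (b0 + bd * rv_d + bw * rv_w + bm * rv_m
            + bs * vol_short[-1] + bl * vol_long[-1])
```

In the paper's rolling-window scheme, both the decomposition and the coefficient estimates would be refit each day using only past data; the sketch shows a single evaluation.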
2.
Gökçe Soydemir, Journal of Forecasting, 2000, 19(3): 149-176
This paper investigates the transmission patterns of stock market movements between developed and emerging market economies by estimating a four‐variable VAR model. The underlying economic fundamentals and trade links are considered as possible determinants of differences in transmission patterns. The results of the impulse response functions and variance decompositions indicate that significant links exist between the stock markets of the USA and Mexico and weaker links between the markets of the USA, Argentina, and Brazil. Differences in the patterns of stock market responses are consistent with differences in trade flows. The response of emerging markets to a shock to the US market lasts longer than that of a developed market such as the UK. While no single emerging market can affect the US stock market, the combined effect of emerging markets on the US stock market is found to be statistically significant. These findings can be linked to differences in the speed of information processing and to the institutional structure governing the market. Overall, the findings suggest that the transmission of stock market movements is in accord with underlying economic fundamentals rather than irrational contagion effects. Copyright © 2000 John Wiley & Sons, Ltd.
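The impulse responses used in such an exercise are mechanical once the VAR coefficients are in hand. As a minimal sketch (the paper estimates a four-variable VAR; the hypothetical 2×2 coefficient matrix `A` below is made up to keep the example short), the IRF at horizon h is simply A^h applied to the initial shock vector:

```python
def matvec(A, x):
    """Multiply a square matrix (list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def impulse_response(A, shock, horizons):
    """IRF of a VAR(1) y_t = A y_{t-1} + e_t: horizon h response is A^h shock."""
    irf, x = [], shock[:]
    for _ in range(horizons + 1):
        irf.append(x[:])
        x = matvec(A, x)
    return irf

A = [[0.5, 0.2],   # hypothetical own-persistence of market 1
     [0.4, 0.3]]   # hypothetical spillover to market 2
irf = impulse_response(A, [1.0, 0.0], 10)  # unit shock to market 1
```

A stable VAR (eigenvalues of A inside the unit circle) gives responses that die out, which is what allows comparing how long a US shock persists in different markets.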
3.
The success of any timing strategy depends on the accuracy of market forecasts. In this paper, we test five indices to forecast the 1‐month‐ahead performance of the S&P 500 Index. These indices reflect investor sentiment, current business conditions, economic policy uncertainty, and market dislocation information. Each model is used in a logistic regression analysis to predict the 1‐month‐ahead market direction, and the forecasts are used to adjust the portfolio's beta. Beta optimization refers to a strategy designed to create a portfolio beta of 1.0 when the market is expected to go up, and a beta of −1.0 when a bear market is expected. Successful application of this strategy generates returns that are consistent with a call option or an option straddle position; that is, positive returns are generated in both up and down markets. Analysis reveals that the models' forecasts have discriminatory power in identifying substantial market movements, particularly during the bursting of the tech bubble and the financial crisis. Four of the five forecast models tested outperform the benchmark index.
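The beta-optimization rule reduces to a simple switch. In this sketch the logistic coefficients and the indicator are hypothetical stand-ins (the paper fits one model per index); the point is only the mapping from an up-move probability to a ±1 beta and the resulting straddle-like payoff:

```python
import math

def up_probability(indicator, b0=-0.2, b1=1.5):
    """Hypothetical pre-fitted logistic model: P(market up next month)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * indicator)))

def choose_beta(indicator):
    """Beta +1.0 if an up market is forecast, -1.0 if a bear market is."""
    return 1.0 if up_probability(indicator) > 0.5 else -1.0

def strategy_return(indicator, market_return):
    """The timed portfolio earns beta times the market return."""
    return choose_beta(indicator) * market_return
```

When the direction call is correct, the return is positive whether the market rises or falls, which is why the payoff resembles a straddle.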
4.
Modeling and forecasting aggregate stock market volatility in unstable environments using mixture innovation regressions
Nima Nonejad, Journal of Forecasting, 2017, 36(6): 718-740
We perform Bayesian model averaging across different regressions selected from a set of predictors that includes lags of realized volatility, financial and macroeconomic variables. In our model average, we entertain different channels of instability by incorporating breaks in the regression coefficients of each individual model, breaks in the conditional error variance, or both. Changes in these parameters are driven by mixture distributions for state innovations (MIA) of linear Gaussian state‐space models. This framework allows us to compare models that assume small and frequent changes as well as models that assume large but rare changes in the conditional mean and variance parameters. Results using S&P 500 monthly and quarterly realized volatility data from 1960 to 2014 suggest that Bayesian model averaging, in combination with breaks in the regression coefficients and the error variance through MIA dynamics, generates statistically significantly more accurate forecasts than the benchmark autoregressive model. However, compared to an MIA autoregression with breaks in the regression coefficients and the error variance, we find no drastic improvement.
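The model-averaging step itself is generic and can be sketched independently of the MIA dynamics. A common approximation (not the paper's posterior computation) weights each model by exp(-BIC/2), normalized across the model set; the BIC and forecast values below are made up:

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities, w_i proportional to
    exp(-BIC_i / 2); subtracting the minimum BIC avoids underflow."""
    best = min(bics)
    raw = [math.exp(-0.5 * (b - best)) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

def bma_forecast(forecasts, bics):
    """Averaged forecast: weight each model's forecast by its probability."""
    return sum(w * f for w, f in zip(bma_weights(bics), forecasts))
```

Models with markedly worse fit receive near-zero weight, so the average is dominated by the few best-supported specifications.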
5.
In this paper we apply cointegration and Granger-causality analyses to construct linear and neural network error-correction models for an Austrian Initial Public Offerings IndeX (IPOXATX). We use the significant relationship between the IPOXATX and the Austrian Stock Market Index ATX to forecast the IPOXATX. For prediction purposes we apply augmented feedforward neural networks whose architecture is determined by Sequential Network Construction with the Schwarz Information Criterion as an estimator of the prediction risk. Trading based on the forecasts yields results superior to Buy and Hold or Moving Average trading strategies in terms of mean-variance considerations.
6.
This study is the first to examine the impacts of overnight and intraday oil futures cross-market information on predicting US stock market volatility using high-frequency data. In-sample estimates show that high overnight oil futures RV leads to high RV of the S&P 500. Moreover, negative overnight returns have more predictive power than positive components, implying the existence of a leverage effect. From statistical and economic perspectives, out-of-sample results indicate that the decompositions of overnight oil futures and intraday RVs, based on signed intraday returns, can significantly increase the models' predictive ability. Finally, when considering the US stock market overnight effect, the decompositions remain useful for predicting volatility, especially during high US stock market fluctuations and in both high and low EPU states.
7.
Janchung Wang, Journal of Forecasting, 2009, 28(4): 277-292
This study applies the general equilibrium model of stock index futures with both stochastic market volatility and stochastic interest rates to the TAIFEX and the SGX Taiwan stock index futures data, and compares the predictive power of the cost of carry and the general equilibrium models. This study also represents the first attempt to investigate which of five volatility estimators can enhance the forecasting performance of the general equilibrium model. Additionally, the impact of the up‐tick rule and other explanatory factors on mispricing is tested using a regression framework. Overall, the general equilibrium model outperforms the cost of carry model in forecasting prices of the TAIFEX and the SGX futures. This finding indicates that, given the relatively high volatility of the Taiwan stock market, incorporating stochastic market volatility into the pricing model helps in predicting the prices of these two futures. Furthermore, the comparison of different volatility estimators supports the conclusion that the power EWMA and the GARCH(1,1) estimators enhance the forecasting performance of the general equilibrium model relative to the other estimators. Additionally, the relaxation of the up‐tick rule helps reduce the degree of mispricing. Copyright © 2008 John Wiley & Sons, Ltd.
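The two volatility estimators the study singles out are both one-line variance recursions. The parameter values below (lambda, omega, alpha, beta) are illustrative defaults, not the paper's estimates, and the "power EWMA" variant would raise returns to a power other than 2:

```python
def ewma_variance(returns, lam=0.94, var0=0.0):
    """Exponentially weighted moving-average variance recursion."""
    var = var0
    for r in returns:
        var = lam * var + (1.0 - lam) * r * r
    return var

def garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.9, var0=0.0):
    """GARCH(1,1) filtered variance: omega + alpha*r^2 + beta*var."""
    var = var0
    for r in returns:
        var = omega + alpha * r * r + beta * var
    return var
```

Both recursions weight recent squared returns most heavily, which is why they adapt quickly in a high-volatility market such as Taiwan's.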
8.
As a representative emerging financial market, the Chinese stock market is more prone to volatility driven by investor sentiment. It is therefore reasonable to use efficient predictive methods to analyze the influence of investor sentiment on stock price forecasting. This paper conducts a comparative study of the predictive performance of artificial neural networks, support vector regression (SVR), and the autoregressive integrated moving average model, and selects SVR to study the asymmetric effect of investor sentiment on different industry index predictions. After studying the relevant financial indicators, the results divide the Shenwan first-level industries into two types and show that the industries affected by investor sentiment consist of young companies with high growth and high operating pressure, in which investment bubbles are numerous.
9.
10.
This paper introduces a novel generalized autoregressive conditional heteroskedasticity–mixed data sampling–extreme shocks (GARCH-MIDAS-ES) model for stock volatility to examine whether the importance of extreme shocks changes in different time ranges. Based on different combinations of the short- and long-term effects caused by extreme events, we extend the standard GARCH-MIDAS model to characterize the different responses of the stock market for short- and long-term horizons, separately or in combination. The unique timespan of nearly 100 years of the Dow Jones Industrial Average (DJIA) daily returns allows us to understand the stock market volatility under extreme shocks from a historical perspective. The in-sample empirical results clearly show that the DJIA stock volatility is best fitted to the GARCH-MIDAS-SLES model by including the short- and long-term impacts of extreme shocks for all forecasting horizons. The out-of-sample results and robustness tests emphasize the significance of decomposing the effect of extreme shocks into short- and long-term effects to improve the accuracy of the DJIA volatility forecasts.
11.
Auditors must assess their clients' ability to function as a going concern for at least the year following the financial statement date. The audit profession has been severely criticized for failure to ‘blow the whistle’ in numerous highly visible bankruptcies that occurred shortly after unmodified audit opinions were issued. Financial distress indicators examined in this study are one mechanism for making such assessments. This study measures and compares the predictive accuracy of an easily implemented two‐variable bankruptcy model originally developed using recursive partitioning on an equally proportioned data set of 202 firms. In this study, we test the predictive accuracy of this model, as well as previously developed logit and neural network models, using a realistically proportioned set of 14,212 firms' financial data covering the period 1981–1990. The previously developed recursive partitioning model had an overall accuracy for all firms ranging from 95 to 97% which outperformed both the logit model at 93 to 94% and the neural network model at 86 to 91%. The recursive partitioning model predicted the bankrupt firms with 33–58% accuracy. A sensitivity analysis of recursive partitioning cutting points indicated that a newly specified model could achieve an all firm and a bankrupt firm predictive accuracy of approximately 85%. Auditors will be interested in the Type I and Type II error tradeoffs revealed in a detailed sensitivity table for this easily implemented model. Copyright © 2000 John Wiley & Sons, Ltd.
12.
The study of brand choice decisions with multiple alternatives has been successfully modelled for more than a decade using the Multinomial Logit model. Recently, neural network modelling has received increasing attention and has been applied to an array of marketing problems such as market response or segmentation. We show that a Feedforward Neural Network with Softmax output units and shared weights can be viewed as a generalization of the Multinomial Logit model. The main difference between the two approaches lies in the ability of neural networks to model non‐linear preferences with few (if any) a priori assumptions about the nature of the underlying utility function, while the Multinomial Logit can suffer from a specification bias. Being complementary, these approaches are combined into a single framework. The neural network is used as a diagnostic and specification tool for the Logit model, which will provide interpretable coefficients and significance statistics. The method is illustrated on an artificial dataset where the market is heterogeneous. We then apply the approach to panel scanner data of purchase records, using the Logit to analyse the non‐linearities detected by the neural network. Copyright © 2000 John Wiley & Sons, Ltd.
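The equivalence the authors exploit is easy to see in code: with linear utilities and shared weights, a softmax output layer reproduces Multinomial Logit choice probabilities exactly. The attribute values and weights in this sketch are made up for illustration:

```python
import math

def softmax(utilities):
    """Softmax with max-subtraction for numerical stability."""
    m = max(utilities)
    exps = [math.exp(u - m) for u in utilities]
    s = sum(exps)
    return [e / s for e in exps]

def mnl_probabilities(attributes, weights):
    """Linear utility u_j = w . x_j for each alternative j, then softmax;
    this is exactly the Multinomial Logit choice probability."""
    utilities = [sum(w * x for w, x in zip(weights, xj)) for xj in attributes]
    return softmax(utilities)
```

For two alternatives this collapses to the binary logit formula 1/(1 + exp(-(u_1 - u_2))); the neural network generalizes the linear utility to a learned non-linear function of the attributes.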
13.
Nicholas Apergis, Journal of Forecasting, 2020, 39(2): 220-241
This paper investigates the impact of financial market imperfections on small and medium-sized enterprise (SME) firms' profitability using a unique panel dataset of US SME firms spanning the period 1979–2017. The dataset makes use of unique information on proxies of market imperfections pertaining to each firm in the sample. First, the findings document the statistical impact of those financial market imperfections on profitability. Moreover, the forecasting exercise illustrates the superiority of the model that explicitly includes those proxies.
14.
Issuing a going-concern opinion is a difficult and complex task for auditors. The auditors have to take into account different critical factors in order to make the right decision based on information obtained from the auditing process. This study adopts the so-called “random forest” approach (based on the ensemble method) to assist auditors in making such a decision. To investigate the corresponding effect of the proposed approach, we conduct a series of experiments and a performance comparison. The results show that the random forest method outperforms the baseline methods in terms of the accuracy rate, ROC area, kappa value, type II error, precision, and recall rate. The proposed approach is proven to be more accurate and stable than previous methods.
15.
Henrik Amilon, Journal of Forecasting, 2003, 22(4): 317-335
An erratum for this article has been published in Journal of Forecasting, 2003, 22(6-7): 551. The Black–Scholes formula is a well‐known model for pricing and hedging derivative securities. It relies, however, on several highly questionable assumptions. This paper examines whether a neural network (MLP) can be used to find a call option pricing formula better corresponding to market prices and the properties of the underlying asset than the Black–Scholes formula. The neural network method is applied to the out‐of‐sample pricing and delta‐hedging of daily Swedish stock index call options from 1997 to 1999. The relevance of a hedge analysis is stressed further in this paper. As benchmarks, the Black–Scholes model with historical and implied volatility estimates is used. Comparisons reveal that the neural network models outperform the benchmarks in both pricing and hedging performance. A moving block bootstrap is used to test the statistical significance of the results. Although the neural networks are superior, the results are sometimes insignificant at the 5% level. Copyright © 2003 John Wiley & Sons, Ltd.
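The benchmark the network is tested against is the standard Black–Scholes call price together with its delta (the hedge ratio used in delta-hedging). A minimal implementation:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price and delta.
    S: spot, K: strike, T: years to expiry, r: risk-free rate, sigma: vol."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    price = S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
    delta = norm_cdf(d1)            # hedge ratio: shares held per call sold
    return price, delta
```

The "historical" and "implied" benchmark variants differ only in how sigma is estimated before being plugged into this formula.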
16.
Predicting the Distribution of Stock Returns: Model Formulation, Statistical Evaluation, VaR Analysis and Economic Significance
Daniele Massacci, Journal of Forecasting, 2015, 34(3): 191-208
A large literature has investigated predictability of the conditional mean of low‐frequency stock returns by macroeconomic and financial variables; however, little is known about predictability of the conditional distribution. We look at one‐step‐ahead out‐of‐sample predictability of the conditional distribution of monthly US stock returns in relation to the macroeconomic and financial environment. Our methodological approach is innovative: we consider several specifications for the conditional density and several combination schemes. Our results are as follows: the entire density is predicted under combination schemes as applied to univariate GARCH models with Gaussian innovations; the Bayesian winner in relation to GARCH‐skewed‐t models is informative about the 5% value at risk; the average realised utility of a mean–variance investor is maximised under the Bayesian winner as applied to GARCH models with symmetric Student t innovations. Our results have two implications: the best prediction model depends on the evaluation criterion; and combination schemes outperform individual models. Copyright © 2015 John Wiley & Sons, Ltd.
17.
Last Night a Shrinkage Saved My Life: Economic Growth, Model Uncertainty and Correlated Regressors
Paul Hofmarcher, Jesús Crespo Cuaresma, Bettina Grün, Kurt Hornik, Journal of Forecasting, 2015, 34(2): 133-144
We compare the predictive ability of Bayesian methods which deal simultaneously with model uncertainty and correlated regressors in the framework of cross‐country growth regressions. In particular, we assess methods with spike and slab priors combined with different prior specifications for the slope parameters in the slab. Our results indicate that moving away from Gaussian g‐priors towards Bayesian ridge, LASSO or elastic net specifications has clear advantages for prediction when dealing with datasets of (potentially highly) correlated regressors, a pervasive characteristic of the data used hitherto in the econometric literature. Copyright © 2015 John Wiley & Sons, Ltd.
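The intuition for why shrinkage helps with correlated regressors can be sketched with the ridge estimator in closed form, beta = (X'X + kI)^{-1} X'y, here hard-coded for two predictors (the data and penalty below are made up, and this is frequentist ridge standing in for the Bayesian ridge prior):

```python
def solve2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a21 * b1) / det)

def ridge2(x1, x2, y, k):
    """Ridge coefficients for two predictors: (X'X + k I)^{-1} X'y."""
    s11 = sum(a * a for a in x1) + k
    s22 = sum(a * a for a in x2) + k
    s12 = sum(a * b for a, b in zip(x1, x2))
    c1 = sum(a * b for a, b in zip(x1, y))
    c2 = sum(a * b for a, b in zip(x2, y))
    return solve2(s11, s12, s12, s22, c1, c2)
```

As k grows the coefficient vector shrinks toward zero, which stabilizes estimates when x1 and x2 are nearly collinear and X'X is close to singular; k = 0 recovers OLS.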
18.
Probability distributions, trading strategies and leverage: an application of Gaussian mixture models
The purpose of this paper is twofold. Firstly, to assess the merit of estimating probability density functions rather than level or classification estimations on a one‐day‐ahead forecasting task of the EUR/USD time series. This is implemented using a Gaussian mixture model neural network, benchmarking the results against standard forecasting models, namely a naïve model, a moving average convergence divergence technical model (MACD), an autoregressive moving average model (ARMA), a logistic regression model (LOGIT) and a multi‐layer perceptron network (MLP). Secondly, to examine the possibilities of improving the trading performance of those models with confirmation filters and leverage. While the benchmark models perform best without confirmation filters and leverage, the Gaussian mixture model outperforms all of the benchmarks when taking advantage of the possibilities offered by a combination of more sophisticated trading strategies and leverage. This might be due to the ability of the Gaussian mixture model to successfully identify trades with a high Sharpe ratio. Copyright © 2004 John Wiley & Sons, Ltd.
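What a density forecast looks like in this setting can be sketched as a two-component Gaussian mixture; the component weights, means and standard deviations below are illustrative, not estimated from EUR/USD data:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def mixture_pdf(x, weights, mus, sigmas):
    """Gaussian mixture density: weighted sum of component normals."""
    return sum(w * normal_pdf(x, m, s)
               for w, m, s in zip(weights, mus, sigmas))
```

Unlike a level or classification forecast, the full density supports trading rules keyed to tail probabilities and confirmation thresholds, which is where the mixture model's advantage with filters and leverage comes from.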
19.
The objectives of this paper are: first, to show empirically the relevance of using adaptive estimation techniques over more traditional estimation approaches when economic systems are believed to be structurally unstable over time; and secondly, to compare in an empirical framework two adaptive estimation techniques: Kalman filtering and the Carbone–Longini filter. For that purpose, an econometric model for the U.S. pulp and paper market is examined under the assumption of structural instability and, hence, constitutes the basis for comparing forecasting performances and estimation accuracy achieved by each technique. A version of Kalman filtering, modified in line with the basic idea of ‘tracking’ characterizing the Carbone–Longini filter, is also presented and applied. The analysis of the results shows that it may be worth using adaptive estimation methods to estimate structurally unstable models, even if there is no prior knowledge about the patterns of variation of the parameters. Also, it shows the Carbone–Longini filter and Kalman filtering to be complementary estimation techniques. An estimation/forecasting methodology involving a sequential application mode of these two techniques is suggested.
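A minimal sketch of the Kalman side of this comparison: a scalar filter tracking a random-walk parameter observed with noise (state b_t = b_{t-1} + w_t, observation y_t = b_t + v_t). The noise variances q and r are illustrative tuning choices, not values from the paper:

```python
def kalman_track(ys, q=0.01, r=1.0, b0=0.0, p0=1.0):
    """Scalar Kalman filter for a time-varying parameter.
    q: state-innovation variance, r: observation variance."""
    b, p, path = b0, p0, []
    for y in ys:
        p = p + q                  # predict: uncertainty grows by q
        k = p / (p + r)            # Kalman gain
        b = b + k * (y - b)        # update toward the new observation
        p = (1.0 - k) * p          # posterior variance
        path.append(b)
    return path
```

Because the gain k never shrinks to zero (q > 0 keeps injecting uncertainty), the estimate keeps adapting when the true parameter drifts; the Carbone–Longini filter plays a similar tracking role with a different updating rule.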
20.
This article compares the forecast accuracy of different methods, namely prediction markets, tipsters and betting odds, and assesses the ability of prediction markets and tipsters to generate profits systematically in a betting market. We present the results of an empirical study that uses data from 678–837 games of three seasons of the German premier soccer league. Prediction markets and betting odds perform equally well in terms of forecasting accuracy, but both methods strongly outperform tipsters. A weighting‐based combination of the forecasts of these methods leads to a slightly higher forecast accuracy, whereas a rule‐based combination improves forecast accuracy substantially. However, none of the forecasts leads to systematic monetary gains in betting markets because of the high fees (25%) charged by the state‐owned bookmaker in Germany. Lower fees (e.g., approximately 12% or 0%) would provide systematic profits if punters exploited the information from prediction markets and bet only on a selected number of games. Copyright © 2008 John Wiley & Sons, Ltd.