Similar Documents
20 similar documents found.
1.
We propose a new approach to estimating value-at-risk. We use six international stock price indices and three hypothetical portfolios formed from these indices, observed daily from 1 January 1996 to 31 December 2006. Confirmed by the failure rates and the backtests developed by Kupiec (Technique for verifying the accuracy of risk measurement models. Journal of Derivatives 1995; 3: 73–84) and Christoffersen (Evaluating interval forecasts. International Economic Review 1998; 39: 841–862), the empirical results show that our method considerably improves the estimation accuracy of value-at-risk. The study thus establishes an effective alternative model for risk prediction and provides a reliable tool for portfolio management. Copyright © 2011 John Wiley & Sons, Ltd.
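
As a concrete reference for the backtests mentioned above, here is a minimal Python sketch of Kupiec's proportion-of-failures (POF) likelihood-ratio test; the violation series and nominal level below are hypothetical, and the formula assumes 0 < x < T violations.

import numpy as np
from scipy.stats import chi2

def kupiec_pof(violations, p):
    # Proportion-of-failures LR test (Kupiec, 1995); assumes 0 < x < T.
    T = len(violations)
    x = int(np.sum(violations))
    pi_hat = x / T
    ll_null = (T - x) * np.log(1 - p) + x * np.log(p)
    ll_alt = (T - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)
    lr = -2.0 * (ll_null - ll_alt)
    return lr, chi2.sf(lr, df=1)

rng = np.random.default_rng(0)
hits = np.zeros(1000)
hits[rng.choice(1000, size=17, replace=False)] = 1   # hypothetical 1% VaR hits
lr, pval = kupiec_pof(hits, p=0.01)
print(f"LR = {lr:.2f}, p-value = {pval:.3f}")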

2.
This article develops a new method for detrending time series. It is shown how, in a Bayesian framework, a generalized version of the Hodrick–Prescott filter is obtained by specifying prior densities on the signal‐to‐noise ratio (q) in the underlying unobserved components model. This helps ensure an appropriate degree of smoothness in the estimated trend while allowing for uncertainty in q. The article discusses the important issue of prior elicitation for time series recorded at different frequencies. By combining prior expectations with the likelihood, the Bayesian approach permits detrending in a way that is more consistent with the properties of the series. The method is illustrated with some quarterly and annual US macroeconomic series. Copyright © 2006 John Wiley & Sons, Ltd.
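
For orientation, the fixed-q Hodrick–Prescott trend solves a penalized least-squares problem with λ = 1/q; the sketch below computes it and then forms a crude discrete-prior average over q to convey the flavour of the Bayesian generalization. The grid, prior weights, and series are hypothetical, and the paper's full posterior treatment is richer than this.

import numpy as np

def hp_trend(y, lam):
    # argmin_tau ||y - tau||^2 + lam * ||second differences of tau||^2,
    # solved from the normal equations (I + lam * D'D) tau = y.
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=120)) + 0.05 * np.arange(120)  # hypothetical series
q_grid = [1 / 400, 1 / 1600, 1 / 6400]   # hypothetical prior support for q
prior = [0.25, 0.50, 0.25]               # hypothetical prior weights
trend = sum(w * hp_trend(y, 1.0 / q) for w, q in zip(prior, q_grid))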

3.
Forecasts from quarterly econometric models are typically revised on a monthly basis to reflect the information in current economic data. The revision process usually involves setting targets for the quarterly values of endogenous variables for which monthly observations are available and then altering the intercept terms in the quarterly forecasting model to achieve the target values. A formal statistical approach to the use of monthly data to update quarterly forecasts is described and the procedure is applied to the Michigan Quarterly Econometric Model of the US Economy. The procedure is evaluated in terms of both ex post and ex ante forecasting performance. The ex ante results for 1986 and 1987 indicate that the method is quite promising. With a few notable exceptions, the formal procedure produces forecasts of GNP growth that are very close to the published ex ante forecasts.

4.
The vector multiplicative error model (vector MEM) is capable of analyzing and forecasting multidimensional non‐negative valued processes. Its parameters are usually estimated by generalized method of moments (GMM) or maximum likelihood (ML) methods. However, these estimates can be heavily affected by outliers. To overcome this problem, this paper proposes an alternative approach, the weighted empirical likelihood (WEL) method. This method uses moment conditions as constraints, and outliers are detected automatically by performing k‐means clustering on Oja depth values of the innovations. The performance of WEL is evaluated against that of the GMM and ML methods through extensive simulations in which three different kinds of additive outliers are considered. Moreover, the robustness of WEL is demonstrated by comparing the volatility forecasts of the three methods on 10‐minute returns of the S&P 500 index. The results from both the simulations and the S&P 500 volatility forecasts favor the WEL method. Copyright © 2012 John Wiley & Sons, Ltd.

5.
Forecast combination based on a model selection approach is discussed and evaluated. In addition, a combination approach based on ex ante predictive ability is outlined. The model selection approach which we examine is based on the use of the Schwarz (SIC) or Akaike (AIC) information criterion. Monte Carlo experiments based on combination forecasts constructed using (possibly misspecified) models suggest that the SIC offers a potentially useful combination approach, and that further investigation is warranted. For example, combination forecasts from a simple averaging approach MSE‐dominate SIC combination forecasts less than 25% of the time in most cases, while other ‘standard’ combination approaches fare even worse. Alternative combination approaches are also compared by conducting forecasting experiments using nine US macroeconomic variables. In particular, artificial neural networks (ANN), linear models, and professional forecasts are used to form real‐time forecasts of the variables, and it is shown via a series of experiments that SIC, t‐statistic, and averaging combination approaches dominate various other combination approaches. An additional finding is that while ANN models may not MSE‐dominate simpler linear models, combinations of forecasts from these two models outperform either individual forecast for a subset of the economic variables examined. Copyright © 2001 John Wiley & Sons, Ltd.
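
One standard way to turn an information criterion into combination weights is the exponential ('smoothed SIC') scheme sketched below; this is a common device rather than necessarily the paper's exact rule, and the residuals, parameter counts, and forecasts are hypothetical.

import numpy as np

def sic(resid, k):
    # Schwarz criterion for a fitted model: T*ln(SSR/T) + k*ln(T).
    T = len(resid)
    return T * np.log(np.sum(resid ** 2) / T) + k * np.log(T)

def sic_weights(sic_values):
    # Exponential 'smoothed SIC' weights: w_i proportional to exp(-0.5 * delta_i).
    d = np.asarray(sic_values) - np.min(sic_values)
    w = np.exp(-0.5 * d)
    return w / w.sum()

rng = np.random.default_rng(2)
resids = [rng.normal(0, s, 200) for s in (1.00, 1.02, 1.10)]   # hypothetical
sic_vals = [sic(r, k) for r, k in zip(resids, (2, 3, 5))]      # k = no. of parameters
w = sic_weights(sic_vals)
f = np.array([[1.9, 2.1], [2.3, 2.2], [2.0, 2.4]])             # hypothetical forecasts
combined = w @ f                                               # per-horizon combination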

6.
The metabolism of all-trans- and 9-cis-retinol/retinaldehyde has been investigated with a focus on the activities of human, mouse and rat alcohol dehydrogenase 2 (ADH2), an intriguing enzyme with apparently different functions in humans and rodents. Kinetic constants were determined with an HPLC method, and a structural approach was implemented by in silico substrate dockings. For human ADH2, the determined Km values ranged from 0.05 to 0.3 μM and kcat values from 2.3 to 17.6 min⁻¹, while the catalytic efficiency for 9-cis-retinol showed the highest value for any substrate. In contrast, poor activities were detected for the rodent enzymes. A mouse ADH2 mutant (ADH2Pro47His) that resembles the human ADH2 setup was studied. This mutation increased the retinoid activity up to 100-fold. The Km values of human ADH2 are the lowest among all known human retinol dehydrogenases, which clearly supports a role in hepatic retinol oxidation at physiological concentrations. Received 12 October 2006; received after revision 6 December 2006; accepted 8 January 2007
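
As a quick reading aid, catalytic efficiency is the ratio kcat/Km; the envelope below pairs extremes across different substrates, so it is only illustrative arithmetic on the reported ranges, not values stated in the paper.

# Catalytic efficiency kcat/Km from the reported ranges (min^-1 per uM).
km_low, km_high = 0.05, 0.3          # Km range, uM
kcat_low, kcat_high = 2.3, 17.6      # kcat range, min^-1
print(kcat_high / km_low)            # upper envelope: 352 min^-1 uM^-1
print(kcat_low / km_high)            # lower envelope: ~7.7 min^-1 uM^-1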

7.
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|² from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities, on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113], that classical information is quantum information that can proliferate in the environment—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision-theoretic approach to quantum probability.

8.
We propose an innovative approach to modelling and predicting the outcome of football matches, based on the Poisson autoregression with exogenous covariates (PARX) model recently proposed by Agosto, Cavaliere, Kristensen, and Rahbek (Journal of Empirical Finance, 2016, 38(B), 640–663). We show that this methodology is particularly suited to modelling the goal distribution of a football team and provides good forecast performance that can be exploited to develop a profitable betting strategy. This paper improves on the strand of literature on Poisson‐based models by proposing a specification able to capture the main characteristics of the goal distribution. The betting strategy is based on the idea that the odds proposed by the market do not reflect the true probability of the match, because they may also incorporate betting volumes or strategic price settings designed to exploit bettors' biases. The out‐of‐sample performance of the PARX model is better than that of the reference approach by Dixon and Coles (Applied Statistics, 1997, 46(2), 265–280). We also evaluate our approach in a simple betting strategy, which is applied to English Premier League data for the 2013–2014, 2014–2015, and 2015–2016 seasons. The results show that the return from the betting strategy is larger than 30% in most of the cases considered and may even exceed 100% if we consider an alternative strategy based on a predetermined threshold, which makes it possible to exploit the inefficiency of the betting market.
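
To make the model class concrete, here is a minimal simulation of a PARX(1,1)-type goal process; the parameter values and the exogenous covariate are hypothetical, and the paper's specification is more elaborate.

import numpy as np

def simulate_parx(T, omega, alpha, beta, gamma, z, rng):
    # x_t ~ Poisson(lam_t), lam_t = omega + alpha*x_{t-1} + beta*lam_{t-1} + gamma*z_t
    x = np.zeros(T, dtype=int)
    lam = np.zeros(T)
    lam[0] = omega / (1.0 - alpha - beta)   # start at the no-covariate mean
    x[0] = rng.poisson(lam[0])
    for t in range(1, T):
        lam[t] = omega + alpha * x[t - 1] + beta * lam[t - 1] + gamma * z[t]
        x[t] = rng.poisson(lam[t])
    return x, lam

rng = np.random.default_rng(42)
z = rng.normal(size=380) ** 2               # hypothetical non-negative covariate
goals, intensity = simulate_parx(380, omega=0.3, alpha=0.25, beta=0.45,
                                 gamma=0.05, z=z, rng=rng)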

9.
An approach is proposed for obtaining estimates of the basic (disaggregated) series, x_i, when only an aggregate series, y_t, of k-period non-overlapping sums of the x_i's is available. The approach is based on casting the problem in dynamic linear model form. Estimates of x_i can then be obtained by applying Kalman filtering techniques. An ad hoc procedure is introduced for deriving a model form for the unobserved basic series from the observed model of the aggregates. An application of this approach to a set of real data is given.
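
A minimal sketch of the idea: augment the state with a within-block cumulator and run a Kalman filter that only observes the aggregate at block ends. The random-walk model for the basic series and the variance values are assumptions for illustration (the paper instead derives the basic-series model from the observed model of the aggregates), and a smoother would refine the filtered estimates returned here.

import numpy as np

def disaggregate(y_agg, k, sigma_x=1.0, sigma_eps=1e-4):
    # State = [x_t, c_t]: x_t the basic series (random walk, by assumption),
    # c_t a cumulator that restarts each k-period block; the aggregate is
    # observed, nearly without error, at each block end.
    n = len(y_agg) * k
    a = np.zeros(2)
    P = np.eye(2) * 1e6                       # diffuse initialization
    Q = sigma_x ** 2 * np.array([[1.0, 1.0], [1.0, 1.0]])  # common shock in x and c
    H = np.array([[0.0, 1.0]])                # we observe the cumulator
    est = np.zeros(n)
    for t in range(n):
        keep = 0.0 if t % k == 0 else 1.0     # restart the cumulator at block starts
        T = np.array([[1.0, 0.0], [1.0, keep]])
        a = T @ a
        P = T @ P @ T.T + Q
        if (t + 1) % k == 0:                  # aggregate observed at block end
            v = y_agg[(t + 1) // k - 1] - (H @ a)[0]
            S = (H @ P @ H.T)[0, 0] + sigma_eps ** 2
            K = (P @ H.T)[:, 0] / S
            a = a + K * v
            P = P - np.outer(K, H @ P)
        est[t] = a[0]                         # filtered estimate of x_t
    return est

x_hat = disaggregate(np.array([12.0, 15.0, 11.0, 18.0]), k=3)   # hypothetical sums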

10.
The standard approach to combining n expert forecasts involves taking a weighted average. Granger and Ramanathan proposed introducing an intercept term and unnormalized weights. This paper deduces their proposal from Bayesian principles. We find that their formula is equivalent to taking a weighted average of the n expert forecasts plus the decision-maker's prior forecast.
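
The Granger–Ramanathan proposal is an unrestricted OLS regression of outcomes on the expert forecasts plus an intercept, as in this sketch (the data are hypothetical):

import numpy as np

def granger_ramanathan(y, F):
    # Regress outcomes y on the n expert forecasts F (T x n) plus an intercept;
    # the weights are unconstrained OLS coefficients (no positivity/sum-to-one).
    X = np.column_stack([np.ones(len(y)), F])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0], coef[1:]

rng = np.random.default_rng(3)
truth = rng.normal(size=200)                          # hypothetical target
F = np.column_stack([truth + rng.normal(0.0, 0.5, 200),
                     truth + rng.normal(0.2, 0.8, 200)])
intercept, w = granger_ramanathan(truth, F)
combined = intercept + w @ np.array([1.2, 0.9])       # combine new forecasts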

11.
Finding the right partner is a central problem in homologous recombination. Common to all models for general recombination is a homologous pairing and DNA strand exchange step. In prokaryotes this process has mainly been studied with the RecA protein of Escherichia coli. Two approaches have been used to find homologous pairing and DNA strand exchange proteins in eukaryotes. A biochemical approach has resulted in numerous proteins from various organisms. Almost all of these proteins are biochemically fundamentally different from RecA. The in vivo role of these proteins is largely not understood. A molecular-genetical approach has identified structural homologs to the E. coli RecA protein in the yeast Saccharomyces cerevisiae and subsequently in other organisms including other fungi, mammals, birds, and plants. The biochemistry of the eukaryotic RecA homologs is largely unsolved. For the fungal RecA homologs (S. cerevisiae RAD51, RAD55, RAD57, DMC1; Schizosaccharomyces pombe rad51; Neurospora crassa mei3) a role in homologous recombination and recombinational repair is evident. Besides recombination, homologous pairing proteins might be involved in other cellular processes like chromosome pairing or gene inactivation.

12.
Screening for differentially expressed genes is a straightforward approach to study the molecular basis for changes in gene expression. Differential display analysis has been used by investigators in diverse fields of research since it was developed. Differential display has also been the approach of choice to investigate changes in gene expression in response to various biological challenges in invertebrates. We review the application of differential display analysis of gene expression in invertebrates, and provide a specific example using this technique for novel gene discovery in the nematode Caenorhabditis elegans.

13.
In this paper, an optimized multivariate singular spectrum analysis (MSSA) approach is proposed to find leading indicators of cross‐industry relations between 24 monthly, seasonally unadjusted industrial production (IP) series for the German, French, and UK economies. Both recurrent and vector forecasting algorithms of horizontal MSSA (HMSSA) are considered. The results from the proposed multivariate approach are compared with those obtained via the optimized univariate singular spectrum analysis (SSA) forecasting algorithm to determine the statistical significance of each outcome. The data are rigorously tested for normality, the seasonal unit root hypothesis, and structural breaks. The results are presented such that users can not only identify the most appropriate model based on the aim of the analysis, but also easily identify the leading indicators for each IP variable in each country. Our findings show that, for all three countries, forecasts from the proposed MSSA algorithm outperform the optimized SSA algorithm in over 70% of cases. Accordingly, this new approach succeeds in identifying leading indicators and offers a viable way of selecting the SSA choices L and r by minimizing a loss function.
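
For readers unfamiliar with SSA, the univariate building block is sketched below: embed the series with window L, truncate the SVD of the trajectory matrix at rank r, and reconstruct by diagonal averaging. The series, L, and r are hypothetical; the paper's optimized HMSSA procedure chooses L and r by minimizing a loss function.

import numpy as np

def ssa_reconstruct(y, L, r):
    # Embed with window L, truncate the SVD of the trajectory matrix at rank r,
    # and map back to a series by diagonal (anti-diagonal) averaging.
    N = len(y)
    K = N - L + 1
    X = np.column_stack([y[j:j + L] for j in range(K)])   # L x K trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]                      # rank-r approximation
    rec = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(K):                                    # Hankelization
        rec[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    return rec / cnt

rng = np.random.default_rng(7)
y = np.sin(np.linspace(0, 12, 120)) + 0.3 * rng.normal(size=120)  # hypothetical
signal = ssa_reconstruct(y, L=24, r=2)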

14.
A general Bayesian approach to combining n expert forecasts is developed. Under some moderate assumptions on the distributions of the expert errors, it leads to a consistent, monotonic, quasi-linear average formula. This generalizes Bordley's results.
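
For reference, the standard form of a quasi-linear (generalized) average of n forecasts is the following; this is the textbook definition rather than the paper's specific Bayesian derivation:

\[
\hat{x} \;=\; g^{-1}\!\left(\sum_{i=1}^{n} w_i\, g(x_i)\right),
\qquad w_i \ge 0, \quad \sum_{i=1}^{n} w_i = 1,
\]

where g is a continuous, strictly monotone function; g(x) = x recovers the arithmetic mean and g(x) = log x the geometric mean.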

15.
The paper proposes a simulation‐based approach to multistep probabilistic forecasting, applied to predicting the probability and duration of negative inflation. The essence of the approach is to count the simulated paths, drawn from a multivariate distribution representing the probabilistic forecasts, that enter the negative‐inflation regime. The marginal distributions of the forecasts are estimated using the series of past forecast errors, and the joint distribution is obtained by a multivariate copula approach. The technique is applied to estimating the probability of negative inflation in China and its expected duration, with the marginal distributions computed by fitting weighted skew‐normal and two‐piece normal distributions to autoregressive moving average ex post forecast errors and using a multivariate Student t copula.
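
A minimal sketch of the simulation step, using a Student t copula but plain normal marginals as a simplifying stand-in for the paper's weighted skew-normal / two-piece normal fits; the horizon, correlation structure, and all numerical values are hypothetical.

import numpy as np
from scipy.stats import multivariate_t, norm, t

def simulate_paths(n_sims, point_fc, marg_scale, corr, df, seed=0):
    # Draw from a Student t copula, then map to normal marginals centred on
    # the point forecasts (a stand-in for the paper's skew-normal marginals).
    H = len(point_fc)
    z = multivariate_t(loc=np.zeros(H), shape=corr, df=df, seed=seed).rvs(n_sims)
    u = t.cdf(z, df=df)                       # uniform margins: the copula draws
    return norm.ppf(u, loc=point_fc, scale=marg_scale)

H = 8                                         # hypothetical forecast horizon
idx = np.arange(H)
corr = 0.8 ** np.abs(np.subtract.outer(idx, idx))   # hypothetical AR(1)-style corr
paths = simulate_paths(20000, point_fc=np.full(H, 0.4),
                       marg_scale=np.full(H, 0.9), corr=corr, df=5)
neg = paths < 0
prob_negative = neg.any(axis=1).mean()                  # P(negative inflation)
duration = neg.sum(axis=1)[neg.any(axis=1)].mean()      # avg. negative periods, a
                                                        # simple duration proxy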

16.
In this paper, we examine the use of non‐parametric Neural Network Regression (NNR) and Recurrent Neural Network (RNN) regression models for forecasting and trading currency volatility, with an application to the GBP/USD and USD/JPY exchange rates. The results of both the NNR and RNN models are benchmarked against the simpler GARCH alternative and against implied volatility. Two simple model combinations are also analysed. The intuitively appealing idea of developing a nonlinear nonparametric approach to forecast FX volatility, identify mispriced options and subsequently develop a trading strategy based upon this process is implemented for the first time on a comprehensive basis. Using daily data from December 1993 through April 1999, we develop alternative FX volatility forecasting models. These models are then tested out‐of‐sample over the period April 1999–May 2000, not only in terms of forecasting accuracy but also in terms of trading efficiency: to do so, we apply a realistic volatility trading strategy using FX option straddles once mispriced options have been identified. Allowing for transaction costs, most trading strategies retained produce positive returns. RNN models appear to be the best single modelling approach; yet, somewhat surprisingly, model combination, which has the best overall performance in terms of forecasting accuracy, fails to improve the RNN‐based volatility trading results. Another conclusion from our results is that, for the period and currencies considered, the currency option market was inefficient and/or the pricing formulae applied by market participants were inadequate. Copyright © 2002 John Wiley & Sons, Ltd.
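
For reference, the GARCH(1,1) benchmark and its multistep volatility forecast can be sketched in a few lines; the parameter values and return series below are hypothetical.

import numpy as np

def garch11_filter(r, omega, alpha, beta):
    # Conditional variance recursion: s2_t = omega + alpha*r_{t-1}^2 + beta*s2_{t-1}
    s2 = np.empty(len(r))
    s2[0] = np.var(r)                          # initialize at the sample variance
    for t in range(1, len(r)):
        s2[t] = omega + alpha * r[t - 1] ** 2 + beta * s2[t - 1]
    return s2

def forecast_vol(r, omega, alpha, beta, horizon):
    # Multistep variance forecasts mean-revert to omega / (1 - alpha - beta).
    s2 = garch11_filter(r, omega, alpha, beta)
    uncond = omega / (1.0 - alpha - beta)
    f = omega + alpha * r[-1] ** 2 + beta * s2[-1]       # one step ahead
    out = [f]
    for _ in range(horizon - 1):
        f = uncond + (alpha + beta) * (f - uncond)
        out.append(f)
    return np.sqrt(out)

rng = np.random.default_rng(9)
r = rng.normal(0.0, 0.006, 1500)              # hypothetical daily FX returns
vol_path = forecast_vol(r, omega=1e-6, alpha=0.05, beta=0.90, horizon=21)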

17.
This study proposes an explanation for the choice of topics Galileo addressed in Day 1 of his 1638 Two New Sciences, a section of the work which has long puzzled historians of science. I argue that Galileo's agenda in Day 1, that is, the topics he discusses and the questions he poses, was shaped by contemporary teaching commentaries on Books 3 through 8 of Aristotle's Physics. Building on the insights and approach of theorists of reader reception, I confirm this interpretation by examining the response of professors of natural philosophy at the Jesuit Collegio Romano to Galileo's text.

18.
Three problems in Book I of Diophantus' Arithmetica contain the adjective plasmatikon, which appears to qualify an implicit reference to some theorems in Book II of the Elements. The translation and meaning of the adjective sparked a long-lasting controversy that has become a non-negligible aspect of the debate about the possibility of interpreting Diophantus' approach and, more generally, Greek mathematics in algebraic terms. The correct interpretation of the word, a technical term in the Greek rhetorical tradition that perfectly fits the context in which it is inserted in the Arithmetica, entails that Diophantus' text contained no (implicit) reference to Euclid's Elements. The clause containing the adjective turns out to be a later interpolation that cannot be used to support any algebraic interpretation of the Arithmetica.

19.
Summary: An approach to the isolation of neurosecretory material from planarians is described. This material stimulated RNA synthesis, in a dose-dependent response, in regenerating Dugesia tigrina. The data support the concept that neurosecretion plays a key role in the process of regeneration in planarians. Supported by funds from the National Research Council of Canada.

20.
Variance intervention is a simple state-space approach to handling sharp discontinuities of level or slope in the states or parameters of models for non-stationary time series. It derives from earlier procedures used in the 1960s for the design of self-adaptive, state-variable feedback control systems. In the alternative state-space forecasting context considered in the present paper, it is particularly useful when applied to structural time series models. The paper compares the variance intervention procedure with the related ‘subjective intervention’ approach proposed by West and Harrison in a recent issue of the Journal of Forecasting, and demonstrates its efficacy by application to various time-series data, including those used by West and Harrison.
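
A minimal sketch of the device in a local-level model: at user-specified times the state-noise variance is inflated so the filtered level can absorb a sharp discontinuity. The signal-to-noise ratio, boost factor, and data are hypothetical.

import numpy as np

def local_level_filter(y, q, intervene_at=(), boost=1e4):
    # Kalman filter for a local-level model (measurement variance scaled to 1,
    # state-noise variance q); at intervention times q is inflated by `boost`
    # so the level can jump -- the essence of variance intervention.
    a, P = y[0], 1e6                           # diffuse initialization
    level = np.empty(len(y))
    for t in range(len(y)):
        w = q * (boost if t in intervene_at else 1.0)
        P = P + w                              # predict
        K = P / (P + 1.0)                      # update
        a = a + K * (y[t] - a)
        P = (1.0 - K) * P
        level[t] = a
    return level

rng = np.random.default_rng(5)
y = np.r_[rng.normal(0, 1, 60), rng.normal(5, 1, 60)]   # level shift at t = 60
level = local_level_filter(y, q=0.05, intervene_at={60})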

