Similar Literature
 20 similar documents found (search time: 31 ms)
1.
2.
We develop a model to forecast the Federal Open Market Committee's (FOMC's) interest rate setting behavior within the nonstationary discrete choice framework of Hu and Phillips (2004). We find that if the model selection criterion is strictly empirical, correcting for nonstationarity is extremely important, whereas it may not be an issue if one has an a priori model. Evaluating an array of models in terms of their out-of-sample forecasting ability, we find that those favored by the in-sample criteria perform worst, while theory-based models perform best. We find the best model for forecasting the FOMC's behavior is a forward-looking Taylor rule model. Copyright © 2008 John Wiley & Sons, Ltd.
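A minimal sketch of the forward-looking Taylor rule the abstract favors, mapping the gap between the implied target and the current rate to a discrete FOMC action. The coefficients are the textbook 0.5/0.5 values and the decision band is an illustrative assumption, not the paper's estimates:

```python
import numpy as np

def taylor_rule_target(r_star, pi_expected, pi_target, output_gap,
                       a_pi=0.5, a_y=0.5):
    """Forward-looking Taylor rule: the rate responds to *expected*
    inflation rather than realized inflation. Coefficients are the
    textbook 0.5/0.5 values, not estimates from the paper."""
    return r_star + pi_expected + a_pi * (pi_expected - pi_target) + a_y * output_gap

def fomc_action(target, current, band=0.125):
    """Map the target/current gap to a discrete decision, mimicking the
    discrete-choice setup; the band is illustrative."""
    gap = target - current
    if gap > band:
        return "hike"
    if gap < -band:
        return "cut"
    return "hold"

print(fomc_action(taylor_rule_target(2.0, 2.6, 2.0, 0.5), current=4.25))
```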

3.
Prior studies use a linear adaptive expectations model to describe how analysts revise their forecasts of future earnings in response to current forecast errors. However, research shows that extreme forecast errors are less likely than small forecast errors to persist in future years. If analysts recognize this property, their marginal forecast revisions should decrease with the forecast error's magnitude. Therefore, a linear model is likely to be unsatisfactory at describing analysts' forecast revisions. We find that a non-linear model better describes the relation between analysts' forecast revisions and their forecast errors, and provides a richer theoretical framework for explaining analysts' forecasting behaviour. Our results are consistent with analysts' recognizing the permanent and temporary nature of forecast errors of differing magnitudes. Copyright © 2000 John Wiley & Sons, Ltd.
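To make the flattening-marginal-revision idea concrete, here is a sketch that fits a bounded nonlinear revision function and compares it with a linear benchmark. The arctan form and the simulated data are our assumptions, not the paper's specification:

```python
import numpy as np
from scipy.optimize import curve_fit

def revision(error, a, b):
    """Nonlinear revision function: the marginal response flattens for
    extreme errors, consistent with extreme errors being largely
    transitory. The arctan form is an illustrative choice."""
    return a * np.arctan(b * error)

rng = np.random.default_rng(0)
err = rng.normal(scale=2.0, size=400)                       # current forecast errors
rev = 0.8 * np.arctan(1.5 * err) + rng.normal(scale=0.1, size=400)

(a_hat, b_hat), _ = curve_fit(revision, err, rev, p0=[1.0, 1.0])
lin_slope = np.polyfit(err, rev, 1)[0]                      # linear benchmark
print(f"nonlinear fit: a={a_hat:.2f}, b={b_hat:.2f}; linear slope={lin_slope:.2f}")
```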

4.
We consider a forecasting problem that arises when an intervention is expected to occur in an economic system during the forecast horizon. The time series model employed is seen as a statistical device that serves to capture the empirical regularities of the observed data on the variables of the system without relying on a particular theoretical structure. Either the deterministic or the stochastic structure of a vector autoregressive error correction model of the system is assumed to be affected by the intervention. Information about the intervention effect is provided solely by linear restrictions imposed on the future values of the variables involved. Formulas for restricted forecasts with intervention effects and their mean squared errors are derived as a particular case of Catlin's static updating theorem. An empirical illustration uses Mexican macroeconomic data on five variables, and the restricted forecasts consider targets for the years 2011–2014. Copyright © 2013 John Wiley & Sons, Ltd.
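In the linear case, this kind of restricted-forecast update reduces to the standard minimum-MSE conditioning formula, which we use here as a stand-in for the paper's application of Catlin's theorem. A sketch, assuming the unrestricted forecasts and their MSE matrix are already in hand (the numbers are toy values, not the Mexican data):

```python
import numpy as np

def restricted_forecast(y_hat, Sigma, C, r):
    """Update an unrestricted forecast y_hat (stacked over the horizon,
    with MSE matrix Sigma) so that the linear restrictions C @ y = r
    hold exactly; also return the reduced MSE matrix."""
    CS = C @ Sigma
    gain = Sigma @ C.T @ np.linalg.inv(CS @ C.T)
    y_star = y_hat + gain @ (r - C @ y_hat)
    Sigma_star = Sigma - gain @ CS
    return y_star, Sigma_star

# Toy example: force the sum of two forecasts to equal a target of 5.
y_hat = np.array([2.0, 2.4])
Sigma = np.array([[1.0, 0.3], [0.3, 1.5]])
C, r = np.array([[1.0, 1.0]]), np.array([5.0])
y_star, Sigma_star = restricted_forecast(y_hat, Sigma, C, r)
print(y_star, C @ y_star)   # the restriction holds exactly
```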

5.
It is widely acknowledged that the patient's perspective should be considered when making decisions about how her care will be managed. Patient participation in the decision making process may play an important role in bringing to light and incorporating her perspective. The GRADE framework is touted as an evidence-based process for determining recommendations for clinical practice; i.e., determining how care ought to be managed. GRADE recommendations are categorized as “strong” or “weak” based on several factors, including the “values and preferences” of a “typical” patient. The strength of the recommendation also provides instruction to the clinician about when and how patients should participate in the clinical encounter, and thus whether an individual patient's values and preferences will be heard in her clinical encounter. That is, a “strong” recommendation encourages “paternalism” and a “weak” recommendation encourages shared decision making. We argue that adoption of the GRADE framework is problematic for patient participation and may result in care that is not respectful of the individual patient's values and preferences. We argue that the root of the problem is the conception of “values and preferences” in GRADE – the framework favours population thinking (e.g. “typical” patient “values and preferences”), despite the fact that “values and preferences” are individual in the sense that they are deeply personal. We also show that tying the strength of a recommendation to a model of decision making (paternalism or shared decision making) constrains patient participation and is not justified (theoretically and/or empirically) in the GRADE literature.

6.
This paper proposes a new approach to forecasting intermittent demand by considering the effects of external factors. We classify intermittent demand data into two parts—zero value and nonzero value—and fit the nonzero values with a mixed zero-truncated Poisson model. All the parameters in this model are obtained by an EM algorithm, which treats external factors as independent variables of a logistic regression model and a log-linear regression model. We then calculate the probability that a zero value occurs at each period and predict demand occurrence by comparing it with a critical value. When demand occurs, we use the weighted average of the mixed zero-truncated Poisson model as the predicted nonzero demand, which is combined with the predicted demand occurrences to form the final forecast series. Two performance measures are developed to assess the forecasting methods. In a case study of electric power material from the State Grid Shanghai Electric Power Company in China, we show that our approach provides greater forecasting accuracy than the Poisson model, the hurdle shifted Poisson model, the hurdle Poisson model, and Croston's method.
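A simplified two-stage sketch of the occurrence/size decomposition: a logistic model for whether demand occurs, and a Poisson regression for the nonzero part. A single log-linear Poisson stands in for the paper's EM-fitted mixture of zero-truncated Poissons, and the data are simulated:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, PoissonRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))                                # external factors
occ = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + X[:, 0]))))    # does demand occur?
lam = np.exp(0.8 + 0.4 * X[:, 1])
size = np.where(occ == 1, rng.poisson(lam) + 1, 0)           # nonzero sizes, kept > 0

# Stage 1: logistic regression for the probability that demand occurs.
occ_model = LogisticRegression().fit(X, occ)
# Stage 2: Poisson regression on the nonzero part. This single log-linear
# Poisson is a stand-in for the paper's EM-fitted *mixture* of
# zero-truncated Poissons; the two-stage structure is the same.
pos = size > 0
size_model = PoissonRegressor().fit(X[pos], size[pos])

X_new = rng.normal(size=(5, 2))
p_occ = occ_model.predict_proba(X_new)[:, 1]
forecast = np.where(p_occ > 0.5, size_model.predict(X_new), 0.0)
print(np.round(forecast, 2))
```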

7.
James McAllister’s 2003 article, ‘Algorithmic randomness in empirical data’, claims that empirical data sets are algorithmically random, and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and discuss the matter in the framework of Minimum Message Length (MML) inference.
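The compressibility claim can be checked crudely: an empirical series recorded at finite precision typically compresses well below its raw encoding, i.e., it is not algorithmically random. The sine-plus-noise series below is illustrative, not one of the paper's examples:

```python
import zlib
import numpy as np

# Structured signal plus measurement noise, rounded to 3 decimals as a
# stand-in for finite-precision empirical data.
rng = np.random.default_rng(7)
t = np.arange(2000)
data = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)
blob = ",".join(np.round(data, 3).astype(str)).encode()
print(len(blob), "->", len(zlib.compress(blob, 9)), "bytes")  # clearly compressible
```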

8.
A new multivariate stochastic volatility model is developed in this paper. The main feature of this model is to allow threshold asymmetry in a factor covariance structure. The new model provides a parsimonious characterization of volatility and correlation asymmetry in response to market news. Statistical inferences are drawn from Markov chain Monte Carlo methods. We introduce news impact analysis to analyze volatility asymmetry with a factor structure. This analysis helps us to study different responses of volatility to historical market information in a multivariate volatility framework. Our model is successful when applied to an extensive empirical study of twenty stocks. Copyright © 2009 John Wiley & Sons, Ltd.
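To make the threshold asymmetry concrete, here is a simulation sketch of a one-factor stochastic volatility process whose volatility level shifts up after negative factor returns; the specification and all parameter values are illustrative stand-ins for the paper's model, not its estimates:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n = 1000, 3
loadings = np.array([1.0, 0.8, 0.6])       # factor loadings (illustrative)
f = np.zeros(T)                            # common factor
H = np.zeros(T)                            # factor log-volatility path
h = 0.0
for t in range(1, T):
    # Threshold asymmetry: mean log-volatility shifts up after bad news
    # (a negative factor return), an illustrative stand-in for the
    # paper's threshold factor-SV specification.
    mu = -0.2 if f[t - 1] >= 0 else 0.2
    h = mu + 0.95 * (h - mu) + 0.2 * rng.normal()
    H[t] = h
    f[t] = np.exp(h / 2) * rng.normal()
returns = np.outer(f, loadings) + 0.1 * rng.normal(size=(T, n))

# News impact flavor: average next-period variance after good vs bad news.
pos, neg = f[:-1] > 0, f[:-1] < 0
print("E[exp(h) | good news] =", np.exp(H[1:][pos]).mean())
print("E[exp(h) | bad news]  =", np.exp(H[1:][neg]).mean())
```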

9.
Existing scholarship on animal models tends to foreground either of the two major roles research organisms play in different epistemic contexts, treating their representational and instrumental roles separately. Based on an empirical case study, this article explores the changing relationship between the two epistemic roles of a research organism over the span of a decade, while the organism was used to achieve various knowledge ends. This rat model was originally intended as a replica of human susceptibility to cardiac arrest. In a fortunate stroke of serendipity, however, the experimenters detected the way mother-infant interactions regulated the pups’ resting cardiac rate. This intriguing outcome thus became the model’s new representational target and began driving the development of an experimental system. Henceforth, the model acquired an instrumental function, serving to detect and measure system-specific differences. Its subsequent development involved creating stimulus-response measures to explain and theorize those differences. It was this instrumental use of the model that pushed the experimenters into uncharted territory and conferred on the model an ability to adapt to varied epistemic contexts. Despite the prominence of this instrumental role, however, the model’s representational power continued to guide research. The model’s representational target was widened beyond heart rate to reflect other functional phenomena, such as behavioral activity and sleep/wake rhythm. The rat model was thus transformed from an experimental organism designed to instantiate cardiac regulation to a model organism taken to represent the development of a whole, intact animal under the regulatory influence of maternal care. This article examines this multifaceted transformation within the context of the salient shifts in modeling practice and variations in the model’s representational power. It thus explores how the relationship between the representational and instrumental uses of the model changed with respect to the varying exigencies of the investigative context, foregrounding its contextual versatility.

10.
The leverage effect—the correlation between an asset's return and its volatility—has played a key role in forecasting and understanding volatility and risk. While it is a long-standing consensus that leverage effects exist and improve forecasts, empirical evidence puzzlingly does not show that this effect exists for many individual stocks, mischaracterizing risk and therefore leading to poor predictive performance. We examine this puzzle, with the goal to improve density forecasts, by relaxing the assumption of linearity of the leverage effect. Nonlinear generalizations of the leverage effect are proposed within the Bayesian stochastic volatility framework in order to capture flexible leverage structures. Efficient Bayesian sequential computation is developed and implemented to estimate this effect in a practical, online manner. Examining 615 stocks that comprise the S&P500 and Nikkei 225, we find that our proposed nonlinear leverage effect model improves predictive performances for 89% of all stocks compared to the conventional stochastic volatility model.
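A simulation sketch of one nonlinear leverage mechanism: today's return shock feeds into tomorrow's log-volatility through a bounded decreasing function, so the effect saturates for extreme shocks. The tanh form and parameter values are our assumptions, not the paper's family of generalizations:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 2000
h = np.zeros(T + 1)            # log-volatility
r = np.zeros(T)                # returns
for t in range(T):
    eps = rng.normal()
    r[t] = np.exp(h[t] / 2) * eps
    # Nonlinear leverage: the volatility shock's mean is a bounded,
    # decreasing function of the return shock, so negative returns push
    # volatility up but the response flattens for extreme shocks.
    g = -0.4 * np.tanh(eps)
    eta = g + np.sqrt(1.0 - 0.4**2) * rng.normal()
    h[t + 1] = -0.1 + 0.97 * h[t] + 0.25 * eta
print("corr(r_t, h_{t+1}) =", round(np.corrcoef(r, h[1:])[0, 1], 3))  # negative
```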

11.
This paper aims at closing a gap in recent Weyl research by investigating the role played by Leibniz for the development and consolidation of Weyl's notion of theoretical (symbolic) construction. For Weyl, just as for Leibniz, mathematics was not simply an accompanying tool when doing physics—for him it meant the ability to engage in well-guided speculations about a general framework of reality and experience. The paper first introduces some of the background of Weyl's notion of theoretical construction and then discusses particular Leibnizian inheritances in Weyl's ‘Philosophie der Mathematik und Naturwissenschaft’, such as the general appreciation of the principles of sufficient reason and of continuity. Afterwards the paper focuses on three themes: first, Leibniz's primary quality phenomenalism, which according to Weyl marked the decisive step in realizing that physical qualities are never apprehended directly; second, the conceptual relation between continuity and freedom; and third, Leibniz's notion of ‘expression’, which allows for a certain type of (surrogative) reasoning by structural analogy and which gave rise to Weyl's optimism regarding the scope of theoretical construction.

12.
It is well understood that the standard formulation for the variance of a regression-model forecast produces interval estimates that are too narrow, principally because it ignores regressor forecast error. While the theoretical problem has been addressed, there has not been an adequate explanation of the effect of regressor forecast error, and the empirical literature has supplied a disparate variety of bits and pieces of evidence. Most business-forecasting software programs continue to supply only the standard formulation. This paper extends existing analysis to derive and evaluate large-sample approximations for the forecast error variance in a single-equation regression model. We show how these approximations substantially clarify the expected effects of regressor forecast error. We then present a case study, which (a) demonstrates how rolling out-of-sample evaluations can be applied to obtain empirical estimates of the forecast error variance, (b) shows that these estimates are consistent with our large-sample approximations and (c) illustrates, for ‘typical’ data, how seriously the standard formulation can understate the forecast error variance. Copyright © 2000 John Wiley & Sons, Ltd.
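The understatement is easy to reproduce by Monte Carlo: compare the standard interval formula with the actual variance of forecast errors when the regressor must itself be forecast. The AR(1) regressor and all parameter values below are illustrative, not the paper's case-study data:

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 200, 2000
var_formula, fc_errors = [], []
for _ in range(reps):
    x = np.zeros(n + 1)
    for t in range(n):
        x[t + 1] = 0.8 * x[t] + rng.normal()          # regressor follows an AR(1)
    y = 2.0 + 1.5 * x[:n] + rng.normal(size=n)        # estimation sample
    X = np.column_stack([np.ones(n), x[:n]])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)
    # The future regressor is unknown at forecast time and must itself be
    # forecast from a fitted AR(1); this error is exactly what the
    # standard variance formula ignores.
    phi = (x[1:n] @ x[:n - 1]) / (x[:n - 1] @ x[:n - 1])
    x0 = np.array([1.0, phi * x[n - 1]])
    var_formula.append(s2 * (1 + x0 @ np.linalg.inv(X.T @ X) @ x0))
    y_future = 2.0 + 1.5 * x[n] + rng.normal()
    fc_errors.append(y_future - x0 @ beta)
print("standard formula says:", np.mean(var_formula))
print("actual error variance:", np.var(fc_errors))    # noticeably larger
```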

13.
This paper explores how the boundaries of the UK's Animals (Scientific Procedures) Act (A(SP)A) are constituted, as illustrative of the rising importance of legal procedures around animal research and how these are continuously being challenged and questioned. Drawing on empirical work in animal research communities, we consider how it is decided whether activities are undertaken for an “experimental or other scientific purpose”. We do this by focusing on “edge cases”, where debates occur about whether to include an activity within A(SP)A's remit. We demonstrate that the boundaries of animal research regulation in the UK are products of past and present decisions, dependencies, and social relationships. Boundaries are therefore not clear-cut and fixed, but rather flexible and changing borderlands. We particularly highlight the roles of: historical precedent; the management of risk, workload, and cost; institutional and professional identities; and research design in constituting A(SP)A's edges. In doing so, we demonstrate the importance of paying attention to how, in practice, animal law requires a careful balance between adhering to legal paragraphs and allowing for discretion. This in turn has real-world implications for what and how science is done, who does it, and how animals are used in its service.

14.
Review     
In this paper we study some methodological problems associated with the development of one of the major theories in low temperature physics, that of superconductivity. The first experimental results of 1911 were interpreted within a framework that hindered the paradoxical aspects of the new phenomenon. Various research programmes degenerated until new experimental results forced a reappraisal of the existing theoretical framework, making possible a different formulation of the problem that had to be solved. This led to a progressive research programme, whose positive heuristic we also study.

15.
We propose a new class of limited information estimators built upon an explicit trade-off between data fitting and a priori model specification. The estimators offer the researcher a continuum of estimators that range from an extreme emphasis on data fitting and robust reduced-form estimation to the other extreme of exact model specification and efficient estimation. The approach used to generate the estimators illustrates why ULS often outperforms 2SLS-PRRF even in the context of a correctly specified model, provides a new interpretation of 2SLS, and integrates Wonnacott and Wonnacott's (1970) least weighted variance estimators with other techniques. We apply the new class of estimators to Klein's Model I and generate forecasts. We find for this example that an emphasis on specification (as opposed to data fitting) produces better out-of-sample predictions. Copyright © 1999 John Wiley & Sons, Ltd.
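The data-fitting/specification trade-off can be caricatured as a one-parameter path between OLS and 2SLS. This convex combination is only an illustration of the continuum idea; the paper's estimator class is derived differently:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
z = rng.normal(size=n)                       # instrument
u = rng.normal(size=n)                       # structural error
x = 0.7 * z + 0.6 * u + rng.normal(size=n)   # endogenous regressor
y = 1.0 * x + u                              # structural equation, true beta = 1

# OLS: pure data fitting (biased here because x is endogenous).
beta_ols = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
# 2SLS: pure reliance on the a priori specification (the instrument).
x_hat = z * (z @ x) / (z @ z)                # first-stage fitted values
beta_2sls = (x_hat @ y) / (x_hat @ x)

# A one-parameter continuum between pure data fitting (lam = 0) and pure
# structural specification (lam = 1), illustrating the trade-off.
for lam in (0.0, 0.5, 1.0):
    print(f"lam={lam}: beta = {(1 - lam) * beta_ols + lam * beta_2sls:.3f}")
```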

16.
The paper derives the scalar special case of the well-known BEKK multivariate GARCH model using a multivariate extension of the random coefficient autoregressive (RCA) model. This representation establishes the relevant structural and asymptotic properties of the scalar BEKK model using the theoretical results available in the literature for general multivariate GARCH. Sufficient conditions for the (direct) DCC model to be consistent with a scalar BEKK representation are established. Moreover, an indirect DCC model that is consistent with the scalar BEKK representation is obtained, and is compared with the direct DCC model using an empirical example. The paper shows, within an asset allocation and risk measurement framework, that the two models are similar in terms of providing parameter estimates and forecasting value-at-risk thresholds for equally weighted and minimum variance portfolios. Copyright © 2008 John Wiley & Sons, Ltd.
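For reference, the scalar BEKK recursion with covariance targeting is simple to filter, and its output feeds directly into the value-at-risk comparison the abstract mentions. The parameter values and data below are illustrative, not estimates:

```python
import numpy as np

def scalar_bekk_filter(returns, a=0.05, b=0.93):
    """Scalar BEKK recursion with covariance targeting:
    H_t = (1 - a - b) * S + a * r_{t-1} r_{t-1}' + b * H_{t-1},
    where S is the sample covariance. The scalars a and b are fixed
    illustrative values rather than estimates."""
    T, n = returns.shape
    S = np.cov(returns.T)
    H = np.empty((T, n, n))
    H[0] = S
    for t in range(1, T):
        r = returns[t - 1][:, None]
        H[t] = (1 - a - b) * S + a * (r @ r.T) + b * H[t - 1]
    return H

rng = np.random.default_rng(6)
rets = rng.normal(scale=0.01, size=(500, 2))
H = scalar_bekk_filter(rets)
# One-step-ahead 1% value-at-risk for an equally weighted portfolio.
w = np.array([0.5, 0.5])
sigma_p = np.sqrt(w @ H[-1] @ w)
print("VaR(1%) ~", 2.326 * sigma_p)
```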

17.
How do scientific innovations spread within and across scientific communities? In this paper, we propose a general account of the diffusion of scientific innovations. This account acknowledges that novel ideas must be elaborated on and conceptually translated before they can be adopted and applied to field-specific problems. We motivate our account by examining an exemplary case of knowledge diffusion, namely, the early spread of theories of rational decision-making. These theories were grounded in a set of novel mathematical tools and concepts that originated in John von Neumann and Oskar Morgenstern's Theory of Games and Economic Behavior (1944/1947) and subsequently spread widely across the social and behavioral sciences. Introducing a network-based diffusion measure, we trace the spread of those tools and concepts into distinct research areas. We furthermore present an analytically tractable typology for classifying publications according to their roles in the diffusion process. The proposed framework allows for a systematic examination of the conditions under which scientific innovations spread within and across preexisting and newly emerging scientific communities.
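A toy version of one possible network-based diffusion measure: for each year, count the distinct research areas that have taken up the innovation at least once. The citation records and field labels below are fabricated purely for illustration; the paper's measure is richer:

```python
from collections import OrderedDict

# Hypothetical (year, field) citation records of an originating work.
citations = [
    (1947, "economics"), (1950, "economics"), (1951, "statistics"),
    (1954, "psychology"), (1954, "statistics"), (1957, "political science"),
]

# Cumulative breadth of diffusion: number of distinct fields reached by year.
reached, diffusion = set(), OrderedDict()
for year, field in sorted(citations):
    reached.add(field)
    diffusion[year] = len(reached)
print(dict(diffusion))
```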

18.
We propose a simple and flexible framework for forecasting the joint density of asset returns. The multinormal distribution is augmented with a polynomial in (time-varying) non-central co-moments of assets. We estimate the coefficients of the polynomial via the method of moments for a carefully selected set of co-moments. In an extensive empirical study, we compare the proposed model with a range of other models widely used in the literature. Employing a recently proposed as well as standard techniques to evaluate multivariate forecasts, we conclude that the augmented joint density provides highly accurate forecasts of the ‘negative tail’ of the joint distribution. Copyright © 2010 John Wiley & Sons, Ltd.
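A Gram–Charlier-style stand-in for the polynomial-augmented multinormal: the Gaussian kernel is multiplied by third-moment (Hermite) adjustment terms, and the joint negative tail is integrated on a grid. The coefficients and the crude positivity fix are our assumptions, not the paper's specification:

```python
import numpy as np
from scipy.stats import multivariate_normal

Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
base = multivariate_normal(mean=[0, 0], cov=Sigma)

def augmented_pdf(x, skew=np.array([-0.3, -0.2])):
    """Gaussian kernel times (1 + skewness terms built from the Hermite
    polynomial H3(x) = x^3 - 3x); clipping at zero is a crude fix for the
    well-known positivity problem of such expansions."""
    x = np.atleast_2d(x)
    adj = 1 + (skew * (x**3 - 3 * x) / 6).sum(axis=1)
    return base.pdf(x) * np.clip(adj, 0, None)

# Probability mass in the joint 'negative tail' via grid integration.
g = np.linspace(-5, 0, 200)
X, Y = np.meshgrid(g, g)
pts = np.column_stack([X.ravel(), Y.ravel()])
cell = (g[1] - g[0]) ** 2
print("P(both returns < 0) ~", augmented_pdf(pts).sum() * cell)
```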

19.
We outline a framework for analyzing episodes from the history of science in which the application of mathematics plays a constitutive role in the conceptual development of empirical sciences. Our starting point is the inferential conception of the application of mathematics, recently advanced by Bueno and Colyvan (2011). We identify and discuss some systematic problems of this approach. We propose refinements of the inferential conception based on theoretical considerations and on the basis of a historical case study. We demonstrate the usefulness of the refined, dynamical inferential conception using the well-researched example of the genesis of general relativity. Specifically, we look at the collaboration of the physicist Einstein and the mathematician Grossmann in the years 1912–1913, which resulted in the jointly published “Outline of a Generalized Theory of Relativity and a Theory of Gravitation,” a precursor theory of the final theory of general relativity. In this episode, independently developed mathematical theories, the theory of differential invariants and the absolute differential calculus, were applied in the process of finding a relativistic theory of gravitation. The dynamical inferential conception not only provides a natural framework to describe and analyze this episode, but it also generates new questions and insights. We comment on the mathematical tradition on which Grossmann drew, and on his own contributions to mathematical theorizing. The dynamical inferential conception allows us to identify both the role of heuristics and of mathematical resources as well as the systematic role of problems and mistakes in the reconstruction of episodes of conceptual innovation and theory change.

20.
This essay examines the curious relationship between Charles Darwin and the palaeontologist William Boyd Dawkins (1837–1929). Dawkins was a beneficiary of Darwin's patronage and styled himself as a Darwinian to Darwin and the public, yet viciously attacked Darwin and his theory in anonymous reviews. This has confused historians who have misunderstood the exact nature of Dawkins's attitude towards evolution and his relationship to Darwin. The present study explains both the reasons for Dawkins's contradictory statements and his relationship with Darwin. I introduce Batesian mimicry as a conceptual framework to make sense of Dawkins's actions, suggesting that Dawkins mimicked a Darwinian persona in order to secure advancement in the world of Victorian science. Dawkins's pro-Darwinian stance, therefore, was a façade, an act of mimicry. I argue that Dawkins exploited Darwin for his patronage – which took the form of advice, support from Darwin's well-placed friends, and monetary assistance – while safely expressing his dissent from Darwinian orthodoxy in the form of anonymous reviews. This is, therefore, a case study in how scientific authority and power could be gained and maintained in Victorian science by professing allegiance to Darwin and Darwinism.
