Similar Articles
 20 similar articles found; search time: 109 ms
1.
To study climate change, scientists employ computer models, which approximate target systems with various levels of skill. Given the imperfection of climate models, how do scientists use simulations to generate knowledge about the causes of observed climate change? Addressing a similar question in the context of biological modelling, Levins (1966) proposed an account grounded in robustness analysis. Recent philosophical discussions dispute the confirmatory power of robustness, raising the question of how the results of computer modelling studies contribute to the body of evidence supporting hypotheses about climate change. Expanding on Staley’s (2004) distinction between evidential strength and security, and Lloyd’s (2015) argument connecting variety-of-evidence inferences and robustness analysis, I address this question with respect to recent challenges to the epistemology of robustness analysis. Applying this epistemology to case studies of climate change, I argue that, despite imperfections in climate models, and epistemic constraints on variety-of-evidence reasoning and robustness analysis, this framework accounts for the strength and security of evidence supporting climatological inferences, including the finding that global warming is occurring and that its primary causes are anthropogenic.

2.
Although the basic principles of exponential smoothing and discounted least squares are easily understood, the full power of the technique is only rarely exploited. The reason for this failure lies in the complexity of the standard procedures. Often they require fairly complex mathematical models and use a variety of cumbersome algebraic manipulations. An alternative formulation for exponential smoothing is presented. It simplifies these procedures and allows an easier use of the full range of models. This new formulation is obtained by considering the relationship between general exponential smoothing (GES) and the well-known ARMA process of Box and Jenkins. The three commonest seasonal models have only recently been considered for GES systems. They are discussed in some detail here. The computational requirements of the GES and equivalent ARMA procedures are reviewed and some recommendations for their application are made. The initialization of GES forecasting systems and the important problem of model selection are also discussed. A brief illustrative example is given.
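As a concrete anchor for the recursion the abstract refers to, here is a minimal sketch of simple exponential smoothing, the simplest member of the GES family. The series and smoothing constant are invented for illustration and are not taken from the paper:

```python
# Simple exponential smoothing: each one-step-ahead forecast is a weighted
# average of the latest observation and the previous forecast.
# Illustrative data and alpha only.

def exp_smooth(series, alpha):
    """Return one-step-ahead forecasts; the first forecast is the first value."""
    forecast = series[0]
    forecasts = [forecast]
    for y in series[1:]:
        forecast = alpha * y + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

data = [10.0, 12.0, 11.0, 13.0]
print(exp_smooth(data, 0.5))  # [10.0, 11.0, 11.0, 12.0]
```

The full GES machinery generalizes this scalar recursion to vectors of fitting functions (trend and seasonal terms), which is where the ARMA correspondence discussed in the paper becomes useful.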

3.
Microbial model systems have a long history of fruitful use in fields that include evolution and ecology. In order to develop further insight into modelling practice, we examine how the competitive exclusion and coexistence of competing species have been modelled mathematically and materially over the course of a long research history. In particular, we investigate how microbial models of these dynamics interact with mathematical or computational models of the same phenomena. Our cases illuminate the ways in which microbial systems and equations work as models, and what happens when they generate inconsistent findings about shared targets. We reveal an iterative strategy of comparative modelling in different media, and suggest reasons why microbial models have a special degree of epistemic tractability in multimodel inquiry.

4.
In this paper, we put dynamic stochastic general equilibrium (DSGE) forecasts in competition with factor forecasts. We focus on these two models since they represent nicely the two opposing forecasting philosophies. The DSGE model on the one hand has a strong theoretical economic background; the factor model on the other hand is mainly data‐driven. We show that incorporating a large information set using factor analysis can indeed improve the short‐horizon predictive ability, as claimed by many researchers. The micro‐founded DSGE model can provide reasonable forecasts for US inflation, especially with growing forecast horizons. To a certain extent, our results are consistent with the prevailing view that simple time series models should be used in short‐horizon forecasting and structural models should be used in long‐horizon forecasting. Our paper compares both state‐of‐the‐art data‐driven and theory‐based modelling in a rigorous manner. Copyright © 2008 John Wiley & Sons, Ltd.
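To make the data-driven side of the comparison concrete, here is a toy diffusion-index sketch: extract a single common factor from a small panel by power iteration, then forecast the target one step ahead by OLS on the lagged factor. The panel, target series and function names are all invented for illustration; this is not the paper's DSGE or factor specification:

```python
# Toy diffusion-index forecast: first principal component of a panel,
# then OLS of the target on the lagged factor. Data are made up.

def first_factor(panel):
    """Scores of the first principal component of a T x N panel."""
    t_len, n = len(panel), len(panel[0])
    w = [1.0] * n
    for _ in range(50):  # power iteration on X'X
        v = [sum(row[j] * w[j] for j in range(n)) for row in panel]
        w = [sum(panel[t][j] * v[t] for t in range(t_len)) for j in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    return [sum(row[j] * w[j] for j in range(n)) for row in panel]

def factor_forecast(y, f):
    """OLS of y[t] on f[t-1], then forecast y[T+1] from f[T]."""
    x, yy = f[:-1], y[1:]
    mx, my = sum(x) / len(x), sum(yy) / len(yy)
    b = sum((a - mx) * (c - my) for a, c in zip(x, yy)) / \
        sum((a - mx) ** 2 for a in x)
    return (my - b * mx) + b * f[-1]

panel = [[1.0, 0.5], [2.0, 1.0], [3.0, 1.5], [4.0, 2.0]]  # exact one-factor panel
target = [10.0, 11.0, 12.0, 13.0]
print(factor_forecast(target, first_factor(panel)))
```

In practice the panel would contain dozens or hundreds of macroeconomic indicators, which is exactly what the "large information set" argument in the abstract refers to.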

5.
This paper presents an application of the gene expression programming (GEP) and integrated genetic programming (GP) algorithms to the modelling of the ASE 20 Greek index. GEP and GP are robust evolutionary algorithms that evolve computer programs in the form of mathematical expressions, decision trees or logical expressions. The results indicate that GEP and GP produce significant trading performance when applied to ASE 20 and outperform well‐known existing methods. The trading performance of the derived models is further enhanced by applying a leverage filter. Copyright © 2014 John Wiley & Sons, Ltd.

6.
The use of linear error correction models based on stationarity and cointegration analysis, typically estimated with least squares regression, is a common technique for financial time series prediction. In this paper, the same formulation is extended to a nonlinear error correction model using the idea of a kernel‐based implicit nonlinear mapping to a high‐dimensional feature space in which linear model formulations are specified. Practical expressions for the nonlinear regression are obtained in terms of the positive definite kernel function by solving a linear system. The nonlinear least squares support vector machine model is designed within the Bayesian evidence framework that allows us to find appropriate trade‐offs between model complexity and in‐sample model accuracy. From straightforward primal–dual reasoning, the Bayesian framework allows us to derive error bars on the prediction in a similar way as for linear models and to perform hyperparameter and input selection. Starting from the results of the linear modelling analysis, the Bayesian kernel‐based prediction is successfully applied to out‐of‐sample prediction of an aggregated equity price index for the European chemical sector. Copyright © 2006 John Wiley & Sons, Ltd.
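The core mechanism, a nonlinear regression obtained by solving a linear system in the kernel, can be sketched in a few lines. This is a stripped-down least-squares SVM-style fit without the bias term, the error-correction structure, or the Bayesian level of the paper's framework; the kernel width, regularization value and data are invented:

```python
# LS-SVM-style kernel regression in miniature: solve (K + I/gamma) alpha = y
# in the dual, then predict with the kernel trick. Toy data only; the bias
# term of the full LS-SVM formulation is omitted for brevity.
import math

def rbf(u, v, s=1.0):
    return math.exp(-((u - v) ** 2) / (2 * s * s))

def solve(a, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            m[r] = [x - f * y for x, y in zip(m[r], m[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def lssvm_fit(xs, ys, gamma=10.0):
    n = len(xs)
    k = [[rbf(xs[i], xs[j]) + (1.0 / gamma if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    return solve(k, ys)

def lssvm_predict(xs, alpha, x):
    return sum(a * rbf(xi, x) for a, xi in zip(alpha, xs))

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
alpha = lssvm_fit(xs, ys)
print(lssvm_predict(xs, alpha, 1.0))
```

The regularization parameter gamma plays the role of the complexity/accuracy trade-off that, in the paper, is set within the Bayesian evidence framework rather than by hand.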

7.
The increasing penetration of wind power has resulted in larger shares of volatile sources of supply in power systems worldwide. In order to operate such systems efficiently, methods for reliable probabilistic forecasts of future wind power production are essential. It is well known that the conditional density of wind power production is highly dependent on the level of predicted wind power and prediction horizon. This paper describes a new approach for wind power forecasting based on logistic‐type stochastic differential equations (SDEs). The SDE formulation allows us to calculate both state‐dependent conditional uncertainties as well as correlation structures. Model estimation is performed by maximizing the likelihood of a multidimensional random vector while accounting for the correlation structure defined by the SDE formulation. We use non‐parametric modelling to explore conditional correlation structures and skewness of the predictive distributions as a function of explanatory variables. Copyright © 2015 John Wiley & Sons, Ltd.
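The general shape of a logistic-type SDE can be illustrated with a simple Euler–Maruyama simulation: the state stays in [0, 1] (normalized wind power) and the noise vanishes at the bounds, which is what produces the state-dependent uncertainty the abstract emphasizes. The drift and diffusion chosen here are a generic mean-reverting logistic form with invented parameters, not the paper's estimated model:

```python
# Euler–Maruyama simulation of a logistic-type SDE:
#   dX = theta*(mu - X) dt + sigma*X*(1 - X) dW
# Parameters are illustrative only.
import random

def simulate(x0, theta, mu, sigma, dt, n, seed=0):
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n):
        dw = rng.gauss(0.0, dt ** 0.5)          # Brownian increment
        x = x + theta * (mu - x) * dt + sigma * x * (1 - x) * dw
        x = min(max(x, 0.0), 1.0)               # numerical safeguard at the bounds
        path.append(x)
    return path

path = simulate(x0=0.2, theta=0.5, mu=0.6, sigma=0.8, dt=0.01, n=500)
```

Because the diffusion term x*(1 - x) shrinks near 0 and 1, simulated predictive densities are narrow at the extremes and wide mid-range, mirroring the state-dependent conditional uncertainty described above.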

8.
Summary: An exact analysis of the coordination of movements in arthropods and vertebrates leads to the rejection of older explanations in terms of reflex physiology and to a dynamic conception of the process in the central nervous system, which admits of representation by a physical model (as well as a mathematical formulation). This conception carries with it the implication that locomotion is caused by automatic elements that work in the rhythm of locomotion and are prior to the motor elements. This view stands in close relation to the physiology of the nervous system and to Gestalt psychology.

9.
I analyse the construction and transfer of models in complexity science. In doing so, I introduce a distinction between (i) vertical model construction, which is based on knowledge about a specific target system; (ii) horizontal model construction, which is based on the alteration of an existing model and therefore does not require any reference to a specific target system; and (iii) the transfer of models, which consists of the assignment of an existing model to a new target system. I argue that, in complexity science, all three of those modelling activities take place. Furthermore, I show that these activities can be divided into two general categories: (i) the creation of a repository of models without specific target systems, created by large-scale horizontal construction; and (ii) the transfer of these models to particular target systems in the natural sciences, which can also be followed by an extension of the transferred model through vertical construction of adaptations and additions to its dynamics. I then argue that this interplay of different modelling activities in complexity science provides a mechanism for the transfer of knowledge between different scientific fields. It is also crucial to the interdisciplinary nature of complexity science.

10.
11.
12.
13.
The notion of template has been advocated by Paul Humphreys and others as an illuminating unit of analysis in the philosophy of scientific modelling. Templates are supposed to have the dual functions of representing target systems and of facilitating quantitative manipulation. A resulting worry is that wide-ranging cross-disciplinary use of templates might compromise their representational function and reduce them to mere formalisms. In this paper, we argue that templates are valuable units of analysis in reconstructing cross-disciplinary modelling. Central to our discussion are the ways in which Lotka-Volterra (LV) models are used to analyse processes of technology diffusion. We illuminate both the similarities and differences between contributions to this case of cross-disciplinary modelling by reconstructing them as transfer of a template, without reducing the template to a mere formalism or a computational model. This requires differentiating the interpretation of templates from that of the models based on them. This differentiation allows us to claim that the LV models of technology diffusion that we review are the result of template transfer: conformist in some contributions, creative in others.
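The transferred template itself is the pair of coupled LV competition equations, here read with x and y as adoption shares of two competing technologies rather than population densities. A minimal Euler integration, with invented parameter values, shows the template at work:

```python
# Lotka–Volterra competition template, reinterpreted for technology
# diffusion: x and y are adoption shares of two rival technologies.
# Growth rates, capacities and interaction coefficients are illustrative.

def lv_step(x, y, dt, r1=1.0, r2=0.8, k1=1.0, k2=1.0, a12=0.5, a21=0.6):
    """One Euler step of dx/dt = r1*x*(1 - (x + a12*y)/k1), and symmetrically for y."""
    dx = r1 * x * (1 - (x + a12 * y) / k1)
    dy = r2 * y * (1 - (y + a21 * x) / k2)
    return x + dx * dt, y + dy * dt

x, y = 0.05, 0.05          # both technologies start from small adoption
for _ in range(2000):      # integrate to t = 20
    x, y = lv_step(x, y, 0.01)
print(round(x, 3), round(y, 3))
```

With a12*a21 < 1 the trajectory settles at stable coexistence; other parameter choices reproduce competitive exclusion, which is why the same template serves both the ecological and the diffusion literature.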

14.
C. F. Gauss’s computational work in number theory attracted renewed interest in the twentieth century due to, on the one hand, the edition of Gauss’s Werke and, on the other hand, the birth of the digital electronic computer. The involvement of the American mathematicians Derrick Henry Lehmer and Daniel Shanks with Gauss’s work is analysed, especially their continuation of work on topics such as arccotangents, factors of n² + a², and the composition of binary quadratic forms. In general, this strand in Gauss’s reception is part of a more general phenomenon, i.e. the influence of the computer on mathematics and one of its effects, the reappraisal of mathematical exploration. I would like to thank the Alexander-von-Humboldt-Stiftung for funding this research. For their comments I would like to thank Catherine Goldstein, Norbert Schappacher and especially John Brillhart.

15.
In this paper I take a sceptical view of the standard cosmological model and its variants, mainly on the following grounds: (i) The method of mathematical modelling that characterises modern natural philosophy—as opposed to Aristotle's—goes well with the analytic, piecemeal approach to physical phenomena adopted by Galileo, Newton and their followers, but it is hardly suited for application to the whole world. (ii) Einstein's first cosmological model (1917) was not prompted by the intimations of experience but by a desire to satisfy Mach's Principle. (iii) The standard cosmological model—a Friedmann–Lemaître–Robertson–Walker spacetime expanding with or without end from an initial singularity—is supported by the phenomena of redshifted light from distant sources and very nearly isotropic thermal background radiation, provided that two mutually inconsistent physical theories are jointly brought to bear on these phenomena, viz. the quantum theory of elementary particles and Einstein's theory of gravity. (iv) While the former is certainly corroborated by high-energy experiments conducted under conditions allegedly similar to those prevailing in the early world, precise tests of the latter involve applications of the Schwarzschild solution or the PPN formalism, for which there is no room in a Friedmann–Lemaître–Robertson–Walker spacetime.

16.
The paper presents a unified, fully recursive approach to the modelling and forecasting of non-stationary time series. The basic time-series model, which is based on the well-known ‘component’ or ‘structural’ form, is formulated in state-space terms. A novel spectral decomposition procedure, based on the exploitation of recursive smoothing algorithms, is then utilized to simplify the procedures of model identification and estimation. Finally, the fully recursive formulation allows for conventional or self-adaptive implementation of state-space forecasting and seasonal adjustment. Although the paper is restricted to the consideration of univariate time series, the basic approach can be extended to handle explanatory variables or full multivariable (vector) series.
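The simplest instance of a structural model in state-space form is the local-level model, filtered by the standard Kalman recursions; this sketch shows the fully recursive flavour of the approach, with invented variances and data rather than anything estimated in the paper:

```python
# Kalman filter for the local-level model:
#   y_t = m_t + e_t (obs. noise var r),  m_t = m_{t-1} + w_t (state noise var q)
# Variances and observations are illustrative only.

def kalman_local_level(ys, q, r, m0=0.0, p0=1e6):
    """Return the filtered state means; p0 large gives a diffuse prior."""
    m, p = m0, p0
    out = []
    for y in ys:
        p = p + q                 # predict: state variance grows by q
        k = p / (p + r)           # Kalman gain
        m = m + k * (y - m)       # update with the new observation
        p = (1 - k) * p
        out.append(m)
    return out

levels = kalman_local_level([10.0, 10.5, 11.0, 10.8], q=0.1, r=0.5)
```

Each observation is absorbed by a constant-time update, so forecasts and seasonal adjustments can be produced online, which is the practical point of the recursive formulation described above.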

17.
This paper estimates a forecasting equation for the hourly peak electricity demand one day in the future. The models incorporate deterministic influences such as holidays; stochastic influences such as average loads, captured by building bivariate models; and exogenous influences such as the weather, which is given a careful non-linear formulation. Out-of-sample comparisons are made using an additional year of data.
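A toy equation combining the three influence types the abstract names (deterministic, stochastic, exogenous) might look as follows; the functional form and every coefficient are invented for illustration, not estimated from the paper's data:

```python
# Illustrative day-ahead peak-demand equation: persistence in load
# (stochastic), a non-linear cooling-degree temperature term (exogenous),
# and a holiday dummy (deterministic). All coefficients are made up.

def forecast_peak(load_yesterday, temp, is_holiday):
    base = 0.8 * load_yesterday                   # stochastic: load persistence
    weather = 0.05 * max(temp - 18.0, 0.0) ** 2   # exogenous: cooling demand
    holiday = -5.0 if is_holiday else 0.0         # deterministic: holiday effect
    return base + weather + holiday

print(forecast_peak(100.0, 30.0, False))  # 0.8*100 + 0.05*12**2 = 87.2
```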

18.
Seasonal adjustment is performed in some data-producing agencies according to the ARIMA-model-based signal extraction theory. A stochastic linear process parametrized in terms of an ARIMA model is first fitted to the series, and from this model the models for the trend, cycle, seasonal, and irregular components can be derived. A spectrum is associated with every component model and is used to compute the optimal Wiener–Kolmogorov (WK) filter. Since the modelling is linear, prior linearization of the series with intervention techniques is performed. This paper discusses the performance of linear signal extraction with intervention techniques in non-linear processes. In particular, the following issues are discussed: (1) the ability of intervention techniques to linearize time series which present non-linearities; (2) the stability of the linear projection giving the components estimators under non-linear misspecifications; (3) the capacity of the WK filter to preserve the linearity in some components and the non-linearities in others. Copyright © 1998 John Wiley & Sons, Ltd.

19.
20.
This work compares two classes of multiple time series models which have been developed in past decades and are usually believed to be equivalent: the vector ARMA model and the system of simultaneous transfer functions (STF). The first part analyzes the mathematical structure of the two schemes and their properties of stability, structural identification and realization. In the second part, algorithms for order identification and parameter estimation are derived, following the approach of stochastic approximation. The proposed solutions are easily implementable in standard statistical software, and their performance is checked in an extended empirical example. The superiority of the STF model is well established.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)