Similar Articles
20 similar articles found (search time: 609 ms)
1.
Philosophers continue to debate both the actual and the ideal roles of values in science. Recently, Eric Winsberg has offered a novel, model-based challenge to those who argue that the internal workings of science can and should be kept free from the influence of social values. He contends that model-based assignments of probability to hypotheses about future climate change are unavoidably influenced by social values. I raise two objections to Winsberg’s argument, neither of which can wholly undermine its conclusion but each of which suggests that his argument exaggerates the influence of social values on estimates of uncertainty in climate prediction. I then show how a more traditional challenge to the value-free ideal seems tailor-made for the climate context.

2.
Non-epistemic values pervade climate modelling, as is now well documented and widely discussed in the philosophy of climate science. Recently, Parker and Winsberg have drawn attention to what can be termed “epistemic inequality”: this is the risk that climate models might more accurately represent the future climates of the geographical regions prioritised by the values of the modellers. In this paper, we promote value management as a way of overcoming epistemic inequality. We argue that value management can be seriously considered as soon as the value-free ideal and inductive risk arguments commonly used to frame the discussions of value influence in climate science are replaced by alternative social accounts of objectivity. We consider objectivity in Longino's sense as well as strong objectivity in Harding's sense to be relevant options here, because they offer concrete proposals that can guide scientific practice in evaluating and designing so-called multi-model ensembles and, ultimately, improve their capacity to quantify and express uncertainty in climate projections.

3.
While no one denies that science depends on epistemic values, many philosophers of science have wrestled with the appropriate role of non-epistemic values, such as social, ethical, and political values. Recently, philosophers of science have overwhelmingly accepted that non-epistemic values should play a legitimate role in science. The recent philosophical debate has shifted from the value-free ideal in science to questions about how science should incorporate non-epistemic values. This article engages with such questions through an exploration of the environmental sciences. These sciences are a mosaic of diverse fields characterized by interdisciplinarity, problem-orientation, policy-directedness, and ubiquitous non-epistemic values. This article addresses a frequently voiced concern about many environmental science practices: that they ‘crowd out’ or displace significant non-epistemic values by either (1) entailing some non-epistemic values, rather than others, or by (2) obscuring discussion of non-epistemic values altogether. With three detailed case studies – monetizing nature, nature-society dualism, and ecosystem health – we show that the alleged problem of crowding out emerges from active debates within the environmental sciences. In each case, critics charge that the scientific practice in question displaces non-epistemic values in at least one of the two senses distinguished above. We show that crowding out is neither necessary nor always harmful when it occurs. However, we do see these putative objections to the application of environmental science as teaching valuable lessons about what matters for successful environmental science, all things considered. 
Given the significant role that many environmental scientists see for non-epistemic values in their fields, we argue that these cases motivate lessons about the importance of value-flexibility (that practices can accommodate a plurality of non-epistemic values), transparency about value-based decisions that inform practice, and environmental pragmatism.

4.
Simulation-based weather and climate prediction now involves the use of methods that reflect a deep concern with uncertainty. These methods, known as ensemble prediction methods, produce multiple simulations for predictive periods of interest, using different initial conditions, parameter values and/or model structures. This paper provides a non-technical overview of current ensemble methods and considers how the results of studies employing these methods should be interpreted, paying special attention to probabilistic interpretations. A key conclusion is that, while complicated inductive arguments might be given for the trustworthiness of probabilistic weather forecasts obtained from ensemble studies, analogous arguments are out of reach in the case of long-term climate prediction. In light of this, the paper considers how predictive uncertainty should be conveyed to decision makers.
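The initial-condition ensemble idea described in this abstract can be illustrated with a toy sketch, not any real climate model: a logistic map stands in for a nonlinear dynamical system, the initial condition is perturbed many times, and the fraction of members exceeding a threshold is read off as a naive probabilistic forecast (all names and parameter values here are hypothetical illustrations):

```python
import random

def toy_model(x0, r=3.7, steps=50):
    """Iterate a logistic map as a stand-in for a nonlinear prediction model."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

def ensemble_forecast(x0, n_members=200, perturbation=1e-3, threshold=0.5):
    """Perturb the initial condition, run each ensemble member, and return
    the fraction of members ending above `threshold` as a naive probability."""
    rng = random.Random(42)
    members = [
        toy_model(min(max(x0 + rng.uniform(-perturbation, perturbation), 0.0), 1.0))
        for _ in range(n_members)
    ]
    return sum(m > threshold for m in members) / n_members

p = ensemble_forecast(0.2)
print(f"P(state > 0.5) over the ensemble: {p:.2f}")
```

The paper's point survives the sketch: nothing in the construction itself licenses reading the member fraction as a trustworthy probability; that requires a further inductive argument about the model and the perturbation scheme.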

5.
In previous work, I examined inferential methods employed in Probabilistic Weather Event Attribution studies (PEAs) and explored various ways they can be used to aid climate policy decisions and decision-making about climate justice issues. This paper evaluates the limitations of PEAs and considers how PEA researchers’ attributions of “liability” to specific countries for specific extreme weather events could be made more ethical. In sum, I show that it is routinely presupposed that PEA methods are not prone to inductive risks and, consequently, that PEA researchers bear no epistemic consequences or responsibilities for their attributions of liability. I argue that although PEAs are crucially useful for practical decision-making, the attributions of liability made by PEA researchers are in fact prone to inductive risks and are influenced by non-epistemic values that PEA researchers should make transparent in order to make such studies more ethical. Finally, I outline possible normative approaches for making sciences, including PEAs, more ethical, and discuss the implications of my arguments for the ongoing debate about how PEAs should guide climate policy and relevant legal decisions.

6.
Recent literature in the scientific realism debate has been concerned with a particular species of statistical fallacy concerning base-rates, and the worry that no matter how predictively successful our contemporary scientific theories may be, this will tell us absolutely nothing about the likelihood of their truth if our overall sample space contains enough empirically adequate theories that are nevertheless false. In response, both realists and anti-realists have switched their focus from general arguments concerning the reliability and historical track-records of our scientific methodology, to a series of specific arguments and case-studies concerning our reasons to believe individual scientific theories to be true. Such a development however sits in tension with the usual understanding of the scientific realism debate as offering a second-order assessment of our first-order scientific practices, and threatens to undermine the possibility of a distinctive philosophical debate over the approximate truth of our scientific theories. I illustrate this concern with three recent attempts to offer a more localised understanding of the scientific realism debate—due to Stathis Psillos, Juha Saatsi, and Kyle Stanford—and argue that none of these alternatives offer a satisfactory response to the problem.

7.
Climate scientists have been engaged in a decades-long debate over the standing of satellite measurements of the temperature trends of the atmosphere above the surface of the earth. This is especially significant because skeptics of global warming and the greenhouse effect have utilized this debate to spread doubt about global climate models used to predict future states of climate. I use this case from an understudied science to illustrate two distinct philosophical approaches to the relations among data, scientist, measurement, models, and theory. I argue that distinguishing between ‘direct’ empiricist and ‘complex’ empiricist approaches helps us understand and analyze this important scientific episode. I also introduce a complex empiricist account of testing and evaluation, and contrast it with the basic Hypothetico-Deductive approach to the climate models used by the direct empiricists. This more developed complex empiricist approach will serve philosophy of science well, as computational models become more widespread in the sciences.

8.
Although both direct multi‐step‐ahead forecasting and iterated one‐step‐ahead forecasting are two popular methods for predicting future values of a time series, it is not clear that the direct method is superior in practice, even though from a theoretical perspective it has lower mean squared error (MSE). A given model can be fitted according to either a multi‐step or a one‐step forecast error criterion, and we show here that discrepancies in performance between direct and iterative forecasting arise chiefly from the method of fitting, and are dictated by the nuances of the model's misspecification. We derive new formulas for quantifying iterative forecast MSE, and present a new approach for assessing asymptotic forecast MSE. Finally, the direct and iterative methods are compared on a retail series, which illustrates the strengths and weaknesses of each approach. Copyright © 2015 John Wiley & Sons, Ltd.
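The direct-versus-iterated contrast in this abstract can be sketched with a deliberately misspecified model, not the authors' formulas or data: an AR(1) is fitted to data generated by an AR(2), and the two-step forecast is produced either by iterating the one-step fit or by regressing directly on the two-step-lagged value (a minimal simulation, all parameter values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process; any AR(1) fitted to it is misspecified.
n = 5000
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.normal()

h = 2  # forecast horizon

# Iterated scheme: fit a one-step AR(1) coefficient, then apply it h times.
phi1 = np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])
iter_pred = (phi1 ** h) * y[:-h]

# Direct scheme: regress y[t+h] directly on y[t] (an h-step error criterion).
phi_h = np.dot(y[h:], y[:-h]) / np.dot(y[:-h], y[:-h])
direct_pred = phi_h * y[:-h]

mse_iter = np.mean((y[h:] - iter_pred) ** 2)
mse_direct = np.mean((y[h:] - direct_pred) ** 2)
print(f"iterated MSE: {mse_iter:.3f}, direct MSE: {mse_direct:.3f}")
```

Because both predictors are scalar multiples of the same regressor and the direct coefficient is the least-squares minimizer of the two-step error, the direct in-sample MSE cannot exceed the iterated one here; the abstract's point is that in practice the ranking depends on how the model is fitted and how it is misspecified.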

9.
The last decade and a half has seen an ardent development of self-organised criticality (SOC), a new approach to complex systems, which has become important in many domains of natural as well as social science, such as geology, biology, astronomy, and economics, to mention just a few. This has led many to adopt a generalist stance towards SOC, which is now repeatedly claimed to be a universal theory of complex behaviour. The aim of this paper is twofold. First, I provide a brief and non-technical introduction to SOC. Second, I critically discuss the various bold claims that have been made in connection with it. Throughout, I will adopt a rather sober attitude and argue that some people have been too readily carried away by fancy contentions. My overall conclusion will be that none of these bold claims can be maintained. Nevertheless, stripped of exaggerated expectations and daring assertions, many SOC models are interesting vehicles for promising scientific research.
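The canonical SOC model gestured at in this abstract is the Bak-Tang-Wiesenfeld sandpile; a minimal sketch of it (grid size, grain count, and seed are arbitrary choices, not anything from the paper) drops grains at random sites and topples any site holding four or more, recording the size of each resulting avalanche:

```python
import random

def sandpile(size=20, grains=2000, seed=1):
    """Minimal 2-D Bak-Tang-Wiesenfeld sandpile: drop grains at random
    sites; a site holding 4+ grains topples, sending one grain to each
    neighbour (grains fall off the open boundary). Returns the list of
    avalanche sizes (topplings triggered by each dropped grain)."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    avalanches = []
    for _ in range(grains):
        i, j = rng.randrange(size), rng.randrange(size)
        grid[i][j] += 1
        topples = 0
        unstable = [(i, j)] if grid[i][j] >= 4 else []
        while unstable:
            x, y = unstable.pop()
            if grid[x][y] < 4:
                continue
            grid[x][y] -= 4
            topples += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= 4:
                        unstable.append((nx, ny))
        avalanches.append(topples)
    return avalanches

sizes = sandpile()
```

The model drives itself to a critical state without parameter tuning, producing many small avalanches and occasional large ones; it is this power-law-like behaviour that motivates the universality claims the paper scrutinises.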

10.
I bring out the limitations of four important views of what the target of useful climate model assessment is. Three of these views are drawn from philosophy. They include the views of Elisabeth Lloyd and Wendy Parker, and an application of Bayesian confirmation theory. The fourth view I criticise is based on the actual practice of climate model assessment. In bringing out the limitations of these four views, I argue that an approach to climate model assessment that neither demands too much of such assessment nor threatens to be unreliable will, in typical cases, have to aim at something other than the confirmation of claims about how the climate system actually is. This means, I suggest, that the Intergovernmental Panel on Climate Change’s (IPCC's) focus on establishing confidence in climate model explanations and predictions is misguided. So too, it means that standard epistemologies of science with pretensions to generality, e.g., Bayesian epistemologies, fail to illuminate the assessment of climate models. I go on to outline a view that neither demands too much nor threatens to be unreliable, a view according to which useful climate model assessment typically aims to show that certain climatic scenarios are real possibilities and, when the scenarios are determined to be real possibilities, partially to determine how remote they are.

11.
Scientism applies the ideas and methods of the natural sciences to the humanities and social sciences. Herbert Spencer applied the law of the conservation of energy to social questions and arrived at formula answers to the issues of the day. The kind of certitude that Spencer aimed for was possible only by ignoring a system of values. Much as he may have believed that he was above personal beliefs, there are values implicit in Spencer's theories and they are the values of the nineteenth-century British middle class. Reasoning by analogy is as valid in social theory as it is in the natural sciences. Spencer's error was in universally applying the idea of the conservation of energy to social systems by means of identity rather than by analogy. Scientists in Britain, where there was a self-assured scientific community, dismissed Spencer's theories as being unscientific, but he enjoyed a vogue in the United States.

12.
The present paper draws on climate science and the philosophy of science in order to evaluate climate-model-based approaches to assessing climate projections. We analyze the difficulties that arise in such assessment and outline criteria of adequacy for approaches to it. In addition, we offer a critical overview of the approaches used in the IPCC Working Group I Fourth Assessment Report, including the confidence-building, Bayesian, and likelihood approaches. Finally, we consider approaches that do not feature in the IPCC reports, including three approaches drawn from the philosophy of science. We find that all available approaches face substantial challenges, with IPCC approaches having as a primary source of difficulty their goal of providing probabilistic assessments.

13.
Scientific realism driven by inference to the best explanation (IBE) takes empirically confirmed objects to exist, independent, pace empiricism, of whether those objects are observable or not. This kind of realism, it has been claimed, does not need probabilistic reasoning to justify the claim that these objects exist. But I show that there are scientific contexts in which a non-probabilistic IBE-driven realism leads to a puzzle. Since IBE can be applied in scientific contexts in which empirical confirmation has not yet been reached, realists will in these contexts be committed to the existence of empirically unconfirmed objects. As a consequence of such commitments, because they lack probabilistic features, the possible empirical confirmation of those objects is epistemically redundant with respect to realism.

14.
Much discussion was inspired by the publication of Harvey Brown's book Physical Relativity and the so-called dynamical approach to Special Relativity advocated there. At the center of the debate is the question about the nature of the relation between spacetime and laws or, more specifically, between spacetime symmetries and the symmetries of laws. Originally, the relation was mainly assumed to be explanatory and the dispute expressed in terms of the arrow of explanation – whether it goes from spacetime (symmetries) to (symmetries of) laws or vice versa. Not everybody agreed with a framing that leaves ontology out. In a recent turn, the relation has been claimed to be analytical or definitional. In this paper I critically examine this claim and propose a way to understand the relation between spacetime symmetries and symmetries of laws generally as deriving from constitutive principles.

15.
Coping with recent heritage is troublesome for history of science museums, since modern scientific artefacts often suffer from a lack of esthetic and artistic qualities and expressiveness. The traditional object-oriented approach, in which museums collect and present objects as individual showpieces is inadequate to bring recent heritage to life. This paper argues that recent artefacts should be regarded as “key pieces.” In this approach the object derives its meaning not from its intrinsic qualities but from its place in an important historical event or development. The “key pieces” approach involves a more organic way of collecting and displaying, focussing less on the individual object and more on the context in which it functioned and its place in the storyline. Finally, I argue that the “key pieces” approach should not be limited to recent heritage. Using this method as a general guiding principle could be a way for history of science museums to appeal to today’s audiences.

16.
Hume’s Theorem     
A common criticism of Hume’s famous anti-induction argument is that it is vitiated because it fails to foreclose the possibility of an authentically probabilistic justification of induction. I argue that this claim is false, and that on the contrary, the probability calculus itself, in the form of an elementary consequence that I call Hume’s Theorem, fully endorses Hume’s argument. Various objections, including the often-made claim that Hume is defeated by de Finetti’s exchangeability results, are considered and rejected.

17.
In 1820, J. Pelletier and J.-B. Caventou, two French pharmacist-chemists working at the Ecole de Pharmacie of Paris, extracted quinine, a new substance, from cinchona bark. We use this example to illustrate the processes which lead from a crude natural product through the isolation of an active principle to the production of a pure manufactured drug. This allows us to discuss the development of chemical analysis in relation to pharmacy, natural history, medicine and the early pharmaceutical industry. The dynamics of the disciplines involved here show how organic chemistry, which was developing rapidly during these crucial years, expanded and became autonomous. Theoretical aspects (and in particular atomic theory) and practical innovations are relevant to the scientific methods developed by the first generation of those who integrated the new chemistry into their daily work. Beyond these historical issues, this paper aims to show how a holistic approach can contribute to the debate on discovery and invention in a science that is often considered empirical.

18.
At some point during the 1950s, mainstream American philosophy of science began increasingly to avoid questions about the role of non-cognitive values in science and, accordingly, increasingly to avoid active engagement with social, political and moral concerns. Such questions and engagement eventually ceased to be part of the mainstream. Here we show that the eventual dominance of ‘value-free’ philosophy of science can be attributed, at least in part, to the policies of the U.S. National Science Foundation's “History and Philosophy of Science” sub-program. In turn, the sub-program's policies were set by logical empiricists who espoused value-free philosophy of science; these philosophers' actions, we also point out, fit a broad pattern, one in which analytic philosophers used institutional control to marginalize rival approaches to philosophy. We go on to draw on existing knowledge of this pattern to suggest two further, similar, contributors to the withdrawal from value-laden philosophy of science, namely decisions by the editors of Philosophy of Science and by the editors of The Journal of Philosophy. Political climate was, we argue, at most an indirect contributor to the withdrawal and was neither a factor that decided whether it occurred nor one that was sufficient to bring it about. Moreover, we argue that the actions at the National Science Foundation went beyond what was required by its senior administrators and are better viewed as part of what drove, rather than as what was being driven by, the adoption of logical empiricism by the philosophy of science community.

19.
In climate science, climate models are one of the main tools for understanding phenomena. Here, we develop a framework to assess the fitness of a climate model for providing understanding. The framework is based on three dimensions: representational accuracy, representational depth, and graspability. We show that this framework does justice to the intuition that classical process-based climate models give understanding of phenomena. While simple climate models are characterized by a larger graspability, state-of-the-art models have a higher representational accuracy and representational depth. We then compare the fitness-for-providing understanding of process-based to data-driven models that are built with machine learning. We show that at first glance, data-driven models seem either unnecessary or inadequate for understanding. However, a case study from atmospheric research demonstrates that this is a false dilemma. Data-driven models can be useful tools for understanding, specifically for phenomena for which scientists can argue from the coherence of the models with background knowledge to their representational accuracy and for which the model complexity can be reduced such that they are graspable to a satisfactory extent.

20.
We introduce a new strategy for the prediction of linear temporal aggregates; we call it ‘hybrid’ and study its performance using asymptotic theory. This scheme consists of carrying out model parameter estimation with data sampled at the highest available frequency and the subsequent prediction with data and models aggregated according to the forecasting horizon of interest. We develop explicit expressions that approximately quantify the mean square forecasting errors associated with the different prediction schemes and that take into account the estimation error component. These approximate estimates indicate that the hybrid forecasting scheme tends to outperform the so‐called ‘all‐aggregated’ approach and, in some instances, the ‘all‐disaggregated’ strategy that is known to be optimal when model selection and estimation errors are neglected. Unlike other related approximate formulas existing in the literature, those proposed in this paper are totally explicit and require neither assumptions on the second‐order stationarity of the sample nor Monte Carlo simulations for their evaluation. Copyright © 2014 John Wiley & Sons, Ltd.
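The contrast between estimating at the highest available frequency and forecasting an aggregate can be sketched loosely as follows; this is a toy illustration of the general idea, not the authors' hybrid scheme or their asymptotic formulas, and all model choices (an AR(1), a three-period sum aggregate) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# High-frequency (say, monthly) AR(1) series.
n = 1200
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()

# Estimate the AR(1) coefficient from the high-frequency data.
phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# Forecast the next aggregate (sum of the next 3 periods) by iterating
# the high-frequency fit -- estimation disaggregated, target aggregated.
last = x[-1]
agg_forecast = sum(phi ** k * last for k in (1, 2, 3))

# Contrast: an "all-aggregated" scheme fits a model to the aggregated
# series itself and forecasts with that instead.
q = x.reshape(-1, 3).sum(axis=1)
phi_q = np.dot(q[1:], q[:-1]) / np.dot(q[:-1], q[:-1])
agg_forecast_low = phi_q * q[-1]
print(agg_forecast, agg_forecast_low)
```

The abstract's claim is that, once estimation error is accounted for, schemes that estimate with the disaggregated sample tend to beat the all-aggregated one; the sketch only shows where the two schemes diverge, not their relative accuracy.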


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号