Similar Literature
 20 similar articles found.
1.
This paper presents a philosophical analysis of the 2017 Event Horizon Telescope (EHT) black hole experiment, whose published results include the first image of the supermassive black hole at the center of the galaxy M87. I first present Hacking's philosophy of experimentation, including his taxonomy of the elements of laboratory science, and I show that the EHT experiment conforms to the major elements on Hacking's list. I then describe, with the help of Galison's Philosophy of the Shadow, how the EHT Collaboration created the famous black hole image. Galison outlines three stages in the reconstruction of the image: Socio-Epistemology, Mechanical Objectivity, and a further Socio-Epistemology stage. I subsequently present my own interpretation of the reconstruction of the black hole image and discuss the fitting of models to the data. I suggest that the main method used by the EHT Collaboration to secure trust in the results of the experiment is what philosophers call the Argument from Coincidence, and I show that using this method for that purpose is problematic. I present two versions of the Argument from Coincidence, Hacking's Coincidence and Cartwright's Reproducibility, by which I analyse the EHT experiment. The same estimate of the black hole's mass is reproduced by four different procedures, and the EHT Collaboration concludes that the value it has converged upon is robust. I analyse the mass measurements with the help of Cartwright's notion of robustness, show that the EHT Collaboration construes Coincidence/Reproducibility as a form of Technological Agnosticism, and contrast this interpretation with van Fraassen's scientific agnosticism.

2.
In a recent paper, Otávio Bueno (2012) introduced a narrower understanding of Hacking's concept of styles of scientific reasoning. Although its ultimate goal is to serve a pluralist view of science, Bueno's proposal is a thought-provoking attempt at outlining a concept of style that would keep most of the original understanding's heuristic value while providing some analytical grip on the specific details of particular scientific practices. In this reply, I consider solely the latter, more proximate goal. More precisely, I assess whether or not Bueno's narrower understanding of styles could provide historians and philosophers of science with a workable unit for investigating particular transformations in scientific practices. While the author's proposal is certainly interesting overall, the usefulness of the unit it describes may be compromised by three shortcomings: (1) the extent to which the unit is meant to be narrower is indeterminate; (2) it does not improve much on the analytical capabilities of Hacking's concept; and (3) like Hacking's concept, it is rather powerless to capture the dynamical character of particular scientific practices.

3.
J. D. Trout has recently developed a new defense of scientific realism, a new version of the No Miracles Argument. I critically evaluate Trout's novel defense of realism. I argue that Trout's argument for scientific realism and the related explanation for the success of science are self-defeating. In the process of arguing against the traditional realist strategies for explaining the success of science, he inadvertently undermines his own argument.

4.
According to what I call the ‘argument from public bads’, if a researcher deceived subjects in the past, there is a chance that subjects will discount the information that a subsequent researcher provides, thus compromising the validity of the subsequent researcher's experiment. While this argument is taken to justify an existing informal ban on explicit deception in experimental economics, it can also apply to implicit deception, yet implicit deception is not banned and is sometimes used in experimental economics. Thus, experimental economists are being inconsistent when they appeal to the argument from public bads to justify banning explicit deception but not implicit deception.

5.
Most scientific realists today in one way or another confine the object of their commitment to certain components of a successful theory and thereby seek to make realism compatible with the history of theory change. Kyle Stanford calls this move by realists the strategy of selective confirmation and raises a challenge against its contemporary, reliable applicability. In this paper, I critically examine Stanford's inductive argument, which is based on past scientists' failures to identify the confirmed components of their contemporary theories. I argue that our ability to make such identifications should be evaluated on the basis of the performance of the scientific community as a whole rather than that of individual scientists, and that Stanford's challenge fails to raise a serious concern because it focuses solely on individual scientists' judgments, which are made either before the scientific community has reached a consensus or about the value of a posit as a locus for further research rather than about its confirmed status.

6.
The goal of this paper, both historical and philosophical, is to launch a new case into the scientific realism debate: geocentric astronomy. Scientific realism about unobservables claims that the non-observational content of our successful/justified empirical theories is true, or approximately true. The argument currently considered the best in favor of scientific realism is the No Miracles Argument: the predictive success of a theory that makes (novel) observational predictions while making use of non-observational content would be inexplicable unless such non-observational content approximately corresponds to the world “out there”. Laudan's pessimistic meta-induction challenged this argument, and realists reacted by moving to a “selective” version of realism: the approximately true part of the theory is not its full non-observational content but only the part of it that is responsible for the novel, successful observational predictions. Selective scientific realism has been tested against some of the theories on Laudan's list, but the first member of that list, geocentric astronomy, has traditionally been ignored. Our goal here is to argue that Ptolemy's geocentrism deserves attention and poses a prima facie strong case against selective realism, since it made several successful, novel predictions based on theoretical hypotheses that do not seem to be retained, not even approximately, by later theories. Here, though, we confine our work to a detailed reconstruction of what we take to be the main novel, successful Ptolemaic predictions, leaving the full analysis and assessment of their significance for the realist thesis to future work.

7.
Recent philosophy has paid increasing attention to the nature of the relationship between the philosophy of science and metaphysics. In The Structure of the World: Metaphysics and Representation, Steven French offers many insights into this relationship (primarily) in the context of fundamental physics, and claims that a specific, structuralist conception of the ontology of the world exemplifies an optimal understanding of it. In this paper I contend that his messages regarding how best to think about the relationship are mixed, and in tension with one another. The tension is resolvable but at a cost: a weakening of the argument for French's structuralist ontology. I elaborate this claim in a specific case: his assertion of the superiority of a structuralist account of de re modality in terms of realism about laws and symmetries (conceived ontologically) over an account in terms of realism about dispositional properties. I suggest that these two accounts stem from different stances regarding how to theorize about scientific ontology, each of which is motivated by important aspects of physics.

8.
Though Robert Boyle called final causes one of the most important subjects for a natural philosopher to study, his own treatise on the subject, the Disquisition about Final Causes, has received comparatively little scholarly attention. In this paper, I explicate Boyle's complex argument against the use of teleological explanations for inanimate bodies, such as metals. The central object of this argument is a mysterious allusion to a silver plant. I claim that the silver plant is best understood as a reference to an alchemical product, the Arbor Dianae, an offshoot of George Starkey's recipe for the Philosophers' Stone. I then show how the context of alchemy not only clarifies Boyle's argument but also places it within a wider dialectic about matter and teleology. Finally, I contrast the parallel arguments of Boyle and John Ray on the question of whether metals have divine purposes and show that the difference between them is explained by Boyle's belief in the transmutation of metals.

9.
In this paper, I introduce a new historical case study into the scientific realism debate. During the late eighteenth century, the Scottish natural philosopher James Hutton made two important successful novel predictions. The first concerned granitic veins intruding from granite masses into strata. The second concerned what geologists now term “angular unconformities”: older sections of strata overlain by younger sections, the two resting at different angles, the former typically more inclined than the latter. These predictions, I argue, are potentially problematic for selective scientific realism in that constituents of Hutton's theory that would not be considered even approximately true today played various roles in generating them. The aim here is not to provide a full philosophical analysis but to introduce the case into the debate by detailing the history and showing why, at least prima facie, it presents a problem for selective realism. First, I explicate Hutton's theory. I then give an account of Hutton's predictions and their confirmations. Next, I explain why these predictions are relevant to the realism debate. Finally, I consider which constituents of Hutton's theory are, according to current beliefs, true (or approximately true), which are not (even approximately) true, and which were responsible for these successes.

10.
Turner [The past vs. the tiny: Historical science and the abductive arguments for realism. Studies in History and Philosophy of Science 35A (2004) 1] claims that the arguments in favor of realism do not support both classes of realism with the same force, since they supply stronger reasons for experimental realism than for historical realism. I would like to make two comments, which should be seen as amplifications inspired by his proposal rather than as criticism. First, it is important to highlight that Turner’s distinction between ‘tiny’ and ‘past unobservables’ is neither mutually exclusive nor exhaustive. Second, even if we agreed with everything that Turner says regarding the arguments for realism and their relative weight in justifying the experimental or historical version, there is an aspect that Turner does not consider and that renders historical realism less problematic than experimental realism.

11.
The goal of this paper is to provide an interpretation of Feyerabend's metaphysics of science as found in late works like Conquest of Abundance and Tyranny of Science. Feyerabend's late metaphysics consists of an attempt to criticize, and to provide a systematic alternative to, traditional scientific realism, a package of views he sometimes referred to as “scientific materialism.” Scientific materialism is objectionable not only on metaphysical grounds, nor merely because it provides a poor basis for understanding science, but because it implies problematic claims about the epistemic and cultural authority of science, claims incompatible with situating science properly in democratic societies. I show how Feyerabend's metaphysical view, which I call “the abundant world” or “abundant realism,” constitutes a sophisticated and challenging form of ontological pluralism that makes interesting connections with contemporary philosophy of science and with issues concerning the political and policy role of science in a democratic society.

12.
Inferences from scientific success to the approximate truth of successful theories remain central to the most influential arguments for scientific realism. Challenges to such inferences, however, based on radical discontinuities within the history of science, have motivated a distinctive style of revision to the original argument. Conceding the historical claim, selective realists argue that accompanying even the most revolutionary change is the retention of significant parts of replaced theories, and that a realist attitude towards the systematically retained constituents of our scientific theories can still be defended. Selective realists thereby hope to secure the argument from success against apparent historical counterexamples. Independently of that objective, historical considerations have inspired a further argument for selective realism, in which evidence for the retention of parts of theories is itself offered as justification for adopting a realist attitude towards them. Given the nature of these arguments from success and from retention, a reasonable expectation is that they would complement and reinforce one another; but although several theses purport to provide such a synthesis, the results are often unconvincing. In this paper I reconsider the realist’s favoured type of scientific success, novel success, offer a revised interpretation of the concept, and argue that a significant consequence of reconfiguring the realist’s argument from success accordingly is a greater potential for its unification with the argument from retention.

13.
In his later writings Kuhn reconsidered his earlier account of incommensurability, clarifying some aspects, modifying others, and explicitly rejecting some of his earlier claims. In Kuhn’s new account incommensurability does not pose a problem for the rational evaluation of competing scientific theories, but does pose a problem for certain forms of realism. Kuhn maintains that, because of incommensurability, the notion that science might seek to learn the nature of things as they are in themselves is incoherent. I develop Kuhn’s new account of incommensurability, respond to his anti-realist argument, and sketch a form of realism in which the realist aim is a pursuable goal.

14.
The paper challenges a recent attempt by Jouni-Matti Kuukkanen to show that, since Thomas Kuhn’s philosophical standpoint can be incorporated into coherentist epistemology, it does not necessarily lead to (Thesis 1) an abandonment of rationality and rational inter-paradigm theory comparison, nor to (Thesis 2) an abandonment of convergent realism. Leaving aside the interpretation of Kuhn as a coherentist, we will show that Kuukkanen’s first thesis is not sufficiently explicated, while the second one fails entirely. With regard to Thesis 1, we argue that Kuhn’s view of inter-paradigm theory comparison allows only for what we shall dub ‘the weak notion of rationality’, and that Kuukkanen’s argument is thus acceptable only in view of such a notion. With regard to Thesis 2, we show that even if we interpret Kuhn as a coherentist, his philosophical standpoint cannot be seen as compatible with convergent realism, since Kuhn’s argument against it is not ‘ultimately empirical’, as Kuukkanen takes it to be.

15.
Kuhn argued against both the correspondence theory of truth and convergent realism. Although he likely misunderstood the nature of the correspondence theory, which he seems to have wrongly believed to be an epistemic theory, Kuhn had an important epistemic point to make. He maintained that any assessment of correspondence between beliefs and reality is impossible, and that the acceptance of beliefs and the presumption of their truthfulness therefore have to be decided on the basis of other criteria. I will show that, via Kuhn’s suggested epistemic values, specifically problem-solving, his philosophy can be incorporated into a coherentist epistemology. Further, coherentism is, in principle, compatible with convergent realism. However, an argument for increasing likeness to truth requires appropriate historical continuity. Kuhn maintained that the history of science is full of discontinuity, and therefore the historical condition of convergent realism is not satisfied.

16.
One way to reconstruct the miracle argument for scientific realism is to regard it as a statistical inference: since it is exceedingly unlikely that a false theory makes successful predictions, while it is rather likely that an approximately true theory is predictively successful, it is reasonable to infer that a predictively successful theory is at least approximately true. This reconstruction has led to the objection that the argument embodies a base rate fallacy: by focusing on successful theories one ignores the vast number of false theories, some of which will be successful by mere chance. In this paper, I shall argue that the cogency of this objection depends on the explanandum of the miracle argument. It is cogent if what is to be explained is the success of a particular theory. If, however, the explanandum of the argument is the distribution of successful predictions among competing theories, the situation is different. Since the distribution of accidentally successful predictions is independent of the base rate, it is possible to assess the base rate by comparing this distribution to the empirically found distribution of successful predictions among competing theories.
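As an illustrative aside, not part of the abstract above: the base-rate worry can be made explicit with Bayes' theorem. The symbols T (approximate truth of a theory) and S (predictive success), and the numbers in the example, are hypothetical choices of ours rather than the author's.
\[
P(T \mid S) \;=\; \frac{P(S \mid T)\,P(T)}{P(S \mid T)\,P(T) + P(S \mid \neg T)\,P(\neg T)}
\]
Even with a high likelihood of success for approximately true theories, say \(P(S \mid T) = 0.9\), and a low one for false theories, say \(P(S \mid \neg T) = 0.05\), a base rate of \(P(T) = 0.01\) gives \(P(T \mid S) \approx 0.15\): because false theories vastly outnumber true ones, most successful theories may still be false, which is the base-rate objection the abstract discusses.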

17.
Perhaps the strongest argument for scientific realism, the no-miracles argument, has been said to commit the so-called base rate fallacy. The apparent elusiveness of the base rate of true theories has even been said to undermine the rationality of the entire realism debate. On the basis of the Kuhnian picture of theory choice, I confront this challenge by arguing that a theory is likely to be true if it possesses multiple theoretical virtues and is embraced by numerous scientists, even when the base rate converges to zero.

18.
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.

19.
The selected effect account is regarded by many as one of the most attractive accounts of function. This account assumes that the function of a trait is what it has been selected for. Recently, it has been generalized by Justin Garson to include cases in which selection is understood as a simple sorting process, i.e., a selection process among entities that do not reproduce. Once extended, however, this generalized selected effect account seems to ascribe functions to entities for which doing so looks unintuitive. For instance, the hardness of rocks on a beach being differentially eroded by waves would be ascribed the function of resisting erosion. Garson provides one central argument for why, despite appearances, one should not ascribe functions in cases of such sorting processes. In this paper, I start by presenting his argument, which hinges on whether a collection of entities forms a population, and I find it wanting. I argue instead that some selection processes are evolutionarily more or less interesting and that, when a selection process is regarded as evolutionarily uninteresting, it will yield an uninteresting form of function rather than a reason for withholding the concept of function altogether.

20.
Several recent authors identify structural realism about scientific theories with the claim that the content of a scientific theory is expressible using its Ramsey sentence. Many of these authors have also argued that, so understood, the view collapses into empiricist anti-realism, since an argument originally proposed by Max Newman in a review of Bertrand Russell’s The Analysis of Matter demonstrates that Ramsey sentences are trivially satisfied and cannot make any significant claims about unobservables. In this paper I argue against both of these claims. Structural realism and Ramsey sentence realism are, in their most defensible versions, importantly different doctrines, and neither is committed to the premises required to demonstrate that they collapse into anti-realism.
