Similar Documents
20 similar documents found (search time: 62 ms)
1.
Hume's essay ‘Of Miracles’ has been a focus of controversy ever since its publication. The challenge to Christian orthodoxy was only too evident, but the balance-of-probabilities criterion advanced by Hume for determining when testimony justifies belief in miracles has also been a subject of contention among philosophers. The temptation for those familiar with Bayesian methodology to show that Hume's criterion determines a corresponding balance-of-posterior probabilities in favour of miracles is understandable, but I will argue that their attempts fail. However, I show that his criterion generates a valid form of the so-called No-Miracles Argument appealed to by modern realist philosophers, whose own presentation of it, despite their possession of the probabilistic machinery Hume himself lacked, is invalid.
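The balance-of-probabilities criterion mentioned in this abstract is often recast in Bayesian terms: testimony establishes a miracle only if the falsehood of the testimony would be more improbable than the miracle itself. A minimal numerical sketch of that recasting follows; the function and all numbers are illustrative assumptions, not part of Hume's text or the paper under discussion.

```python
# Hedged sketch: a Bayesian rendering (not Hume's own formalism) of the
# balance-of-probabilities idea. All probability values are illustrative.

def posterior_miracle(prior_m, p_testimony_given_m, p_testimony_given_not_m):
    """P(miracle | testimony) via Bayes' theorem."""
    joint_m = prior_m * p_testimony_given_m
    joint_not_m = (1 - prior_m) * p_testimony_given_not_m
    return joint_m / (joint_m + joint_not_m)

# Even highly reliable testimony (wrong only 1 time in 1000) fails to raise
# a sufficiently improbable event above even odds:
p = posterior_miracle(prior_m=1e-6,
                      p_testimony_given_m=0.999,
                      p_testimony_given_not_m=0.001)
print(p < 0.5)  # True: the prior improbability outweighs the testimony
```

On these assumed numbers the posterior stays near 0.001, illustrating why the balance tips toward a miracle only when the testimony's falsehood is itself less probable than the event it attests.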

2.
The goal of this paper, both historical and philosophical, is to launch a new case into the scientific realism debate: geocentric astronomy. Scientific realism about unobservables claims that the non-observational content of our successful/justified empirical theories is true, or approximately true. The argument that is currently considered the best in favor of scientific realism is the No Miracles Argument: the predictive success of a theory that makes (novel) observational predictions while making use of non-observational content would be inexplicable unless such non-observational content approximately corresponds to the world “out there”. Laudan's pessimistic meta-induction challenged this argument, and realists reacted by moving to a “selective” version of realism: the approximately true part of the theory is not its full non-observational content but only the part of it that is responsible for the novel, successful observational predictions. Selective scientific realism has been tested against some of the theories in Laudan's list, but the first member of this list, geocentric astronomy, has been traditionally ignored. Our goal here is to argue that Ptolemy's Geocentrism deserves attention and poses a prima facie strong case against selective realism, since it made several successful, novel predictions based on theoretical hypotheses that do not seem to be retained, not even approximately, by subsequent theories. Here, though, we confine our work to a detailed reconstruction of what we take to be the main novel, successful Ptolemaic predictions, leaving the full analysis and assessment of their significance for the realist thesis to future work.

3.
In this paper, I argue that the ultimate argument for Scientific Realism, also known as the No-Miracles Argument (NMA), ultimately fails as an abductive defence of Epistemic Scientific Realism (ESR), where (ESR) is the thesis that successful theories of mature sciences are approximately true. The NMA is supposed to be an Inference to the Best Explanation (IBE) that purports to explain the success of science. However, the explanation offered as the best explanation for success, namely (ESR), fails to yield independently testable predictions that alternative explanations for success do not yield. If this is correct, then there seems to be no good reason to prefer (ESR) over alternative explanations for success.

4.
The goal of this paper is to provide an interpretation of Feyerabend's metaphysics of science as found in late works like Conquest of Abundance and Tyranny of Science. Feyerabend's late metaphysics consists of an attempt to criticize and provide a systematic alternative to traditional scientific realism, a package of views he sometimes referred to as “scientific materialism.” Scientific materialism is objectionable not only on metaphysical grounds, or because it provides a poor basis for understanding science, but because it implies problematic claims about the epistemic and cultural authority of science, claims incompatible with situating science properly in democratic societies. I show how Feyerabend's metaphysical view, which I call “the abundant world” or “abundant realism,” constitutes a sophisticated and challenging form of ontological pluralism that makes interesting connections with contemporary philosophy of science and with issues of the political and policy role of science in a democratic society.

5.
In this essay, I examine the curved spacetime formulation of Newtonian gravity known as Newton–Cartan gravity and compare it with flat spacetime formulations. Two versions of Newton–Cartan gravity can be identified in the physics literature—a “weak” version and a “strong” version. The strong version has a constrained Hamiltonian formulation and consequently a well-defined gauge structure, whereas the weak version does not (with some qualifications). Moreover, the strong version is best compared with the structure of what Earman (World enough and spacetime. Cambridge: MIT Press) has dubbed Maxwellian spacetime. This suggests that there are also two versions of Newtonian gravity in flat spacetime—a “weak” version in Maxwellian spacetime, and a “strong” version in Neo-Newtonian spacetime. I conclude by indicating how these alternative formulations of Newtonian gravity impact the notion of empirical indistinguishability and the debate over scientific realism.

6.
J. D. Trout has recently developed a new defense of scientific realism, a new version of the No Miracles Argument. I critically evaluate Trout's novel defense of realism. I argue that Trout's argument for scientific realism and the related explanation for the success of science are self-defeating. In the process of arguing against the traditional realist strategies for explaining the success of science, he inadvertently undermines his own argument.

7.
In this paper, I introduce a new historical case study into the scientific realism debate. During the late eighteenth century, the Scottish natural philosopher James Hutton made two important successful novel predictions. The first concerned granitic veins intruding from granite masses into strata. The second concerned what geologists now term “angular unconformities”: older sections of strata overlain by younger sections, the two resting at different angles, the former typically more inclined than the latter. These predictions, I argue, are potentially problematic for selective scientific realism in that constituents of Hutton's theory that would not be considered even approximately true today played various roles in generating them. The aim here is not to provide a full philosophical analysis but to introduce the case into the debate by detailing the history and showing why, at least prima facie, it presents a problem for selective realism. First, I explicate Hutton's theory. I then give an account of Hutton's predictions and their confirmations. Next, I explain why these predictions are relevant to the realism debate. Finally, I consider which constituents of Hutton's theory are, according to current beliefs, true (or approximately true), which are not (even approximately) true, and which were responsible for these successes.

8.
In this paper I challenge and adjudicate between the two positions that have come to prominence in the scientific realism debate: deployment realism and structural realism. I discuss a set of cases from the history of celestial mechanics, including some of the most important successes in the history of science. To the surprise of the deployment realist, these are novel predictive successes toward which theoretical constituents that are now seen to be patently false were genuinely deployed. Exploring the implications for structural realism, I show that the need to accommodate these cases forces our notion of “structure” toward a dramatic depletion of logical content, threatening to render it explanatorily vacuous: the better structuralism fares against these historical examples, in terms of retention, the worse it fares in content and explanatory strength. I conclude by considering recent restrictions that serve to make “structure” more specific. I show however that these refinements will not suffice: the better structuralism fares in specificity and explanatory strength, the worse it fares against history. In light of these case studies, both deployment realism and structural realism are significantly threatened by the very historical challenge they were introduced to answer.

9.
Xenophon’s Peri hippikes <technees>, “On Horsemanship,” and the Hipparchikos <logos>, “the Cavalry Commander”, writings which can be regarded as technical works with a didactic purpose, are almost unknown. Xenophon was interested in problems of leadership and the exercise of power (For the titles of both writings and the analogous supplementary terms “technees” and “logos” compare Breitenbach, 1966, 1761. Angled brackets are used by modern editors of ancient texts as text-critical signs and are inserted here because “peri hippikes” and “hipparchikos” are adjectives and cannot stand alone; “technees” and “logos” are supplied as absent, but implied, substantives. The conventional signs “<>” clarify this supplementary act for the modern reader.). The “Cavalry Commander” (ca. 365 B.C.), intended for the instruction of potential hipparchs (cavalry commanders), is simultaneously political, didactic, and technical; in this text, categories of leadership that had been developed in earlier works are combined in a powerful manner. Xenophon constructed “leaders” as exempla (“examples”) of correct behaviour; military and political theory are synthesised. The Hipparchikos logos, as a didactic work, aims to produce specialists who master their techné (“art” or “skill”), to develop their ideal qualities. Both texts are directed at an Athenian society in which the cavalry had lost its significance and pride as a result of recent political turbulence. Xenophon hoped to reform the cavalry for the benefit of Athens. The basic question of the didactic work is: how can I become the best hipparch, how can I go beyond simply filling the office, and instead develop it for the well-being of the polis and thus serve the city? The art of leadership consists in dealing with subordinates in such a manner that they obey and follow voluntarily—still an innovative and modern approach today.

10.
In The Theory of Relativity and A Priori Knowledge (1920b), Reichenbach developed an original account of cognition as coordination of formal structures to empirical ones. One of the most salient features of this account is that it is explicitly not a top-down type of coordination, and in fact it is crucially “directed” by the empirical side. Reichenbach called this feature “the mutuality of coordination” but, in that work, did not elaborate sufficiently on how this is supposed to work. In a paper that he wrote less than two years afterwards (but that he published only in 1932), “The Principle of Causality and the Possibility of its Empirical Confirmation” (1923/1932), he described what seems to be a model for this idea, now within an analysis of causality that results in an account of scientific inference. Recent reassessments of his early proposal do not seem to capture the extent of Reichenbach's original worries. The present paper analyses Reichenbach's early account and suggests a new way to look at his early work. According to it, we perform measurements, individuate parameters, collect and analyse data, by using a “constructive” approach, such as the one with which we formulate and test hypotheses, which paradigmatically requires some simplicity assumptions. Reichenbach's attempt to account for all these aspects in 1923 was obviously limited and naive in many ways, but it shows that, in his view, there were multiple ways in which the idea of “constitution” is embodied in scientific practice.

11.
We provide a novel perspective on “regularity” as a property of representations of the Weyl algebra. We first critique a proposal by Halvorson [2004, “Complementarity of representations in quantum mechanics”, Studies in History and Philosophy of Modern Physics 35 (1), pp. 45–56], who argues that the non-regular “position” and “momentum” representations of the Weyl algebra demonstrate that a quantum mechanical particle can have definite values for position or momentum, contrary to a widespread view. We show that there are obstacles to such an interpretation of non-regular representations. In Part II, we propose a justification for focusing on regular representations, pace Halvorson, by drawing on algebraic methods.

12.
We provide a novel perspective on “regularity” as a property of representations of the Weyl algebra. In Part I, we critiqued a proposal by Halvorson [2004, “Complementarity of representations in quantum mechanics”, Studies in History and Philosophy of Modern Physics 35 (1), pp. 45–56], who advocates for the use of the non-regular “position” and “momentum” representations of the Weyl algebra. Halvorson argues that the existence of these non-regular representations demonstrates that a quantum mechanical particle can have definite values for position or momentum, contrary to a widespread view. In this sequel, we propose a justification for focusing on regular representations, pace Halvorson, by drawing on algebraic methods.

13.
This paper critically assesses the proposal that scientific realists do not need to search for a solution of the measurement problem in quantum mechanics, but should instead dismiss the problem as ill-posed. James Ladyman and Don Ross have sought to support this proposal with arguments drawn from their naturalized metaphysics and from a Bohr-inspired approach to quantum mechanics. I show that the first class of arguments is unsuccessful, because formulating the measurement problem does not depend on the metaphysical commitments which are undermined by ontic structural realism, rainforest realism, or naturalism in general. The second class of arguments is problematic due to its refusal to provide an analysis of the term “measurement”. It turns out that the proposed dissolution of the measurement problem is in conflict not only with traditional forms of scientific realism but even with the rather minimal realism that Ladyman and Ross themselves defend. The paper concludes with a brief discussion of two related proposals: Healey's pragmatist approach and Bub's information-theoretic interpretation.

14.
It is widely acknowledged that the patient's perspective should be considered when making decisions about how her care will be managed. Patient participation in the decision making process may play an important role in bringing to light and incorporating her perspective. The GRADE framework is touted as an evidence-based process for determining recommendations for clinical practice; i.e. determining how care ought to be managed. GRADE recommendations are categorized as “strong” or “weak” based on several factors, including the “values and preferences” of a “typical” patient. The strength of the recommendation also provides instruction to the clinician about when and how patients should participate in the clinical encounter, and thus whether an individual patient's values and preferences will be heard in her clinical encounter. That is, a “strong” recommendation encourages “paternalism” and a “weak” recommendation encourages shared decision making. We argue that adoption of the GRADE framework is problematic for patient participation and may result in care that is not respectful of the individual patient's values and preferences. We argue that the root of the problem is the conception of “values and preferences” in GRADE – the framework favours population thinking (e.g. “typical” patient “values and preferences”), despite the fact that “values and preferences” are individual in the sense that they are deeply personal. We also show that tying the strength of a recommendation to a model of decision making (paternalism or shared decision making) constrains patient participation and is not justified (theoretically and/or empirically) in the GRADE literature.

15.
This paper investigates how and when pairs of terms such as “local–global” and “im Kleinen–im Grossen” began to be used by mathematicians as explicit reflexive categories. A first phase of automatic search led to the delineation of the relevant corpus, and to the identification of the period from 1898 to 1918 as that of emergence. The emergence appears to have been, from the very start, both transdisciplinary (function theory, calculus of variations, differential geometry) and international, although the AMS-Göttingen connection played a specific part. First used as an expository and didactic tool (e.g. by Osgood), the distinction soon played a crucial part in the creation of new mathematical concepts (e.g. in Hahn’s work), in the shaping of research agendas (e.g. Blaschke’s global differential geometry), and in Weyl’s axiomatic foundation of the manifold concept. We finally turn to France, where in the 1910s, in the wake of Poincaré’s work, Hadamard began to promote a research agenda in terms of “passage du local au général.”

16.
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—that we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” (Weyl, 1931a, p. xxi) in terms of a correspondence between the ontological categories of identity and determinateness.

17.
The paper takes up Bell's (1987) “Everett (?) theory” and develops it further. The resulting theory is about the system of all particles in the universe, each located in ordinary, 3-dimensional space. This many-particle system as a whole performs random jumps through 3N-dimensional configuration space – hence “Tychistic Bohmian Mechanics” (TBM). The distribution of its spontaneous localisations in configuration space is given by the Born Rule probability measure for the universal wavefunction. Contra Bell, the theory is argued to satisfy the minimal desiderata for a Bohmian theory within the Primitive Ontology framework (for which we offer a metaphysically more perspicuous formulation than is customary). TBM's formalism is that of ordinary Bohmian Mechanics (BM), without the postulate of continuous particle trajectories and their deterministic dynamics. This “rump formalism” receives, however, a different interpretation. We defend TBM as an empirically adequate and coherent quantum theory. Objections voiced by Bell and Maudlin are rebutted. The “for all practical purposes”-classical, Everettian worlds (i.e. quasi-classical histories) exist sequentially in TBM (rather than simultaneously, as in the Everett interpretation). In a temporally coarse-grained sense, they quasi-persist. By contrast, the individual particles themselves cease to persist.

18.
This paper reconsiders the challenge presented to scientific realism by the semantic incommensurability thesis. A twofold distinction is drawn between methodological and semantic incommensurability, and between semantic incommensurability due to variation of sense and due to discontinuity of reference. Only the latter presents a challenge to scientific realism. The realist may dispose of this challenge on the basis of a modified causal theory of reference, as argued in the author’s 1994 book, The Incommensurability Thesis. This referential response has been the subject of a charge of meta-incommensurability by Hoyningen-Huene et al. (1996), who argue that the realist’s referential response begs the question against anti-realist advocates of incommensurability. In reply, it is noted that a tu quoque rejoinder is available to the realist. It is also argued that the dialectical situation favours the scientific realist, since the anti-realist defence of incommensurability depends on an incoherent distinction between phenomenal world and world-in-itself. In light of such incoherence, and a strong commonsense presumption in favour of realism, the referential response to semantic incommensurability may be justifiably based on realism.

19.
Over the last two decades structural realism has been given progressively more elaborated formulations. Steven French has been at the forefront of the development of the most conceptually sophisticated and historically sensitive version of the view. In his book, The Structure of the World (French, 2014), French shows how structural realism, the view according to which structure is all there is (ontic structural realism), is able to illuminate central issues in the philosophy of science: underdetermination, scientific representation, dispositions, natural modality, and laws of nature. The discussion consistently sheds novel light on the problems under consideration while developing insightful and provocative views. In this paper, I focus on the status of mathematics within French's ontic structural realism, and I raise some concerns about its proper understanding vis-à-vis the realist components of the view.

20.
Naturalized metaphysics remains the default presupposition of much contemporary philosophy of physics. As metaphysics is supposed to concern the general structure of reality, so scientific naturalism draws upon our best physical theories to attempt to answer the foundational question par excellence, viz., “how could the world possibly be the way this theory says it is?” A particular case study, Hilbert's attempt to analyze and explain a seeming “pre-established harmony” between mind and nature, is offered as a salutary reminder that naturalism's ready inference from physical theory to ontology may be too quick.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号