Similar Documents
20 similar documents found (search took 15 ms).
1.
I show how quantum mechanics, like the theory of relativity, can be understood as a ‘principle theory’ in Einstein's sense, and I use this notion to explore the approach to the problem of interpretation developed in my book Interpreting the Quantum World.

2.
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. Particularly, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs [(1983). Quantum logic, realism, and value-definiteness. Philosophy of Science, 50, 578], which applies the Kochen–Specker theorem to an entangled bipartite system.

3.
N. D. Mermin has proposed an “elaboration” of Einstein's 1905 derivation that supposedly fixes the flaws that I identified in this derivation. By specific examples taken from Einstein's own later work, I show that Mermin's elaboration is fraught with misconceptions.

4.
This article is about structural realism, historical continuity, laws of nature, and ceteris paribus clauses. Fresnel's Laws of optics support Structural Realism because they are a scientific structure that has survived theory change. However, the history of Fresnel's Laws as depicted in debates over realism since the 1980s is badly distorted. Specifically, claims that J. C. Maxwell or his followers believed in an ontologically subsistent electromagnetic field, and gave up the aether, before Einstein's annus mirabilis in 1905 are indefensible. Related claims that Maxwell himself did not believe in a luminiferous aether are also indefensible. This paper corrects the record. In order to trace Fresnel's Laws across significant ontological changes, they must be followed past Einstein into modern physics and nonlinear optics. I develop the philosophical implications of a more accurate history, and analyze Fresnel's Laws' historical trajectory in terms of dynamic ceteris paribus clauses. Structuralists have not embraced ceteris paribus laws, but they continue to point to Fresnel's Laws to resist anti-realist arguments from theory change. Fresnel's Laws fit the standard definition of a ceteris paribus law as a law applicable only in particular circumstances. Realists who appeal to the historical continuity of Fresnel's Laws to combat anti-realists must incorporate ceteris paribus laws into their metaphysics.

5.
The 1919 British astronomical expedition led by Arthur Stanley Eddington to observe the deflection of starlight by the sun, as predicted by Einstein's relativistic theory of gravitation, is a fascinating example of the importance of expert testimony in the social transmission of scientific knowledge. While Popper lauded the expedition as science at its best, accounts by Earman and Glymour, Collins and Pinch, and Waller are more critical of Eddington's work. Here I revisit the eclipse expedition to dispute the characterization of the British response to general relativity as the blind acceptance of a partisan's pro-relativity claims by colleagues incapable of criticism. Many factors served to make Eddington the trusted British expert on relativity in 1919, and his experimental results rested on debatable choices of data analysis, choices criticized widely since but apparently not widely by his British contemporaries. By attending to how and to whom Eddington presented his testimony and how and by whom this testimony was received, I suggest, we may recognize as evidentially significant corroborating testimony from those who were expert not in relativity but in observational astronomy. We are reminded that even extraordinary expert testimony is neither offered nor accepted entirely in an epistemic vacuum.

6.
In this paper I concentrate on the dynamic aspects of the special theory of relativity (in the non-Minkowski formalism), rather than on the kinematic part of the story as is usually done. Following the dynamic story leads to a new point of view on Poincaré's important role in the development of special relativity. Much of Poincaré's dynamic work did not enter into Einstein's 1905 theory, since Einstein was mainly occupied with kinematics. However, the dynamic part is most fundamental in the development of the special theory of relativity after 1905. In this paper I consider the main developments of relativistic dynamics, in which I demonstrate that much response to Poincaré's dynamic research can be found. I argue that Poincaré's dynamic work assisted the departure from Einstein's electrodynamic theory towards relativistic dynamics (independent of electrodynamics).

7.
In this paper, I introduce a new historical case study into the scientific realism debate. During the late-eighteenth century, the Scottish natural philosopher James Hutton made two important successful novel predictions. The first concerned granitic veins intruding from granite masses into strata. The second concerned what geologists now term “angular unconformities”: older sections of strata overlain by younger sections, the two resting at different angles, the former typically more inclined than the latter. These predictions, I argue, are potentially problematic for selective scientific realism in that constituents of Hutton's theory that would not be considered even approximately true today played various roles in generating them. The aim here is not to provide a full philosophical analysis but to introduce the case into the debate by detailing the history and showing why, at least prima facie, it presents a problem for selective realism. First, I explicate Hutton's theory. I then give an account of Hutton's predictions and their confirmations. Next, I explain why these predictions are relevant to the realism debate. Finally, I consider which constituents of Hutton's theory are, according to current beliefs, true (or approximately true), which are not (even approximately) true, and which were responsible for these successes.

8.
Hume's essay ‘Of Miracles’ has been a focus of controversy ever since its publication. The challenge to Christian orthodoxy was only too evident, but the balance-of-probabilities criterion advanced by Hume for determining when testimony justifies belief in miracles has also been a subject of contention among philosophers. The temptation for those familiar with Bayesian methodology to show that Hume's criterion determines a corresponding balance of posterior probabilities in favour of miracles is understandable, but I will argue that their attempts fail. However, I show that his criterion generates a valid form of the so-called No-Miracles Argument appealed to by modern realist philosophers, whose own presentation of it, despite their possession of the probabilistic machinery Hume himself lacked, is invalid.

9.
Efforts to trace the influence of fin de siècle neo-Kantianism on early 20th Century philosophy of science have led scholars to recognize the powerful influence on Moritz Schlick of Hermann von Helmholtz, the doyen of 19th Century physics and a leader of the zurück zu Kant movement. But Michael Friedman thinks that Schlick misunderstood Helmholtz' signature philosophical doctrine, the sign-theory of perception. Indeed, Friedman has argued that Schlick transformed Helmholtz' Kantian view of spatial intuition into an empiricist version of the causal theory of perception. However, it will be argued that, despite the key role the sign-theory played in his epistemology, Schlick thought the Kantianism in Helmholtz' thought was deeply flawed, rendered obsolete by philosophical insights which emerged from recent scientific developments. So even though Schlick embraced the sign-theory, he rejected Helmholtz' ideas about spatial intuition. In fact, like his teacher, Max Planck, Schlick generalized the sign-theory into a form of structural realism. At the same time, Schlick borrowed the method of concept-formation developed by the formalist mathematicians, Moritz Pasch and David Hilbert, and combined it with the conventionalism of Henri Poincaré. Then, to link formally defined concepts with experience, Schlick introduced his ‘method of coincidences’, similar to the ‘point-coincidences’ featured in Einstein's physics. The result was an original scientific philosophy, which owed much to contemporary scientific thinkers, but little to Kant or Kantianism.

10.
The question of the existence of gravitational stress-energy in general relativity has exercised investigators in the field since the inception of the theory. Folklore has it that no adequate definition of a localized gravitational stress-energetic quantity can be given. Most arguments to that effect invoke one version or another of the Principle of Equivalence. I argue that not only are such arguments of necessity vague and hand-waving but, worse, are beside the point and do not address the heart of the issue. Based on a novel analysis of what it may mean for one tensor to depend in the proper way on another, which, en passant, provides a precise characterization of the idea of a “geometric object”, I prove that, under certain natural conditions, there can be no tensor whose interpretation could be that it represents gravitational stress-energy in general relativity. It follows that gravitational energy, such as it is in general relativity, is necessarily non-local. Along the way, I prove a result of some interest in its own right about the structure of the associated jet bundles of the bundle of Lorentz metrics over spacetime. I conclude by showing that my results also imply that, under a few natural conditions, the Einstein field equation is the unique equation relating gravitational phenomena to spatiotemporal structure, and discuss how this relates to the non-localizability of gravitational stress-energy. The main theorem proven underlying all the arguments is considerably stronger than the standard result in the literature used for the same purposes (Lovelock's theorem of 1972): it holds in all dimensions (not only in four); it does not require an assumption about the differential order of the desired concomitant of the metric; and it has a more natural physical interpretation.

11.
12.
The ontological model framework provides a rigorous approach to address the question of whether the quantum state is ontic or epistemic. When considering only conventional projective measurements, auxiliary assumptions are always needed to prove the reality of the quantum state in the framework. For example, the Pusey–Barrett–Rudolph theorem is based on an additional preparation independence assumption. In this paper, we give a new proof of ψ-ontology in terms of protective measurements in the ontological model framework. The proof does not rely on auxiliary assumptions, and it also applies to deterministic theories such as the de Broglie–Bohm theory. In addition, we give a simpler argument for ψ-ontology beyond the framework, which is based on protective measurements and a weaker criterion of reality. The argument may also be appealing to those who favor an anti-realist view of quantum mechanics.

13.
In this paper I critically review attempts to formulate and derive the geodesic principle, which claims that free massive bodies follow geodesic paths in general relativity theory. I argue that if the principle is (canonically) interpreted as a law of motion describing the actual evolution of gravitating bodies, then it is impossible to generically apply the law to massive bodies in a way that is coherent with Einstein's field equations. Rejecting the canonical interpretation, I propose an alternative interpretation of the geodesic principle as a type of universality thesis analogous to the universality behavior exhibited in thermal systems during phase transitions.

14.
In this paper I take a sceptical view of the standard cosmological model and its variants, mainly on the following grounds: (i) The method of mathematical modelling that characterises modern natural philosophy—as opposed to Aristotle's—goes well with the analytic, piecemeal approach to physical phenomena adopted by Galileo, Newton and their followers, but it is hardly suited for application to the whole world. (ii) Einstein's first cosmological model (1917) was not prompted by the intimations of experience but by a desire to satisfy Mach's Principle. (iii) The standard cosmological model—a Friedmann–Lemaître–Robertson–Walker spacetime expanding with or without end from an initial singularity—is supported by the phenomena of redshifted light from distant sources and very nearly isotropic thermal background radiation provided that two mutually inconsistent physical theories are jointly brought to bear on these phenomena, viz. the quantum theory of elementary particles and Einstein's theory of gravity. (iv) While the former is certainly corroborated by high-energy experiments conducted under conditions allegedly similar to those prevailing in the early world, precise tests of the latter involve applications of the Schwarzschild solution or the PPN formalism for which there is no room in a Friedmann–Lemaître–Robertson–Walker spacetime.

15.
At the time of Heinrich Hertz's premature death in 1894, he was regarded as one of the leading scientists of his generation. However, the posthumous publication of his treatise in the foundations of physics, Principles of Mechanics, presents a curious historical situation. Although Hertz's book was widely praised and admired, it was also met with a general sense of dissatisfaction. Almost all of Hertz's contemporaries criticized Principles for the lack of any plausible way to construct a mechanism from the “hidden masses” that are particularly characteristic of Hertz's framework. This issue seemed especially glaring given the expectation that Hertz's work might lead to a model of the underlying workings of the ether.

In this paper I seek an explanation for why Hertz seemed so unperturbed by the difficulties of constructing such a mechanism. In arriving at this explanation, I explore how the development of Hertz's image-theory of representation framed the project of Principles. The image-theory brings with it an austere view of the “essential content” of mechanics, only requiring a kind of structural isomorphism between symbolic representations and target phenomena. I argue that bringing this into view makes clear why Hertz felt no need to work out the kinds of mechanisms that many of his readers looked for. Furthermore, I argue that a crucial role of Hertz's hypothesis of hidden masses has been widely overlooked. Far from acting as a proposal for the underlying structure of the ether, I show that Hertz's hypothesis ruled out knowledge of such underlying structure.

16.
Several authors have used the expression ‘formal asymmetry’ to characterize Einstein's method of introducing conceptual innovations. Prior to his use of formal asymmetries, however, Einstein relied upon analogy to introduce his major concepts, but without satisfactory results. He gradually refined another technique, reflection upon empirical problems, into the method of formal asymmetries, with impressive results. This historical study, based upon a textual analysis of Einstein's publications, raises a series of questions regarding the place of formal asymmetries in his work.

17.
The physiologist Claude Bernard was an important nineteenth-century methodologist of the life sciences. Here I place his thought in the context of the history of the vera causa standard, arguably the dominant epistemology of science in the eighteenth and early nineteenth centuries. Its proponents held that in order for a cause to be legitimately invoked in a scientific explanation, the cause must be shown by direct evidence to exist and to be competent to produce the effects ascribed to it. Historians of scientific method have argued that in the course of the nineteenth century the vera causa standard was superseded by a more powerful consequentialist epistemology, which also admitted indirect evidence for the existence and competence of causes. The prime example of this is the luminiferous ether, which was widely accepted, in the absence of direct evidence, because it entailed verified observational consequences and, in particular, successful novel predictions. According to the received view, the vera causa standard's demand for direct evidence of existence and competence came to be seen as an impracticable and needless restriction on the scope of legitimate inquiry into the fine structure of nature. The Mill-Whewell debate has been taken to exemplify this shift in scientific epistemology, with Whewell's consequentialism prevailing over Mill's defense of the older standard. However, Bernard's reflections on biological practice challenge the received view. His methodology marked a significant extension of the vera causa standard that made it both powerful and practicable. In particular, Bernard emphasized the importance of detection procedures in establishing the existence of unobservable entities. Moreover, his sophisticated notion of controlled experimentation permitted inferences about competence even in complex biological systems. In the life sciences, the vera causa standard began to flourish precisely around the time of its alleged abandonment.

18.
The paper takes up Bell's (1987) “Everett (?) theory” and develops it further. The resulting theory is about the system of all particles in the universe, each located in ordinary, 3-dimensional space. This many-particle system as a whole performs random jumps through 3N-dimensional configuration space – hence “Tychistic Bohmian Mechanics” (TBM). The distribution of its spontaneous localisations in configuration space is given by the Born Rule probability measure for the universal wavefunction. Contra Bell, the theory is argued to satisfy the minimal desiderata for a Bohmian theory within the Primitive Ontology framework (for which we offer a metaphysically more perspicuous formulation than is customary). TBM's formalism is that of ordinary Bohmian Mechanics (BM), without the postulate of continuous particle trajectories and their deterministic dynamics. This “rump formalism” receives, however, a different interpretation. We defend TBM as an empirically adequate and coherent quantum theory. Objections voiced by Bell and Maudlin are rebutted. The “for all practical purposes”-classical, Everettian worlds (i.e. quasi-classical histories) exist sequentially in TBM (rather than simultaneously, as in the Everett interpretation). In a temporally coarse-grained sense, they quasi-persist. By contrast, the individual particles themselves cease to persist.

19.
In The Paradox of Predictivism (2008, Cambridge University Press) I tried to demonstrate that there is an intimate relationship between predictivism (the thesis that novel predictions sometimes carry more weight than accommodations) and epistemic pluralism (the thesis that one important form of evidence in science is the judgments of other scientists). Here I respond to various published criticisms of some of the key points from Paradox from David Harker, Jarrett Leplin, and Clark Glymour. Foci include my account of predictive novelty (endorsement novelty), the claim that predictivism has two roots, the prediction per se and predictive success, and my account of why Mendeleev's predictions carried special weight in confirming the Periodic Law of the Elements.

20.
I reappraise in detail Hertz's cathode ray experiments. I show that, contrary to Buchwald's (1995) evaluation, the core experiment establishing the electrostatic properties of the rays was successfully replicated by Perrin (probably) and Thomson (certainly). Buchwald's discussion of ‘current purification’ is shown to be a red herring. My investigation of the origin of Buchwald's misinterpretation of this episode reveals that he was led astray by a focus on what Hertz ‘could do’—his experimental resources. I argue that one should focus instead on what Hertz wanted to achieve—his experimental goals. Focusing on these goals, I find that his explicit and implicit requirements for a successful investigation of the rays' properties are met by Perrin and Thomson. Thus, even by Hertz's standards, they did indeed replicate his experiment.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号