Similar Literature
20 similar documents found
1.
Quantum mechanics is a theory whose foundations spark controversy to this day. Although many attempts to explain the underpinnings of the theory have been made, none has been unanimously accepted as satisfactory. Fuchs has recently claimed that the foundational issues can be resolved by interpreting quantum mechanics in the light of quantum information. The view proposed is that quantum mechanics should be interpreted along the lines of the subjective Bayesian approach to probability theory. The quantum state is not the physical state of a microscopic object. It is an epistemic state of an observer; it represents subjective degrees of belief about outcomes of measurements. The interpretation gives an elegant solution to the infamous measurement problem: measurement is nothing but Bayesian belief updating, in analogy to belief updating in a classical setting. In this paper, we analyze an argument that Fuchs gives in support of this latter claim. We suggest that the argument is not convincing since it rests on an ad hoc construction. We close with some remarks on the options left for Fuchs’ quantum Bayesian project.
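The analogy at issue can be illustrated schematically (our gloss, not Fuchs' own derivation): classical Bayesian conditionalization and the projective state-change rule take structurally similar forms,
\[ P(h \mid d) = \frac{P(d \mid h)\,P(h)}{P(d)} \qquad\longleftrightarrow\qquad \rho \;\mapsto\; \frac{P_i\,\rho\,P_i}{\mathrm{Tr}(P_i\,\rho)} , \]
where the left-hand side conditions a prior on data d and the right-hand side updates a quantum state on measurement outcome i; the question the paper presses is whether this formal resemblance suffices to license the epistemic reading of measurement.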

2.
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, a market that extends far beyond the community of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.
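For orientation (the abstract does not spell this out), the Kohn–Sham formulation of DFT replaces the interacting-electron problem by self-consistent single-particle equations for an effective potential that is a functional of the electron density:
\[ \Big(-\tfrac{\hbar^2}{2m}\nabla^2 + v_{\mathrm{eff}}[n](\mathbf{r})\Big)\,\phi_i(\mathbf{r}) = \varepsilon_i\,\phi_i(\mathbf{r}), \qquad n(\mathbf{r}) = \sum_{i=1}^{N} |\phi_i(\mathbf{r})|^2 , \]
with the effective potential comprising external, Hartree, and exchange–correlation contributions. It is the comparatively low cost of solving these equations numerically that made DFT central to the computational turn described above.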

3.
I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a “multiverse”, and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.
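For reference (the standard formulation, not quoted from the paper), the D-CTC model fixes the state on the closed timelike curve by a self-consistency condition:
\[ \rho_{\mathrm{CTC}} = \mathrm{Tr}_{\mathrm{CR}}\!\left[\, U \left(\rho_{\mathrm{CR}} \otimes \rho_{\mathrm{CTC}}\right) U^{\dagger} \right] , \]
where ρ_CR is the state of the chronology-respecting system entering the interaction region and U the unitary describing the interaction. The paper's claim concerns how this mixed-state fixed point is to be interpreted: allegedly it requires a multiverse of parallel interacting worlds rather than standard Everettian branching.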

4.
Earman (2018) has recently argued that the Principal Principle, a principle of rationality connecting objective chance and credence, is a theorem of quantum probability theory. This paper critiques Earman's argument, while also offering a positive proposal for how to understand the status of the Principal Principle in quantum probability theory.
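In its usual schematic form (Lewis's formulation, paraphrased here rather than taken from the paper), the Principal Principle requires that
\[ \mathrm{Cr}\big(A \mid \mathrm{Ch}(A) = x \wedge E\big) = x , \]
i.e. a rational agent's credence in A, given that the objective chance of A is x together with any admissible evidence E, should equal x. Earman's claim is that an analogue of this constraint is derivable within quantum probability theory, and it is this claim that the paper critiques.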

5.
6.
I propose a general geometric framework in which to discuss the existence of time observables. This framework allows one to describe a local sense in which time observables always exist, and a global sense in which they can sometimes exist subject to a restriction on the vector fields that they generate. Pauli's prohibition on quantum time observables is derived as a corollary to this result. I will then discuss how time observables can be regained in modest extensions of quantum theory beyond its standard formulation.
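Pauli's prohibition, in its textbook form (the paper derives it as a corollary of a more general geometric result), runs roughly as follows: a self-adjoint time operator T canonically conjugate to the Hamiltonian,
\[ [\,T, H\,] = i\hbar , \]
would generate energy translations, \( e^{-i\varepsilon T/\hbar}\, H\, e^{i\varepsilon T/\hbar} = H + \varepsilon \), forcing the spectrum of H to be the entire real line and thereby contradicting the boundedness of energy from below.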

7.
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful however in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. 
However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.
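The particle-horizon problem alluded to above can be stated compactly (standard cosmology, not specific to the paper): the proper distance light can have travelled since the initial time is
\[ d_{\mathrm{hor}}(t) = a(t)\int_{0}^{t} \frac{c\,\mathrm{d}t'}{a(t')} , \]
and in decelerating (radiation- or matter-dominated) expansion this distance is small at recombination, so widely separated patches of the CMB sky were never in causal contact. This is what makes their observed uniformity appear to demand explanation, whether dynamically (as in inflation) or, as the paper suggests, via a lawlike initial condition.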

8.
In this paper I critically evaluate the justification of the von Neumann–Lüders projection postulate for state changes in projective measurement contexts from the objective quantum Bayesian perspective. I point out that the justification provided so far is insufficient. I argue that the best way to correct this problem is to make an assumption, Benign Realism, which contradicts the objective quantum Bayesian interpretation of quantum states.
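The postulate at issue, stated schematically (standard textbook form), is that on obtaining outcome i of a projective measurement with projectors {P_i} the state is updated as
\[ \rho \;\mapsto\; \frac{P_i\,\rho\,P_i}{\mathrm{Tr}(P_i\,\rho)} , \]
and the question raised in the paper is whether an objective quantum Bayesian can justify this rule without tacitly assuming something like the Benign Realism described above.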

9.
10.
In previous work, a non-standard theory of probability was formulated and used to systematize interference effects involving the simplest type of quantum systems. The main result here is a self-contained, non-trivial generalization of that theory to capture interference effects involving a much broader range of quantum systems. The discussion also focuses on interpretive matters having to do with the actual/virtual distinction, non-locality, and conditional probabilities.

11.
This article presents a discussion of the notion of quantum ontological excess baggage, first proposed by Hardy. It is argued here that this idea does not have the significance suggested in his paper. It is shown that if this concept is properly analyzed, it fails to pose any threat to the ontic approach to quantum theory in general.

12.
Motivated by the question of what it is that makes quantum mechanics a holistic theory (if it is one), I try to define for general physical theories what we mean by 'holism'. For this purpose I propose an epistemological criterion to decide whether or not a physical theory is holistic, namely: a physical theory is holistic if and only if it is impossible in principle to infer the global properties, as assigned in the theory, by local resources available to an agent. I propose that these resources include at least all local operations and classical communication. This approach is contrasted with the well-known approaches to holism in terms of supervenience. The criterion for holism proposed here involves a shift in emphasis from ontology to epistemology. I apply this epistemological criterion to classical physics and Bohmian mechanics as represented on phase space and configuration space respectively, and to quantum mechanics (in the orthodox interpretation) using the formalism of general quantum operations as completely positive trace non-increasing maps. Furthermore, I provide an interesting example from which one can conclude that quantum mechanics is holistic in the above mentioned sense, although, perhaps surprisingly, no entanglement is needed.
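The quantum operations mentioned here take, in the standard Kraus form (assumed rather than quoted from the paper), the shape
\[ \Lambda(\rho) = \sum_{k} K_{k}\,\rho\,K_{k}^{\dagger}, \qquad \sum_{k} K_{k}^{\dagger} K_{k} \le I , \]
where the inequality expresses that the map is trace non-increasing (with equality for deterministic operations); local operations and classical communication are then built from such maps applied to the subsystems accessible to the agent.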

13.
The radiation that is due to the braking of charged particles has been a focus of theoretical physics since the discovery of X-rays at the end of the 19th century. The impact of cathode rays in the anti-cathode of an X-ray tube that resulted in the production of X-rays led to the view that X-rays are aether impulses spreading from the site of the impact. In 1909, Arnold Sommerfeld calculated from Maxwell's equations the angular distribution of electromagnetic radiation due to the braking of electrons. He thereby coined the notion of “Bremsstrahlen.” In 1923, Hendrik A. Kramers provided a quantum theoretical explanation of this process by means of Bohr's correspondence principle. With the advent of quantum mechanics the theory of bremsstrahlung became a target of opportunity for theorists like Yoshikatsu Sugiura, Robert Oppenheimer, and – again – Sommerfeld, who presented in 1931 a comprehensive treatise on this subject. Throughout the 1930s, Sommerfeld's disciples in Munich and elsewhere extended and improved the bremsstrahlen theory. Hans Bethe and Walter Heitler, in particular, in 1934 presented a theory that was later regarded as “the most important achievement of QED in the 1930s” (Freeman Dyson). From a historical perspective the bremsstrahlen problem may be regarded as a probe for the evolution of theories in response to revolutionary changes in the underlying principles.

14.
The apparent dichotomy between quantum jumps on the one hand, and continuous time evolution according to wave equations on the other hand, provided a challenge to Bohr's proposal of quantum jumps in atoms. Furthermore, Schrödinger's time-dependent equation also seemed to require a modification of the explanation for the origin of line spectra due to the apparent possibility of superpositions of energy eigenstates for different energy levels. Indeed, Schrödinger himself proposed a quantum beat mechanism for the generation of discrete line spectra from superpositions of eigenstates with different energies.

However, these issues between old quantum theory and Schrödinger's wave mechanics were correctly resolved only after the development and full implementation of photon quantization. The second quantized scattering matrix formalism reconciles quantum jumps with continuous time evolution through the identification of quantum jumps with transitions between different sectors of Fock space. The continuous evolution of quantum states is then recognized as a sum over continually evolving jump amplitudes between different sectors in Fock space.

In today's terminology, this suggests that linear combinations of scattering matrix elements are epistemic sums over ontic states. Insights from the resolution of the dichotomy between quantum jumps and continuous time evolution therefore hold important lessons for modern research both on interpretations of quantum mechanics and on the foundations of quantum computing. They demonstrate that discussions of interpretations of quantum theory necessarily need to take into account field quantization. They also demonstrate the limitations of the role of wave equations in quantum theory, and caution us that superpositions of quantum states for the formation of qubits may be more limited than usually expected.
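Schrödinger's quantum-beat mechanism can be illustrated schematically (a standard two-level example, not taken from the paper): for a superposition
\[ \psi(t) = c_{1}\,e^{-iE_{1}t/\hbar}\,u_{1} + c_{2}\,e^{-iE_{2}t/\hbar}\,u_{2} , \]
the expectation value of any observable connecting the two stationary states oscillates at the Bohr frequency (E_1 − E_2)/ħ, which is how a continuously evolving wave was supposed to yield the discrete spectral line rather than a quantum jump.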

15.
I argue that the key principle of microgravity is what I have called elsewhere the Lorentzian strategy. This strategy may be seen as either a reverse-engineering approach or a descent-with-modification approach, but however one sees it, the method works neither by attempting to propound a theory that is the quantum version of either an extant or generalized gravitation theory nor by attempting to propound a theory that is the final version of quantum mechanics and finding gravity within it. Instead the method works by beginning with what we are pretty sure is a good approximation to the low-energy limit of whatever the real microprocesses are that generate what we experience as gravitation. This method is powerful, fruitful, and not committed to principles for which we have, as yet, only scant evidence; the method begins with what we do know and teases out what we can know next. The principle is methodological, not ontological.

16.
Fritz London's seminal idea of “quantum mechanisms of macroscopic scale”, first articulated in 1946, was the unanticipated result of two decades of research, during which London pursued quantum-mechanical explanations of various kinds of systems of particles at different scales. He started at the microphysical scale with the hydrogen molecule, generalized his approach to chemical bonds and intermolecular forces, then turned to macrophysical systems like superconductors and superfluid helium. Along this path, he formulated a set of concepts—the quantum mechanism of exchange, the rigidity of the wave function, the role of quantum statistics in multi-particle systems, the possibility of order in momentum space—that eventually coalesced into a new conception of systems of equal particles. In particular, it was London's clarification of Bose-Einstein condensation that enabled him to formulate the notion of superfluids, and led him to the recognition that quantum mechanics was not, as it was commonly assumed, relevant exclusively as a micromechanics.
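The result London clarified can be recalled in the standard ideal-Bose-gas form (a reminder, not a quotation from the paper): condensation sets in when the thermal de Broglie wavelength becomes comparable to the interparticle spacing,
\[ n\,\lambda_T^{3} = \zeta(3/2) \approx 2.612, \qquad \lambda_T = \frac{h}{\sqrt{2\pi m k_B T}} , \]
which, evaluated at the density of liquid helium, gives a transition temperature of the same order as the observed λ-point, the kind of quantitative coincidence that encouraged London to treat superfluidity as a quantum mechanism of macroscopic scale.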

17.
Can we explain the laws of thermodynamics, in particular the irreversible increase of entropy, from the underlying quantum mechanical dynamics? Attempts based on classical dynamics have all failed. Albert (1994a,b; 2000) proposed a way to recover thermodynamics on a purely dynamical basis, using the quantum theory of the collapse of the wavefunction of Ghirardi, Rimini and Weber (1986). In this paper we propose an alternative way to explain thermodynamics within no-collapse interpretations of quantum mechanics. Our approach relies on the standard quantum mechanical models of environmental decoherence of open systems, e.g. Joos and Zeh (1985) and Zurek and Paz (1994).
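The decoherence models referred to can be sketched schematically (the generic behaviour, not the specific models of the cited papers): interaction with the environment suppresses interference between pointer states, so that in the pointer basis
\[ \rho_{ij}(t) \;\approx\; \rho_{ij}(0)\,e^{-t/\tau_{D}} \qquad (i \neq j), \]
with the decoherence time τ_D typically many orders of magnitude shorter than relaxation times for macroscopic systems; it is this effectively irreversible loss of phase coherence that the approach described above draws on within a no-collapse interpretation.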

18.
Operational frameworks are very useful for studying the foundations of quantum mechanics, and are sometimes used to promote antirealist attitudes towards the theory. The aim of this paper is to review three arguments in defence of an antirealist reading of quantum physics, based on various developments of standard quantum mechanics appealing to notions such as quantum information, non-causal correlations and indefinite causal orders. Those arguments will be discussed in order to show that they are not convincing. Instead, it is argued that no argument based solely on features of a formalism could favour realist or antirealist attitudes towards quantum mechanics. In particular, both realist and antirealist views can be accommodated within operational formulations of the theory. The reason for this is that the realist/antirealist debate is located at a purely epistemic level, which is not engaged by formal aspects of theories. As such, operational formulations of quantum mechanics are epistemologically and ontologically neutral. This discussion aims at clarifying the limits of the historical and methodological affinities between scientific antirealism and operational physics while engaging with recent discoveries in quantum foundations. It also aims at presenting various realist strategies to account for those developments.

19.
Claims that the standard procedure for testing scientific theories is inapplicable to Everettian quantum theory, and hence that the theory is untestable, are due to misconceptions about probability and about the logic of experimental testing. Refuting those claims by correcting those misconceptions leads to an improved theory of scientific methodology (based on Popper's) and testing, which allows various simplifications, notably the elimination of everything probabilistic from the methodology (‘Bayesian’ credences) and from fundamental physics (stochastic processes).

20.
Despite remarkable efforts, it remains notoriously difficult to equip quantum theory with a coherent ontology. Hence, Healey (2017, 12) has recently suggested that “quantum theory has no physical ontology and states no facts about physical objects or events”, and Fuchs et al. (2014, 752) similarly hold that “quantum mechanics itself does not deal directly with the objective world”. While intriguing, these positions either raise the question of how talk of ‘physical reality’ can even remain meaningful, or they must ultimately embrace a hidden-variables view, in tension with their original project. I here offer a neo-Kantian alternative. In particular, I will show how constitutive elements in the sense of Reichenbach (1920) and Friedman (1999, 2001) can be identified within quantum theory, through considerations of symmetries that allow the constitution of a ‘quantum reality’, without invoking any notion of a radically mind-independent reality. The resulting conception will inherit elements from pragmatist and ‘QBist’ approaches, but also differ from them in crucial respects. Furthermore, going beyond the Friedmanian program, I will show how non-fundamental and approximate symmetries can be relevant for identifying constitutive principles.
