Similar Articles
20 similar articles found (search time: 406 ms)
1.
In a previous paper [Hemmo, M., & Shenker, O. (2003). Quantum decoherence and the approach to equilibrium I. Philosophy of Science, 70, 330–358] we discussed a recent proposal by Albert [(2000). Time and chance. Cambridge, MA: Harvard University Press, Chapter 7] to recover thermodynamics on a purely dynamical basis, using the quantum theory of the collapse of the quantum state of Ghirardi, Rimini, and Weber [(1986). Unified dynamics for microscopic and macroscopic systems. Physical Review D, 34, 470–479]. We proposed an alternative way to explain thermodynamics within no-collapse interpretations of quantum mechanics. In this paper some difficulties faced by both approaches are discussed and solved: the spin echo experiments, and the problem of extremely light gases. In these contexts, we point out several ways in which the above quantum mechanical approaches, as well as some other classical approaches to the foundations of statistical mechanics, may be distinguished experimentally.

2.
Can we explain the laws of thermodynamics, in particular the irreversible increase of entropy, from the underlying quantum mechanical dynamics? Attempts based on classical dynamics have all failed. Albert (1994a,b; 2000) proposed a way to recover thermodynamics on a purely dynamical basis, using the quantum theory of the collapse of the wavefunction of Ghirardi, Rimini and Weber (1986). In this paper we propose an alternative way to explain thermodynamics within no-collapse interpretations of quantum mechanics. Our approach relies on the standard quantum mechanical models of environmental decoherence of open systems, e.g. Joos and Zeh (1985) and Zurek and Paz (1994).
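As a toy illustration of the kind of environmental decoherence these models describe (this is the standard phenomenological picture, not a calculation from the paper; the exponential decay time τ is an illustrative assumption), the off-diagonal terms of a system's reduced density matrix are suppressed while the populations survive:

```python
import math

def decohere(rho, t, tau):
    """Suppress the off-diagonal (coherence) terms of a 2x2 density
    matrix by exp(-t/tau), leaving the populations untouched -- a
    phenomenological sketch of environment-induced decoherence."""
    d = math.exp(-t / tau)
    return [[rho[0][0], rho[0][1] * d],
            [rho[1][0] * d, rho[1][1]]]

def purity(rho):
    """Tr(rho^2): 1 for a pure state, 1/2 for a maximally mixed qubit."""
    return sum(rho[i][j] * rho[j][i] for i in range(2) for j in range(2))

# The pure superposition |+><+| has maximal coherences.
plus = [[0.5, 0.5],
        [0.5, 0.5]]

# After many decoherence times the state is effectively the classical
# mixture diag(1/2, 1/2): purity drops from 1 toward 1/2.
mixed = decohere(plus, t=50.0, tau=1.0)
```

The irreversibility at issue in the abstract shows up here only by fiat: the damping factor is put in by hand, which is precisely why deriving it from unitary dynamics is contentious.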

3.
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|² from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities, on the grounds that it courts circularity. Deutsch and Wallace assume that the many-worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion, pioneered by Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401; (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113], that classical information is quantum information that can proliferate in the environment—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision-theoretic approach to quantum probability.
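For reference, the rule these derivations target is simply p_k = |ψ_k|². A minimal numerical sketch (the two-branch state below is an illustrative assumption, not an example from any of the cited papers):

```python
import math

def born_probabilities(amplitudes):
    """Born rule: p_k = |psi_k|^2 for a normalized state vector."""
    probs = [abs(c) ** 2 for c in amplitudes]
    # Guard against unnormalized input rather than silently renormalizing.
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("state vector is not normalized")
    return probs

# An equal superposition of two branches: each outcome has probability 1/2,
# independent of the relative phase between the amplitudes.
psi = [1 / math.sqrt(2), 1j / math.sqrt(2)]
probs = born_probabilities(psi)
```

The circularity worry in the abstract is about justifying this squared-modulus rule, not computing it; the computation itself is trivial once the rule is granted.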

4.
This paper examines the interweaving of the history of quantum decoherence and the interpretation problem in quantum mechanics through the work of two physicists—H. Dieter Zeh and Wojciech Zurek. In the early 1970s Zeh anticipated many of the important concepts of decoherence, framing it within an Everett-type interpretation. Zeh has since remained committed to this view; however, Zurek, whose papers in the 1980s were crucial in the treatment of the preferred basis problem and the subsequent development of density matrix formalism, has argued that decoherence leads to what he terms the ‘existential interpretation’, compatible with certain aspects of both Everett's relative-state formulation and Bohr's ‘Copenhagen interpretation’. I argue that these different interpretations can be traced back to the different early approaches to the study of environment-induced decoherence in quantum systems, evident in the early work of Zeh and Zurek. I also show how Zurek's work has contributed to the tendency to see decoherence as contributing to a ‘new orthodoxy’ or a reconstruction of the original Copenhagen interpretation.

5.
We discuss the meaning of probabilities in the many worlds interpretation of quantum mechanics. We start by presenting very briefly the many worlds theory, how the problem of probability arises, and some unsuccessful attempts to solve it in the past. Then we criticize a recent attempt by Deutsch to derive the quantum mechanical probabilities from the non-probabilistic parts of quantum mechanics and classical decision theory. We further argue that the Born probability does not make sense even as an additional probability rule in the many worlds theory. Our conclusion is that the many worlds theory fails to account for the probabilistic statements of standard (collapse) quantum mechanics.

6.
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions (‘forwards’ and ‘backwards’) within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

7.
Typical worlds     
Hugh Everett III presented pure wave mechanics, sometimes referred to as the many-worlds interpretation, as a solution to the quantum measurement problem. While pure wave mechanics is an objectively deterministic physical theory with no probabilities, Everett sought to show how the theory might be understood as making the standard quantum statistical predictions as appearances to observers who were themselves described by the theory. We will consider his argument and how it depends on a particular notion of branch typicality. We will also consider responses to Everett and the relationship between typicality and probability. The suggestion will be that pure wave mechanics requires a number of significant auxiliary assumptions in order to make anything like the standard quantum predictions.

8.
According to Zurek, decoherence is a process resulting from the interaction between a quantum system and its environment; this process singles out a preferred set of states, usually called the “pointer basis”, that determines which observables will receive definite values. This means that decoherence leads to a sort of selection which precludes all except a small subset of the states in the Hilbert space of the system from behaving in a classical manner: environment-induced superselection—einselection—is a consequence of the process of decoherence. The aim of this paper is to present a new approach to decoherence, different from the mainstream approach of Zurek and his collaborators. We will argue that this approach offers conceptual advantages over the traditional one when problems of foundations are considered; in particular, from the new perspective, decoherence in closed quantum systems becomes possible and the preferred basis acquires a well-founded definition.

9.
Quantum mechanics is a theory whose foundations spark controversy to this day. Although many attempts to explain the underpinnings of the theory have been made, none has been unanimously accepted as satisfactory. Fuchs has recently claimed that the foundational issues can be resolved by interpreting quantum mechanics in the light of quantum information. The view proposed is that quantum mechanics should be interpreted along the lines of the subjective Bayesian approach to probability theory. The quantum state is not the physical state of a microscopic object. It is an epistemic state of an observer; it represents subjective degrees of belief about outcomes of measurements. The interpretation gives an elegant solution to the infamous measurement problem: measurement is nothing but Bayesian belief updating, in analogy to belief updating in a classical setting. In this paper, we analyze an argument that Fuchs gives in support of this latter claim. We suggest that the argument is not convincing since it rests on an ad hoc construction. We close with some remarks on the options left for Fuchs’ quantum Bayesian project.

10.
It is part of information theory folklore that, while quantum theory prohibits the generic (or universal) cloning of states, such cloning is allowed by classical information theory. Indeed, many take the phenomenon of no-cloning to be one of the features that distinguishes quantum mechanics from classical mechanics. In this paper, we argue that, pace conventional wisdom, in the case where one does not include a machine system, there is an analog of the no-cloning theorem for classical systems. However, upon adjoining a non-trivial machine system (or ancilla), one finds that, in contrast to the quantum case, the obstruction to cloning disappears for pure states. We begin by discussing some conceptual points and category-theoretic generalities having to do with cloning, and proceed to discuss no-cloning in both the case of (non-statistical) classical mechanics and classical statistical mechanics.
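The linearity argument behind the quantum no-cloning theorem can be sketched numerically (this is the textbook argument, not the paper's category-theoretic treatment; the basis-cloning map below is an illustrative assumption):

```python
import math

def kron(u, v):
    """Tensor product of two state vectors (as flat lists)."""
    return [a * b for a in u for b in v]

def linear_cloner(psi):
    """The unique linear extension of the basis-cloning map
    |0>|0> -> |0>|0>,  |1>|0> -> |1>|1>.
    Applied to psi = a|0> + b|1>, linearity forces a|00> + b|11>."""
    a, b = psi
    return [a, 0.0, 0.0, b]

plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]

ideal = kron(plus, plus)       # what a genuine cloner would have to output
actual = linear_cloner(plus)   # what linearity actually delivers: a Bell state
```

Because `actual` differs from `ideal` on the superposition even though the map copies both basis states perfectly, no linear (hence no unitary) evolution clones arbitrary states; the paper's point is that a structurally similar obstruction appears classically once the machine system is omitted.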

11.
We make a first attempt to axiomatically formulate the Montevideo interpretation of quantum mechanics. In this interpretation, environmental decoherence is supplemented with the loss of coherence due to the use of realistic clocks to measure time, in order to solve the measurement problem. The resulting formulation is framed entirely in terms of quantum objects. Unlike in ordinary quantum mechanics, classical time only plays the role of an unobservable parameter. The formulation eliminates any privileged role of the measurement process, giving an objective definition of when an event occurs in a system.

12.
It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are riddled with problems. This paper considers theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose in some form limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. As a conclusion, the paper suggests that the irreversibility and stochasticity of statistical mechanics should be taken as a real property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wavelength, which behave quantum mechanically below this scale but classically sufficiently far beyond this scale. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.
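The thermal de Broglie wavelength invoked above is λ = h / √(2π m k_B T); a quick numerical check (helium at room temperature is an illustrative choice, not an example from the paper):

```python
import math

H = 6.62607015e-34    # Planck constant, J*s (exact, SI 2019)
K_B = 1.380649e-23    # Boltzmann constant, J/K (exact, SI 2019)

def thermal_de_broglie(mass_kg, temp_k):
    """lambda = h / sqrt(2 * pi * m * k_B * T)."""
    return H / math.sqrt(2 * math.pi * mass_kg * K_B * temp_k)

# A helium atom (~6.65e-27 kg) at 300 K: lambda comes out at a few tens
# of picometres -- far smaller than the mean interatomic spacing of a
# dilute gas, which is why such wave packets can behave classically
# beyond that scale.
lam = thermal_de_broglie(6.6465e-27, 300.0)
```

Lighter atoms or lower temperatures enlarge λ, which is why the "extremely light gases" mentioned in item 1 above are the awkward case for these accounts.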

13.
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. In particular, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs [(1983). Quantum logic, realism, and value-definiteness. Philosophy of Science, 50, 578], which applies the Kochen–Specker theorem to an entangled bipartite system.

14.
Here we investigate what it might mean for a formulation of quantum mechanics to be empirically adequate. We begin by considering the measurement problem as an empirical problem and distinguishing between stronger and weaker varieties of empirical adequacy. A strongly adequate theory is one that explains the experiences of a physically situated observer. A formulation of quantum mechanics that provides such situated empirical adequacy also provides a particularly compelling response to the measurement problem. As a concrete example we consider how Bohmian mechanics explains the experience of a physically situated observer.

15.
In spite of the increasing attention that quantum chaos has received from physicists in recent times, when the subject is considered from a conceptual viewpoint the usual opinion is that there is some kind of conflict between quantum mechanics and chaos. In this paper we follow the program of Belot and Earman, who propose to analyze the problem of quantum chaos as a particular case of the classical limit of quantum mechanics. In particular, we address the problem on the basis of our account of the classical limit, which in turn is grounded on the self-induced approach to decoherence. This strategy allows us to identify the conditions that a quantum system must satisfy to lead to non-integrability and to mixing in the classical limit.

16.
In October 1924, The Physical Review, a relatively minor journal at the time, published a remarkable two-part paper by John H. Van Vleck, working in virtual isolation at the University of Minnesota. Using Bohr’s correspondence principle and Einstein’s quantum theory of radiation along with advanced techniques from classical mechanics, Van Vleck showed that quantum formulae for emission, absorption, and dispersion of radiation merge with their classical counterparts in the limit of high quantum numbers. For modern readers Van Vleck’s paper is much easier to follow than the famous paper by Kramers and Heisenberg on dispersion theory, which covers similar terrain and is widely credited with having led directly to Heisenberg’s Umdeutung paper. This makes Van Vleck’s paper extremely valuable for the reconstruction of the genesis of matrix mechanics. It also makes it tempting to ask why Van Vleck did not take the next step and develop matrix mechanics himself. This paper was written as part of a joint project in the history of quantum physics of the Max Planck Institut für Wissenschaftsgeschichte and the Fritz-Haber-Institut in Berlin.

17.
Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.

18.
19.
We argue against claims that the classical ħ → 0 limit is “singular” in a way that frustrates an eliminative reduction of classical to quantum physics. We show one precise sense in which quantum mechanics and scaling behavior can be used to recover classical mechanics exactly, without making prior reference to the classical theory. To do so, we use the tools of strict deformation quantization, which provides a rigorous way to capture the ħ → 0 limit. We then use the tools of category theory to demonstrate one way that this reduction is explanatory: it illustrates a sense in which the structure of quantum mechanics determines that of classical mechanics.

20.
Planck's change in attitude to the question of whether atomic hypotheses were scientifically accessible is discussed. It is argued, contra Holton, that Planck's change in attitude to this question did not signal a methodological shift towards realism. The point of doing this is not just to investigate a significant episode in the history of quantum theory, but also to use the episode as a case study in support of a broader historical thesis. This thesis is that there was a widespread late-nineteenth century methodological tradition which motivated the change in status of certain ontological claims — e.g., that atoms exist — from ‘inaccessible to science’ to ‘scientifically acceptable’, even though those claims were not strictly ‘observable’. This methodological tradition is a hybrid of positivist and realist views. Thus, contrary to one popular view, the fin de siècle triumph of atomism is not to be seen as a triumph for a realist view of science. Poincaré's views are also used as an illustration.
