Similar articles
1.
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. Particularly, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs [(1983). Quantum logic, realism, and value-definiteness. Philosophy of Science, 50, 578], which applies the Kochen–Specker theorem to an entangled bipartite system.

2.
The paper argues that the formulation of quantum mechanics proposed by Ghirardi, Rimini and Weber (GRW) is a serious candidate for being a fundamental physical theory and explores its ontological commitments from this perspective. In particular, we propose to conceive of spatial superpositions of non-massless microsystems as dispositions or powers, more precisely propensities, to generate spontaneous localizations. We set out five reasons for this view, namely that (1) it provides for a clear sense in which quantum systems in entangled states possess properties even in the absence of definite values; (2) it vindicates objective, single-case probabilities; (3) it yields a clear transition from quantum to classical properties; (4) it enables one to draw a clear distinction between purely mathematical and physical structures, and (5) it grounds the arrow of time in the time-irreversible manifestation of the propensities to localize.

3.
A theorem due to Geroch and Jang (1975) provides a sense in which the geodesic principle has the status of a theorem in General Relativity. I have recently shown that a similar theorem holds in the context of geometrized Newtonian gravitation (Newton–Cartan theory) (Weatherall, J.O., 2011). Here I compare the interpretations of these two theorems. I argue that despite some apparent differences between the theorems, the status of the geodesic principle in geometrized Newtonian gravitation is, mutatis mutandis, strikingly similar to the relativistic case.

4.
5.
The basic notion of an objective probability is that of a probability determined by the physical structure of the world. On this understanding, there are subjective credences that do not correspond to objective probabilities, such as credences concerning rival physical theories. The main question for objective probabilities is how they are determined by the physical structure. In this paper, I survey three ways of understanding objective probability: stochastic dynamics, Humean chances, and deterministic chances (typicality). The first is the obvious way to understand the probabilities of quantum mechanics via a collapse theory such as GRW; the last is the way to understand the probabilities in the context of a deterministic theory such as Bohmian mechanics. Humean chances provide a more abstract and general account of chance locutions that is independent of dynamical considerations.

6.
A relation is obtained between weak values of quantum observables and the consistency criterion for histories of quantum events. It is shown that “strange” weak values for projection operators (such as values less than zero) always correspond to inconsistent families of histories. It is argued that using the ABL rule to obtain probabilities for counterfactual measurements corresponding to those strange weak values gives inconsistent results. This problem is shown to be remedied by using the conditional weight, or pseudo-probability, obtained from the multiple-time application of Lüders’ Rule. It is argued that an assumption of reverse causality (a form of time symmetry) implies that weak values obtain, in a restricted sense, at the time of the weak measurement as well as at the time of post-selection. Finally, it is argued that weak values are more appropriately characterized as multiple-time amplitudes than expectation values, and as such can have little to say about counterfactual questions.

7.
We discuss the meaning of probabilities in the many worlds interpretation of quantum mechanics. We start by presenting very briefly the many worlds theory, how the problem of probability arises, and some unsuccessful attempts to solve it in the past. Then we criticize a recent attempt by Deutsch to derive the quantum mechanical probabilities from the non-probabilistic parts of quantum mechanics and classical decision theory. We further argue that the Born probability does not make sense even as an additional probability rule in the many worlds theory. Our conclusion is that the many worlds theory fails to account for the probabilistic statements of standard (collapse) quantum mechanics.

8.
John Norton's The Material Theory of Induction bristles with fresh insights and provocative ideas that provide a much needed stimulus to a stodgy if not moribund field. I use quantum mechanics (QM) as a medium for exploring some of these ideas. First, I note that QM offers more predictability than Newtonian mechanics for the Norton dome and other cases where classical determinism falters. But this ability of QM to partially cure the ills of classical determinism depends on facts about the quantum Hamiltonian operator that vary from case to case, providing an illustration of Norton's theme of the importance of contingent facts for inductive reasoning. Second, I agree with Norton that Bayesianism as developed for classical probability theory does not constitute a universal inference machine, and I use QM to explain the sense in which this is so. But at the same time I defend a brand of quantum Bayesianism as providing an illuminating account of how physicists reason about quantum events. Third, I argue that if the probabilities induced by quantum states are regarded as objective chances then there are strong reasons to think that fair infinite lotteries are impossible in a quantum world.

9.
GRW theory postulates a stochastic mechanism assuring that every so often the wave function of a quantum system is ‘hit’, which leaves it in a localised state. How are we to interpret the probabilities built into this mechanism? GRW theory is a firmly realist proposal and it is therefore clear that these probabilities are objective probabilities (i.e. chances). A discussion of the major theories of chance leads us to the conclusion that GRW probabilities can be understood only as either single case propensities or Humean objective chances. Although single case propensities have some intuitive appeal in the context of GRW theory, on balance it seems that Humean objective chances are preferable on conceptual grounds.

10.
In previous work, a non-standard theory of probability was formulated and used to systematize interference effects involving the simplest type of quantum systems. The main result here is a self-contained, non-trivial generalization of that theory to capture interference effects involving a much broader range of quantum systems. The discussion also focuses on interpretive matters having to do with the actual/virtual distinction, non-locality, and conditional probabilities.

11.
In this paper I critically evaluate the justification of the von Neumann–Lüders projection postulate for state changes in projective measurement contexts from the objective quantum Bayesian perspective. I point out that the justification provided so far for the von Neumann–Lüders projection postulate is insufficient. I argue that the best way to correct this problem is to make an assumption, Benign Realism, which is contradictory to the objective quantum Bayesian interpretation of quantum states.

12.
Earman (2018) has recently argued that the Principal Principle, a principle of rationality connecting objective chance and credence, is a theorem of quantum probability theory. This paper critiques Earman's argument, while also offering a positive proposal for how to understand the status of the Principal Principle in quantum probability theory.

13.
I outline an argument for a subjective Bayesian interpretation of quantum probabilities as degrees of belief distributed subject to consistency constraints on a quantum rather than a classical event space. I show that the projection postulate of quantum mechanics can be understood as a noncommutative generalization of the classical Bayesian rule for updating an initial probability distribution on new information, and I contrast the Bayesian interpretation of quantum probabilities sketched here with an alternative approach defended by Chris Fuchs.

14.
In this paper we investigate the feasibility of algorithmically deriving precise probability forecasts from imprecise forecasts. We provide an empirical evaluation of precise probabilities that have been derived from two types of imprecise probability forecasts: probability intervals and probability intervals with second-order probability distributions. The minimum cross-entropy (MCE) principle is applied to the former to derive precise (i.e. additive) probabilities; expectation (EX) is used to derive precise probabilities in the latter case. Probability intervals that were constructed without second-order probabilities tended to be narrower than and contained in those that were amplified by second-order probabilities. Evidence that this narrowness is due to motivational bias is presented. Analysis of forecasters' mean Probability Scores for the derived precise probabilities indicates that it is possible to derive precise forecasts whose external correspondence is as good as that of directly assessed precise probability forecasts. The forecasts of the EX method, however, are more like the directly assessed precise forecasts than those of the MCE method.
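The two derivation procedures described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's actual algorithm: it assumes a uniform reference distribution, in which case minimum cross-entropy reduces to maximum entropy and the solution clips a common constant c into each interval, with c chosen so the result sums to 1. The function names and the consistency assumption (sum of lower bounds ≤ 1 ≤ sum of upper bounds) are illustrative.

```python
def mce_precise(intervals, tol=1e-10):
    """Derive additive probabilities from probability intervals by minimum
    cross-entropy relative to a uniform reference distribution.

    With a uniform reference, the KKT conditions give p_i = clip(c, lo_i, hi_i)
    for a common constant c chosen so the p_i sum to 1; c is found by bisection.
    Assumes the intervals are consistent: sum(lo) <= 1 <= sum(hi).
    """
    clip = lambda c, lo, hi: max(lo, min(c, hi))
    total = lambda c: sum(clip(c, lo, hi) for lo, hi in intervals)
    a, b = 0.0, 1.0  # the constant c always lies in [0, 1]
    while b - a > tol:
        c = (a + b) / 2
        if total(c) < 1.0:
            a = c
        else:
            b = c
    c = (a + b) / 2
    return [clip(c, lo, hi) for lo, hi in intervals]


def ex_precise(second_order):
    """Derive a precise probability from a second-order distribution, given as
    (probability value, weight) pairs: take the expectation (EX method)."""
    return sum(p * w for p, w in second_order)
```

For example, the intervals [0.1, 0.4], [0.2, 0.5], [0.3, 0.6] yield the uniform distribution (1/3, 1/3, 1/3), since 1/3 lies inside every interval.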

15.
We defend the many-worlds interpretation of quantum mechanics (MWI) against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer's self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world's amplitude. We provide a proof of this postulate, which assumes the quantum formalism and two principles concerning symmetry and locality. We also show how a structurally similar proof of the Born rule is available for collapse theories. We conclude by comparing our account to the recent account offered by Sebens and Carroll.

16.
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule pk=|ψk|2 from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion that classical information is quantum information that can proliferate in the environment, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113]—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision theoretic approach to quantum probability.

17.
It is generally thought that determinism is incompatible with objective chances for particular events that differ from 1 and 0. However, there are important scientific theories whose laws are deterministic but which also assign non-trivial probabilities to events. The most important of these is statistical mechanics, whose probabilities are essential to the explanations of thermodynamic phenomena. These probabilities are often construed as ‘ignorance’ probabilities representing our lack of knowledge concerning the microstate. I argue that this construal is incompatible with the role of probability in explanation and laws. This is the ‘paradox of deterministic probabilities’. After surveying the usual list of accounts of objective chance and finding them inadequate, I argue that an account of chance sketched by David Lewis can be modified to solve the paradox of deterministic probabilities and provide an adequate account of the probabilities in deterministic theories like statistical mechanics.

18.
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time averages (albeit for a special class of systems, and up to a measure zero set of exceptions). Secondly, one argues that actual measurements of thermodynamic quantities yield time averaged quantities, since measurements take a long time. The combination of these two points is held to be an explanation why calculating microcanonical phase averages is a successful algorithm for predicting the values of thermodynamic observables. It is also well known that this account is problematic. This survey intends to show that ergodic theory nevertheless may have important roles to play, and it explores three other uses of ergodic theory. Particular attention is paid, firstly, to the relevance of specific interpretations of probability, and secondly, to the way in which the concern with systems in thermal equilibrium is translated into probabilistic language. With respect to the latter point, it is argued that equilibrium should not be represented as a stationary probability distribution as is standardly done; instead, a weaker definition is presented.

19.
In his response to my (2010), Ian Kidd claims that my argument against Stump’s interpretation of Duhem’s concept of ‘good sense’ is unsound because it ignores an important distinction within virtue epistemology. In light of the distinction between reliabilist and responsibilist virtue epistemology, Kidd argues that Duhem can be seen as supporting the latter, which he further illustrates with a discussion of Duhem’s argument against ‘perfect theory’. I argue that no substantive argument is offered to show that the distinction is relevant and can establish that Duhem’s ‘good sense’ can be understood within responsibilist virtue epistemology. I furthermore demonstrate that Kidd’s attempt to support his contention relies on a crucial misreading of Duhem’s general philosophy of science, and in doing so highlight the importance of understanding ‘good sense’ in its original context, that of theory choice.

20.
Probabilistic forecasts have good ‘external correspondence’ if events that are assigned probabilities close to 1 tend to occur frequently, whereas those assigned probabilities near 0 tend to occur rarely. This paper describes simple procedures for analysing external correspondence into meaningful components that might guide efforts to understand and improve forecasting performance. The procedures focus on differences between the judgements made by the forecaster when the target event occurs, as compared to when it does not. The illustrations involve a professional oddsmaker's predictions of baseball game outcomes, meteorologists' precipitation forecasts and physicians' diagnoses of pneumonia. The illustrations demonstrate the ability of the procedures to highlight important forecasting tendencies that are sometimes more difficult to discern by other means.
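The kind of analysis described, comparing judgements on occasions when the event occurs against occasions when it does not, can be illustrated with a minimal sketch; the paper's specific components are not reproduced here, and the function name is illustrative. The sketch computes the mean probability (Brier) score together with the difference between the mean forecast on occurrences and on non-occurrences, a simple index of how well the judgements discriminate.

```python
def forecast_components(forecasts, outcomes):
    """Summarize external correspondence of probability forecasts.

    Returns the mean probability (Brier) score and the 'slope': the mean
    forecast on occasions when the event occurred (outcome 1) minus the mean
    forecast when it did not (outcome 0). A well-corresponding forecaster
    has a low score and a large positive slope.
    """
    n = len(forecasts)
    brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / n
    hits = [f for f, o in zip(forecasts, outcomes) if o == 1]
    misses = [f for f, o in zip(forecasts, outcomes) if o == 0]
    slope = sum(hits) / len(hits) - sum(misses) / len(misses)
    return brier, slope
```

A difference of conditional mean judgements of this kind also figures in covariance-style decompositions of the probability score, which is why it is a natural first component to examine.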

