Similar Documents
20 similar documents found.
1.
The main topics of this second part of a two-part essay are some consequences of the phenomenon of vacuum polarization as the most important physical manifestation of modular localization. Besides philosophically unexpected consequences, it has led to a new constructive “outside-inwards approach” in which the pointlike fields and the compactly localized operator algebras they generate emerge only from intersections of much simpler algebras localized in noncompact wedge regions, whose generators have extremely mild, almost free-field behavior. Another consequence of vacuum polarization presented in this essay is the localization entropy near a causal horizon, which follows a logarithmically modified area law in which a dimensionless area (the area divided by the square of dR, where dR is the thickness of a light-sheet) appears. There are arguments that this logarithmically modified area law corresponds to the volume law of standard heat-bath thermal behavior. We also explain the symmetry-enhancing effect of holographic projections onto the causal horizon of a region and show that the resulting infinite-dimensional symmetry groups contain the Bondi–Metzner–Sachs group. This essay is the second part of a partitioned longer paper.
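A schematic rendering of the kind of law described above may help; the overall constant and the argument of the logarithm are not specified in the abstract and appear here only as placeholders:

\[
  S_{\mathrm{loc}} \;\sim\; c \,\frac{A}{d_R^{2}}\,\bigl|\ln d_R\bigr| \qquad (d_R \to 0),
\]

where A is the area of the causal horizon, d_R the thickness of the light-sheet (so A/d_R^{2} is the dimensionless area), and c a dimensionless constant. The divergence as d_R shrinks is what the essay argues corresponds to the volume law of ordinary heat-bath entropy.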

2.
3.
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market extends far beyond the community of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.
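For readers unfamiliar with DFT, the following is the standard textbook Kohn-Sham scheme that made the theory computationally tractable (textbook background, not drawn from the paper itself); in atomic units:

\[
  E[n] \;=\; T_s[n] \;+\; \int v_{\mathrm{ext}}(\mathbf r)\, n(\mathbf r)\, d^3r
  \;+\; \tfrac{1}{2}\!\iint \frac{n(\mathbf r)\,n(\mathbf r')}{|\mathbf r-\mathbf r'|}\, d^3r\, d^3r'
  \;+\; E_{\mathrm{xc}}[n],
\]
\[
  \Bigl[-\tfrac{1}{2}\nabla^{2} + v_{\mathrm{eff}}(\mathbf r)\Bigr]\varphi_i(\mathbf r) \;=\; \varepsilon_i\,\varphi_i(\mathbf r),
  \qquad
  n(\mathbf r) \;=\; \sum_{i\,\in\,\mathrm{occ}} |\varphi_i(\mathbf r)|^{2}.
\]

All of the many-body difficulty is packed into the exchange-correlation functional E_xc[n]; it is the availability of workable approximations to E_xc, packaged in widely distributed codes, that arguably underlies the market-like organization described above.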

4.
Among the alternatives to non-relativistic quantum mechanics (NRQM) there are those that give different predictions than quantum mechanics in yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one's system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism responsible for the 'apparent' collapse in open quantum systems. But while recent advances in condensed-matter physics may lead in the near future to experimental setups that will allow one to test the two hypotheses, namely genuine collapse vs. decoherence, and hence make progress toward a solution to the quantum measurement problem, those philosophers and physicists who advocate an information-theoretic approach to the foundations of quantum mechanics are still unwilling to acknowledge the empirical character of the issue at stake. Here I argue that in doing so they are displaying an unwarranted double standard.

5.
Fritz London's seminal idea of “quantum mechanisms of macroscopic scale”, first articulated in 1946, was the unanticipated result of two decades of research, during which London pursued quantum-mechanical explanations of various kinds of systems of particles at different scales. He started at the microphysical scale with the hydrogen molecule, generalized his approach to chemical bonds and intermolecular forces, then turned to macrophysical systems like superconductors and superfluid helium. Along this path, he formulated a set of concepts—the quantum mechanism of exchange, the rigidity of the wave function, the role of quantum statistics in multi-particle systems, the possibility of order in momentum space—that eventually coalesced into a new conception of systems of equal particles. In particular, it was London's clarification of Bose-Einstein condensation that enabled him to formulate the notion of superfluids, and led him to the recognition that quantum mechanics was not, as it was commonly assumed, relevant exclusively as a micromechanics.

6.
This article presents a discussion of the notion of quantum ontological excess baggage, first proposed by Hardy. It is argued here that this idea does not have the significance suggested in his paper. It is shown that if this concept is properly analyzed, it fails to pose any threat to the ontic approach to quantum theory in general.

7.
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful however in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. 
However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.
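The horizon problem invoked above rests on a standard definition (given here as background, not as part of the paper's argument): the particle horizon at cosmic time t is

\[
  d_{\mathrm{hor}}(t) \;=\; a(t)\int_{0}^{t}\frac{dt'}{a(t')},
\]

which is finite for the decelerating radiation- and matter-dominated expansion of the standard model. Patches of the last-scattering surface separated by more than this distance at recombination share no common causal past, yet the observed CMB temperature is uniform to roughly one part in 10^5, which is why the uniformity is regarded as theoretically problematic without inflation or a lawlike initial condition.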

8.
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.

9.
I show explicitly how concerns about wave function collapse and ontology can be decoupled from the bulk of technical analysis necessary to recover localized, approximately Newtonian trajectories from quantum theory. In doing so, I demonstrate that the account of classical behavior provided by decoherence theory can be straightforwardly tailored to give accounts of classical behavior on multiple interpretations of quantum theory, including the Everett, de Broglie–Bohm and GRW interpretations. I further show that this interpretation-neutral, decoherence-based account conforms to a general view of inter-theoretic reduction in physics that I have elaborated elsewhere, which differs from the oversimplified picture that treats reduction as a matter of simply taking limits. This interpretation-neutral account rests on a general three-pronged strategy for reduction between quantum and classical theories that combines decoherence, an appropriate form of Ehrenfest's Theorem, and a decoherence-compatible mechanism for collapse. It also incorporates a novel argument as to why branch-relative trajectories should be approximately Newtonian, which is based on a little-discussed extension of Ehrenfest's Theorem to open systems, rather than on the more commonly cited but less germane closed-systems version. In the Conclusion, I briefly suggest how the strategy for quantum-classical reduction described here might be extended to reduction between other classical and quantum theories, including classical and quantum field theory and classical and quantum gravity.
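For reference, the familiar closed-systems form of Ehrenfest's Theorem that the abstract contrasts with its open-systems extension reads (standard textbook statement, not the paper's generalization):

\[
  \frac{d}{dt}\langle \hat x\rangle \;=\; \frac{\langle \hat p\rangle}{m},
  \qquad
  \frac{d}{dt}\langle \hat p\rangle \;=\; -\,\bigl\langle V'(\hat x)\bigr\rangle .
\]

Expectation values therefore follow Newtonian trajectories only insofar as ⟨V'(x̂)⟩ ≈ V'(⟨x̂⟩), i.e., for states that stay narrowly localized, which is precisely where decoherence and branch-relative structure enter the strategy described above.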

10.
The Quantum Hall Effects offer a rich variety of theoretical and experimental advances. They provide interesting insights on such topics as gauge invariance, strong interactions in Condensed Matter physics, and the emergence of new paradigms. This paper focuses on some related philosophical questions. Various brands of positivism or agnosticism are confronted with the physics of the Quantum Hall Effects. Hacking's views on Scientific Realism and Chalmers' on Non-Figurative Realism are discussed. It is argued that the difficulties with those versions of realism may be resolved within a dialectical materialist approach. The latter is argued to provide a rational approach to the phenomena, theory and ontology of the Quantum Hall Effects.
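As background for readers unfamiliar with the phenomenon (standard results, not specific to this paper), the empirical core of the Quantum Hall Effects is the quantization of the Hall conductance,

\[
  \sigma_{xy} \;=\; \nu\,\frac{e^{2}}{h},
  \qquad \nu \in \mathbb{Z} \;\;\text{(integer effect)}, \quad \nu = \tfrac{1}{3}, \tfrac{2}{5}, \ldots \;\;\text{(fractional effect)},
\]

with plateaus reproducible to roughly parts in 10^9 in the integer case; it is this combination of robustness, gauge-invariance arguments, and strongly interacting many-body physics that makes the effects a natural test case for the realist positions discussed above.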

11.
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time averages (albeit for a special class of systems, and up to a measure zero set of exceptions). Secondly, one argues that actual measurements of thermodynamic quantities yield time-averaged quantities, since measurements take a long time. The combination of these two points is held to be an explanation of why calculating microcanonical phase averages is a successful algorithm for predicting the values of thermodynamic observables. It is also well known that this account is problematic. This survey intends to show that ergodic theory nevertheless may have important roles to play, and it explores three other uses of ergodic theory. Particular attention is paid, firstly, to the relevance of specific interpretations of probability, and secondly, to the way in which the concern with systems in thermal equilibrium is translated into probabilistic language. With respect to the latter point, it is argued that equilibrium should not be represented as a stationary probability distribution as is standardly done; instead, a weaker definition is presented.
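The "first of all" point refers to the Birkhoff ergodic theorem, which in its standard form (textbook statement, stronger than the weaker equilibrium notion the survey goes on to propose) reads:

\[
  \lim_{T\to\infty}\frac{1}{T}\int_{0}^{T} f\bigl(\phi_t(x)\bigr)\,dt
  \;=\; \int_{\Gamma_E} f \, d\mu_{\mathrm{mc}}
  \qquad \text{for } \mu_{\mathrm{mc}}\text{-almost every } x ,
\]

valid when the flow φ_t on the energy hypersurface Γ_E is ergodic with respect to the microcanonical measure μ_mc. The restriction to ergodic systems and the measure-zero set of exceptional initial conditions are exactly the caveats flagged in the abstract.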

12.
13.
The apparent dichotomy between quantum jumps on the one hand, and continuous time evolution according to wave equations on the other hand, provided a challenge to Bohr's proposal of quantum jumps in atoms. Furthermore, Schrödinger's time-dependent equation also seemed to require a modification of the explanation for the origin of line spectra due to the apparent possibility of superpositions of energy eigenstates for different energy levels. Indeed, Schrödinger himself proposed a quantum beat mechanism for the generation of discrete line spectra from superpositions of eigenstates with different energies. However, these issues between old quantum theory and Schrödinger's wave mechanics were correctly resolved only after the development and full implementation of photon quantization. The second quantized scattering matrix formalism reconciles quantum jumps with continuous time evolution through the identification of quantum jumps with transitions between different sectors of Fock space. The continuous evolution of quantum states is then recognized as a sum over continually evolving jump amplitudes between different sectors in Fock space. In today's terminology, this suggests that linear combinations of scattering matrix elements are epistemic sums over ontic states. Insights from the resolution of the dichotomy between quantum jumps and continuous time evolution therefore hold important lessons for modern research both on interpretations of quantum mechanics and on the foundations of quantum computing. They demonstrate that discussions of interpretations of quantum theory necessarily need to take into account field quantization. They also demonstrate the limitations of the role of wave equations in quantum theory, and caution us that superpositions of quantum states for the formation of qubits may be more limited than usually expected.
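A standard textbook illustration of the picture sketched above, offered here only as an illustration rather than as the paper's own formalism, is the Wigner-Weisskopf treatment of spontaneous emission, in which the state evolves continuously while its support spreads across Fock-space sectors:

\[
  |\Psi(t)\rangle \;=\; c_e(t)\,|e;\,0\rangle \;+\; \int d^{3}k\; c_{g,\mathbf k}(t)\,|g;\,1_{\mathbf k}\rangle ,
\]

where |e; 0⟩ is the excited atom with no photon and |g; 1_k⟩ the ground-state atom plus one photon of momentum k. The "jump" is encoded in the continuous growth of the one-photon-sector amplitudes c_{g,k}(t) while |c_e(t)|^2 decays, in line with the identification of quantum jumps with transitions between Fock-space sectors.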

14.
Over the last three decades, string theory has emerged as one of the leading hopes for a consistent theory of quantum gravity that unifies particle physics with general relativity. Despite the fact that string theory has been a thriving research program for the better part of three decades, it has been subjected to extensive criticism from a number of prominent physicists. The aim of this paper is to obtain a clearer picture of where the conflict lies in competing assessments of string theory, through a close reading of the argumentative strategies employed by protagonists on both sides. Although it has become commonplace to construe this debate as stemming from different attitudes to the absence of testable predictions, we argue that this presents an overly simplified view of the controversy, which ignores the critical role of heuristic appraisal. While string theorists and their defenders see the theoretical achievements of the string theory program as providing strong indication that it is ‘on the right track’, critics have challenged such claims, by calling into question the status of certain ‘solved problems’ and its purported ‘explanatory coherence’. The debates over string theory are therefore particularly instructive from a philosophical point of view, not only because they offer important insights into the nature of heuristic appraisal and theoretical progress, but also because they raise deep questions about what constitutes a solved problem and an explanation in fundamental physics.

15.
Operational frameworks are very useful for studying the foundations of quantum mechanics, and are sometimes used to promote antirealist attitudes towards the theory. The aim of this paper is to review three arguments that aim to defend an antirealist reading of quantum physics based on various developments of standard quantum mechanics appealing to notions such as quantum information, non-causal correlations and indefinite causal orders. Those arguments will be discussed in order to show that they are not convincing. Instead, it is argued that there is conceptually no argument that could favour realist or antirealist attitudes towards quantum mechanics based solely on some features of some formalism. In particular, both realist and antirealist views can be well accommodated within operational formulations of the theory. The reason for this is that the realist/antirealist debate is located at a purely epistemic level, which is not engaged by formal aspects of theories. As such, operational formulations of quantum mechanics are epistemologically and ontologically neutral. This discussion aims at clarifying the limits of the historical and methodological affinities between scientific antirealism and operational physics while engaging with recent discoveries in quantum foundations. It also aims at presenting various realist strategies to account for those developments.

16.
17.
Ulrich Meyer’s book The Nature of Time uses tense logic to argue for a ‘modal’ view of time, which replaces substantial times (as in Newton’s Absolute Time) with ‘ersatz times’ constructed using conceptually basic tense operators. He also argues against Bertrand Russell’s relationist theory, in which times are classes of events, and against the idea that relativity compels the integration of time and space (called by Meyer the Inseparability Argument). I find fault with each of these negative arguments, as well as with Meyer’s purported reconstruction of empty spacetime from tense operators and substantial spatial points. I suggest that Meyer’s positive project is best conceived as an elimination of time in the mode of Julian Barbour's The End of Time.

18.
In this paper we make an empirical investigation of the relationship between the consistency, coherence and validity of probability judgements in a real-world forecasting context. Our results indicate that these measures of the adequacy of an individual's probability assessments are not closely related as we anticipated. Twenty-nine of our thirty-six subjects were better calibrated in point probabilities than in odds, and our subjects were, in general, more coherent using point probabilities than odds forecasts. Contrary to our expectations, we found very little difference in forecasting response and performance between simple and compound holistic forecasts. This result is evidence against the ‘divide-and-conquer’ rationale underlying most applications of normative decision theory. In addition, our recompositions of marginal and conditional assessments into compound forecasts were no better calibrated or resolved than their holistic counterparts. These findings convey two implications for forecasting. First, untrained judgemental forecasters should use point probabilities in preference to odds. Second, judgemental forecasts of complex compound probabilities may be as well assessed holistically as they are using methods of decomposition and recomposition. In addition, our study provides a paradigm for further studies of the relationship between consistency, coherence and validity in judgemental probability forecasting.
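For clarity, the recomposition mentioned above combines marginal and conditional assessments via the product rule, and calibration compares stated probabilities with observed relative frequencies; the formulas below are the standard definitions, not the study's specific scoring procedure:

\[
  \hat P(A \wedge B) \;=\; \hat P(B)\,\hat P(A \mid B),
  \qquad
  \text{calibration at level } p:\;
  \frac{\#\{\text{events assigned } \hat p \approx p \text{ that occurred}\}}{\#\{\text{events assigned } \hat p \approx p\}} \;\approx\; p .
\]

In this literature, coherence is usually taken to mean conformity to the probability axioms (e.g., the product rule above), consistency means agreement across response modes such as point probabilities versus odds, and validity is assessed empirically through calibration and resolution.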

19.
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—that we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” (Weyl, 1931a, p. xxi) in terms of a correspondence between the ontological categories of identity and determinateness.
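A minimal formal gloss on the "localization" and "linearization" of the Erlangen program described above (the notation is chosen here for illustration and is not taken from the paper): a Klein geometry is a pair (G, H) with homogeneous space X ≅ G/H and automorphism group G, whereas a gauge theory replaces the single group by fibrewise isomorphisms over spacetime,

\[
  \text{Klein:}\;\; X \cong G/H,\;\; \mathrm{Aut}(X)=G;
  \qquad
  \text{gauge:}\;\; P \to M \;\text{a principal } G\text{-bundle with connection } \omega \in \Omega^{1}(P,\mathfrak g),
\]

so that local gauge symmetries form a groupoid of isomorphisms between the identical fibres, and the Ehresmann connection ω is the corresponding gauge field. "Linearization" then refers to passing to unitary representations of the group, whose labels are the quantum numbers cited from Weyl.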

20.
With the Higgs boson discovery and no new physics found at the LHC, confidence in Naturalness as a guiding principle for particle physics is under increased pressure. We wait to see if it proves its mettle in the LHC upgrades ahead, and beyond. In the meantime, I present a justification a posteriori of the Naturalness criterion by suggesting that uncompromising application of the principle to Quantum Electrodynamics leads toward the Standard Model and Higgs boson without additional experimental input. Potential lessons for today and future theory building are commented upon.
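The contrast usually invoked in such discussions, offered here as textbook background rather than as the paper's own reconstruction, is between the electron self-energy in QED, which is only logarithmically sensitive to the cutoff because chiral symmetry protects the electron mass, and a scalar mass term, which is quadratically sensitive:

\[
  \delta m_e \;\propto\; \alpha\, m_e \ln\frac{\Lambda}{m_e},
  \qquad
  \delta m_H^{2} \;\propto\; \Lambda^{2}.
\]

An electron mass of its observed size is therefore natural up to enormous cutoffs, whereas keeping a light scalar natural seems to call for new structure near the electroweak scale; it is this expectation that the null results from the LHC, mentioned above, put under pressure.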
