Similar documents
20 similar documents found.
1.
Fritz London's seminal idea of “quantum mechanisms of macroscopic scale”, first articulated in 1946, was the unanticipated result of two decades of research, during which London pursued quantum-mechanical explanations of various kinds of systems of particles at different scales. He started at the microphysical scale with the hydrogen molecule, generalized his approach to chemical bonds and intermolecular forces, then turned to macrophysical systems like superconductors and superfluid helium. Along this path, he formulated a set of concepts—the quantum mechanism of exchange, the rigidity of the wave function, the role of quantum statistics in multi-particle systems, the possibility of order in momentum space—that eventually coalesced into a new conception of systems of equal particles. In particular, it was London's clarification of Bose-Einstein condensation that enabled him to formulate the notion of superfluids, and led him to the recognition that quantum mechanics was not, as it was commonly assumed, relevant exclusively as a micromechanics.

2.
The wave-mechanical treatment of the valence bond by Walter Heitler and Fritz London, and its ensuing foundational importance in quantum chemistry, has traditionally been regarded as the basis for the argument that chemistry may be theoretically reduced to physics. Modern analyses of the reductionist claim focus on the limitations to achieving full reduction in practice because of the approximations used in modern quantum chemical methods, but neglect the historical importance of the chemical bond as a chemical entity. This paper re-examines these arguments with a study of the development of the valence bond by the chemist Gilbert Lewis within a chemically autonomous framework, and its extension by Linus Pauling using Heitler and London's methods. Here, we see that the chemical bond is best described as a theoretical synthesis, or physico-chemical entity, a description that captures its full interdisciplinary importance from both the philosophical and historical perspectives.

3.
In this paper I examine the notion and role of metaphors and illustrations in Maxwell's works in exact science as a pathway into a broader and richer philosophical conception of a scientist and scientific practice. While some of these notions and methods are still at work in current scientific research—from economics and biology to quantum computation and quantum field theory—here I have chosen to attest to their entrenchment and complexity in actual science by attempting to make some conceptual sense of Maxwell's own usage; this endeavour includes situating Maxwell's conceptions and applications in his own culture of Victorian science and philosophy. I trace Maxwell's notions to the formulation of the problem of understanding, or interpreting, abstract representations such as potential functions and Lagrangian equations. I articulate the solution in terms of abstract-concrete relations, where the concrete, in tune with Victorian British psychology and engineering, includes the muscular as well as the pictorial. This sets the basis for a conception of understanding in terms of unification and concrete modelling, or representation. I examine the relation of illustration to analogies and metaphors on which this account rests. Lastly, I stress and explain the importance of context-dependence, its consequences for realism-instrumentalism debates, and Maxwell's own emphasis on method.

4.
5.
This paper examines Bub's interpretation of the foundational significance of the theorem of Clifton, Bub, and Halvorson (CBH), which characterizes quantum theories in terms of information-theoretic constraints. Bub argues that quantum theory must be re-conceived as a principle theory of information in which information is a new physical primitive, to the exclusion of hidden variable theories. I will argue, contrary to Bub, that the CBH theorem cannot be used to exclude hidden variable theories. Drawing inspiration from Bub, I sketch an alternative conception of quantum mechanics as a theory of information, but one which embraces all empirically equivalent quantum theories.

6.
This paper defends the deflationary character of two recent views regarding scientific representation, namely R.I.G. Hughes' DDI model and the inferential conception. It is first argued that these views' deflationism is akin to the homonymous position in discussions regarding the nature of truth. There, we are invited to consider the platitudes that the predicate “true” obeys at the level of practice, disregarding any deeper, or more substantive, account of its nature. More generally, for any concept X, a deflationary approach is then defined in opposition to a substantive approach, where a substantive approach to X is an analysis of X in terms of some property P, or relation R, accounting for and explaining the standard use of X. It then becomes possible to characterize a deflationary view of scientific representation in three distinct senses, namely: a “no-theory” view, a “minimalist” view, and a “use-based” view—in line with three standard deflationary responses in the philosophical literature on truth. It is then argued that both the DDI model and the inferential conception may be suitably understood in any of these three different senses. The application of these deflationary ‘hermeneutics’ moreover yields significant improvements on the DDI model, which bring it closer to the inferential conception. It is finally argued that what these approaches have in common—the key to any deflationary account of scientific representation—is the denial that scientific representation may be ultimately reduced to any substantive explanatory property of sources, or targets, or their relations.

7.
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over from centralized mainframe architectures. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market extends far beyond the community of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), arguably the pivotal theory in the turn to computational quantum chemistry around 1990.

8.
The fact that there exist in nature thoroughly deterministic systems whose future behavior cannot be predicted, no matter how advanced or finely tuned our cognitive and technical abilities turn out to be, has been well established over the last several decades, essentially in the light of two different theoretical frameworks, namely chaos theory and (some deterministic interpretations of) quantum mechanics. The prime objective of this paper is to show that there actually exists an alternative strategy for grounding the divorce between determinism and predictability, one that is older than—and conceptually independent from—chaos theory and quantum mechanics, and which has not received much attention in the recent philosophical literature on determinism. This forgotten strategy—embedded in the doctrine called “emergent evolutionism”—is nonetheless far from being a mere historical curiosity that should draw the attention of philosophers only out of a concern for comprehensiveness. It has indeed recently been revived in the work of respected scientists.

9.
An “intrinsically mixed” state is a mixed state of a system that is (in a sense to be elaborated) ‘orthogonal’ to every pure state of that system. Although the presence of such states in the quantum theories of infinite systems is well known to those who work with such theories, intrinsically mixed states are virtually unheralded in the philosophical literature. Rob Clifton was thoroughly familiar with intrinsically mixed states. I aim here to introduce them to a wider audience—and to encourage that audience to cultivate their acquaintance by suggesting that intrinsically mixed states undermine assumptions framing standard discussions of the quantum measurement problem.

10.
The paper seeks to make progress from stating primitive ontology theories of quantum physics—notably Bohmian mechanics, the GRW matter density theory and the GRW flash theory—to assessing these theories. Four criteria are set out: (a) internal coherence; (b) empirical adequacy; (c) relationship to other theories; and (d) explanatory value. The paper argues that the stock objections against these theories do not withstand scrutiny. Its focus then is on their explanatory value: they pursue different strategies to ground the textbook formalism of quantum mechanics, and they develop different explanations of quantum non-locality. In conclusion, it is argued that Bohmian mechanics offers a better prospect for making quantum non-locality intelligible than the GRW matter density theory and the GRW flash theory.

11.
Baker (2011) argues that broken symmetries pose a number of puzzles for the interpretation of quantum theories—puzzles which he claims do not arise in classical theories. I provide examples of classical cases of symmetry breaking and show that they have precisely the same features that Baker finds puzzling in quantum theories. To the extent that Baker is correct that the classical cases pose no puzzles, the features of the quantum case that Baker highlights should not be puzzling either.

12.
Proposed by Einstein, Podolsky, and Rosen (EPR) in 1935, the entangled state has played a central part in exploring the foundation of quantum mechanics. At the end of the twentieth century, however, some physicists and mathematicians set aside the epistemological debates associated with EPR and turned it from a philosophical puzzle into a practical resource for information processing. This paper examines the origin of what is known as quantum information. Scientists had long considered building quantum computers and employing entanglement in communications. But the real breakthrough only occurred in the 1980s, when they shifted focus from general-purpose systems such as Turing machines to algorithms and protocols that solved particular problems, including quantum factorization, quantum search, superdense coding, and teleportation. Key to their development were two groups of mathematical manipulations and deformations of entanglement—quantum parallelism and ‘feedback EPR’—that served as conceptual templates. The early success of quantum parallelism and feedback EPR was owed to the idealized formalism of entanglement that researchers had prepared for philosophical discussions. Yet such idealization is difficult to maintain when the physical implementation of quantum information processors is at stake. A major challenge for today's quantum information scientists and engineers is thus to move from Einstein et al.'s well-defined scenarios to realistic models.

13.
Despite remarkable efforts, it remains notoriously difficult to equip quantum theory with a coherent ontology. Hence, Healey (2017, 12) has recently suggested that “quantum theory has no physical ontology and states no facts about physical objects or events”, and Fuchs et al. (2014, 752) similarly hold that “quantum mechanics itself does not deal directly with the objective world”. While intriguing, these positions either raise the question of how talk of ‘physical reality’ can even remain meaningful, or they must ultimately embrace a hidden-variables view, in tension with their original project. I here offer a neo-Kantian alternative. In particular, I will show how constitutive elements in the sense of Reichenbach (1920) and Friedman (1999, 2001) can be identified within quantum theory, through considerations of symmetries that allow the constitution of a ‘quantum reality’, without invoking any notion of a radically mind-independent reality. The resulting conception will inherit elements from pragmatist and ‘QBist’ approaches, but also differ from them in crucial respects. Furthermore, going beyond the Friedmanian program, I will show how non-fundamental and approximate symmetries can be relevant for identifying constitutive principles.

14.
The simplest case of quantum field theory on curved spacetime—that of the Klein–Gordon field on a globally hyperbolic spacetime—reveals a dilemma: In generic circumstances, either there is no dynamics for this quantum field, or else there is a dynamics that is not unitarily implementable. We do not try to resolve the dilemma here, but endeavour to spell out the consequences of seizing one or the other horn of the dilemma.

15.
The ontological model framework provides a rigorous approach to addressing the question of whether the quantum state is ontic or epistemic. When only conventional projective measurements are considered, auxiliary assumptions are always needed to prove the reality of the quantum state within the framework. For example, the Pusey–Barrett–Rudolph theorem is based on an additional preparation independence assumption. In this paper, we give a new proof of ψ-ontology in terms of protective measurements in the ontological model framework. The proof does not rely on auxiliary assumptions, and it also applies to deterministic theories such as the de Broglie–Bohm theory. In addition, we give a simpler argument for ψ-ontology beyond the framework, based on protective measurements and a weaker criterion of reality. The argument may also be appealing to those who favor an anti-realist view of quantum mechanics.

16.
17.
Many consider the apparent disappearance of time and change in quantum gravity to be the main metaphysical challenge, since it seems to lead to a form of Parmenidean view according to which the physical world simply is: nothing changes, moves, becomes, or happens. In this paper, I argue that the main metaphysical challenge posed by Rovelli’s philosophical view of loop quantum gravity is that it leads to exactly the opposite view, namely a form of Heraclitean view, or rather a radical process metaphysics, according to which there is becoming (process, change, event) but not being (substance, stasis, thing). However, this does not entail that time is real; fundamentally, time does not exist. I show how Rovelli’s understanding of loop quantum gravity supports the view that there is change without time, so that the physical world can be timeless yet ever-changing. I conclude by arguing that it is this process-oriented conception that constitutes the revolutionary metaphysical challenge and philosophical significance of loop quantum gravity, while the alleged Parmenidean view turns out to be nothing but the endpoint of a long-standing metaphysical orthodoxy.

18.
According to Zurek, decoherence is a process resulting from the interaction between a quantum system and its environment; this process singles out a preferred set of states, usually called the “pointer basis”, that determines which observables will receive definite values. This means that decoherence leads to a sort of selection which precludes all except a small subset of the states in the Hilbert space of the system from behaving in a classical manner: environment-induced superselection—einselection—is a consequence of the process of decoherence. The aim of this paper is to present a new approach to decoherence, different from the mainstream approach of Zurek and his collaborators. We will argue that this approach offers conceptual advantages over the traditional one when problems of foundations are considered; in particular, from the new perspective, decoherence in closed quantum systems becomes possible and the preferred basis acquires a well-founded definition.

19.
Spin is typically thought to be a fundamental property of the electron and other elementary particles. Although it is defined as an internal angular momentum, much of our understanding of it is bound up with the mathematics of group theory. This paper traces the development of the concept of spin, paying particular attention to the way that quantum mechanics has influenced its interpretation in both theoretical and experimental contexts. The received view is that electron spin was discovered experimentally by Stern and Gerlach in 1921, five years prior to its theoretical formulation by Goudsmit and Uhlenbeck. However, neither Goudsmit nor Uhlenbeck, nor any of the others involved in the debate about spin, cited the Stern–Gerlach experiment as corroborating evidence. In fact, Bohr and Pauli were emphatic that the spin of a single electron could not be measured in classical experiments. In recent years, experiments designed to refute the Bohr–Pauli thesis and measure electron spin have been carried out. However, a number of ambiguities surround these results—ambiguities that relate not only to the measurements themselves but to the interpretation of the experiments. After discussing these various issues, I raise some philosophical questions about the ontological and epistemic status of spin. Because it is a curious hybrid of the mathematical and the physical, these questions are relatively complex, and while I do not pretend to have answered them here, the goal of the paper is to uncover and isolate how spin presents problems for traditional realism and to illustrate the power that theories like quantum mechanics have for shaping both philosophical questions and answers.

20.
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. In particular, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs (1983, “Quantum logic, realism, and value-definiteness,” Philosophy of Science, 50, 578), which applies the Kochen–Specker theorem to an entangled bipartite system.
