Found 20 similar documents (search time: 31 ms)
1.
The application of analytic continuation in quantum field theory (QFT) is juxtaposed with T-duality and mirror symmetry in string theory. Analytic continuation—a mathematical transformation that takes the time variable t to negative imaginary time −it—was initially used as a technique for evaluating perturbative Feynman diagrams, and subsequently became the basis for the Euclidean approaches within mainstream QFT (e.g., Wilsonian renormalization group methods, lattice gauge theories) and for the Euclidean field theory program for rigorously constructing non-perturbative models of interacting QFTs. A crucial difference between theories related by duality transformations and those related by analytic continuation is that the former are judged to be physically equivalent while the latter are regarded as physically inequivalent. There are other similarities between the two cases that make comparing and contrasting them a useful exercise for clarifying the type of argument needed to support the conclusion that dual theories are physically equivalent. In particular, T-duality and analytic continuation in QFT share the criterion for predictive equivalence that two theories agree on the complete set of expectation values and the mass spectra, and the criterion for formal equivalence that there is a “translation manual” between the physically significant algebras of observables and sets of states in the two theories. The analytic continuation case study illustrates how predictive and formal equivalence are compatible with physical inequivalence, but not in the manner of standard underdetermination cases. Arguments for the physical equivalence of dual theories must cite considerations beyond predictive and formal equivalence. The analytic continuation case study is an instance of the strategy of developing a physical theory by extending the formal or mathematical equivalence with another physical theory as far as possible. That this strategy has resulted in developments in pure mathematics as well as theoretical physics is another feature the case study shares with dualities in string theory.
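To fix what the transformation does (a standard textbook sketch, not drawn from the paper itself): the Wick rotation sends real time to negative imaginary time, turning the oscillatory Minkowski path-integral weight into a damped Euclidean one. For a free scalar field,

$$ t \to -i\tau \quad (\tau \in \mathbb{R}), \qquad e^{iS_M[\phi]} \to e^{-S_E[\phi]}, \qquad S_E[\phi] = \int d\tau\, d^3x \left[ \tfrac{1}{2}(\partial_\tau \phi)^2 + \tfrac{1}{2}(\nabla \phi)^2 + \tfrac{1}{2} m^2 \phi^2 \right]. $$

The Euclidean (Schwinger) correlation functions are then the analytic continuations of the Minkowski (Wightman) functions, which is the sense in which the two theories agree on expectation values and mass spectra while describing different spacetime signatures.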
2.
Over the last three decades, string theory has emerged as one of the leading hopes for a consistent theory of quantum gravity that unifies particle physics with general relativity. Yet despite its status as a thriving research program, it has been subjected to extensive criticism from a number of prominent physicists. The aim of this paper is to obtain a clearer picture of where the conflict lies in competing assessments of string theory, through a close reading of the argumentative strategies employed by protagonists on both sides. Although it has become commonplace to construe this debate as stemming from different attitudes to the absence of testable predictions, we argue that this presents an overly simplified view of the controversy, one which ignores the critical role of heuristic appraisal. While string theorists and their defenders see the theoretical achievements of the string theory program as a strong indication that it is ‘on the right track’, critics have challenged such claims by calling into question the status of certain ‘solved problems’ and the program's purported ‘explanatory coherence’. The debates over string theory are therefore particularly instructive from a philosophical point of view, not only because they offer important insights into the nature of heuristic appraisal and theoretical progress, but also because they raise deep questions about what constitutes a solved problem and an explanation in fundamental physics.
3.
Duality, the equivalence between seemingly distinct quantum systems, is a curious property that has been known for at least three quarters of a century. In the past two decades it has played a central role in mapping out the structure of theoretical physics. I discuss the unexpected connections that have been revealed among quantum field theories and string theories. Written for a special issue of Studies in History and Philosophy of Modern Physics.
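One concrete example of such an equivalence (standard closed-string material, supplied here for illustration rather than quoted from the article): compactifying a bosonic closed string on a circle of radius R gives the mass spectrum

$$ M^2 = \left(\frac{n}{R}\right)^2 + \left(\frac{w R}{\alpha'}\right)^2 + \frac{2}{\alpha'}\left(N + \tilde{N} - 2\right), $$

which is invariant under the T-duality map R → α'/R with momentum and winding numbers exchanged, n ↔ w. Two seemingly distinct backgrounds thus yield identical physics.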
4.
Alan Forrester, Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 2007, 38(4): 815–831
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|^2 from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities, on the grounds that it courts circularity. Deutsch and Wallace assume that the many-worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401; and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113], that classical information is quantum information that can proliferate in the environment—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision-theoretic approach to quantum probability.
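For reference, the rule whose non-circular derivation is at issue (standard quantum mechanics, stated here for convenience): for a pure state |ψ⟩ expanded in a basis {|k⟩}, and for a mixed state ρ,

$$ p_k = \left|\langle k \mid \psi \rangle\right|^2 = |\psi_k|^2, \qquad p_k = \operatorname{Tr}(\rho\, P_k), \quad P_k = |k\rangle\langle k|. $$

The circularity worry is that the partial trace used in decoherence arguments already presupposes the first of these equations.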
5.
This paper deals with Hobbes's theory of optical images, developed in his optical magnum opus, ‘A Minute or First Draught of the Optiques’ (1646), and published in abridged version in De homine (1658). The paper suggests that Hobbes's theory of vision and images serves to ground his philosophy of man on his philosophy of body. Furthermore, since this part of Hobbes's work on optics is the most thoroughly geometrical, it reveals a good deal about the role of mathematics in Hobbes's philosophy. The paper points to some difficulties in the thesis of Shapin and Schaffer, who presented geometry as a ‘paradigm’ for Hobbes's natural philosophy. It will be argued here that Hobbes's application of geometry to optics was dictated by his metaphysical and epistemological principles, not by a blind belief in the power of geometry. Geometry supported causal explanation, and assisted reason in making sense of appearances by helping the philosopher understand the relationships between the world outside us and the images it produces in us. Finally, the paper broadly suggests how Hobbes's theory of images may have triggered, by negative example, the flourishing of geometrical optics in Restoration England.
6.
7.
8.
The universal acceptance of atomism in physics and chemistry in the early 20th century went along with an altered view of the epistemic status of microphysical conjectures. Contrary to the prevalent understanding during the 19th century, on the new view unobservable objects could be ‘discovered’. It is argued in the present paper that this shift can be connected to the implicit integration of elements of meta-empirical theory assessment into the concept of theory confirmation.
9.
The recent discovery of the Higgs boson at 125 GeV by the ATLAS and CMS experiments at the LHC has put significant pressure on a principle that has guided much theorizing in high-energy physics over the last 40 years: the principle of naturalness. In this paper, I provide an explication of the conceptual foundations and physical significance of the naturalness principle. I argue that the naturalness principle is well grounded both empirically and in the theoretical structure of effective field theories, and that it was reasonable for physicists to endorse it. Its possible failure to be realized in nature, as suggested by recent LHC data, thus represents an empirical challenge to certain foundational aspects of our understanding of QFT. In particular, I argue that its failure would undermine one class of recent proposals which claim that QFT provides us with a picture of the world as being structured into quasi-autonomous physical domains.
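Schematically, the pressure on naturalness can be stated as follows (a textbook illustration, not the paper's own formulation): in an effective field theory with cutoff Λ, radiative corrections drag the Higgs mass toward the cutoff,

$$ m_H^2 = m_{H,0}^2 + \delta m_H^2, \qquad \delta m_H^2 \sim \frac{g^2}{16\pi^2}\,\Lambda^2, $$

so that for Λ near the Planck scale the bare parameter and the loop correction must cancel to roughly thirty decimal places to leave m_H ≈ 125 GeV. Naturalness is the demand that no such delicate cancellation occur.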
10.
We present the fractional quantum Hall (FQH) effect as a candidate emergent phenomenon. Unlike some other putative cases of condensed matter emergence (such as thermal phase transitions), the FQH effect is not based on symmetry breaking. Instead, FQH states are part of a distinct class of ordered matter that is defined topologically. Topologically ordered states result from complex long-ranged correlations between their constituent parts, such that the system displays strongly irreducible, qualitatively novel properties.
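For orientation (standard condensed-matter background, assumed here rather than taken from the paper): FQH states show a Hall conductance quantized at fractional filling ν, and the ν = 1/m plateaus are described by the Laughlin wavefunction,

$$ \sigma_{xy} = \nu\,\frac{e^2}{h}, \qquad \Psi_m(z_1,\dots,z_N) = \prod_{i<j} (z_i - z_j)^m\, \exp\!\left(-\sum_i \frac{|z_i|^2}{4\ell_B^2}\right), $$

with m an odd integer, z_i the complex planar coordinates of the electrons, and ℓ_B the magnetic length. It is the long-ranged correlations encoded in such states, rather than any broken symmetry, that constitute topological order.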
11.
S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent; in particular, it seems ambiguous between two readings: that S-dualities demand an anti-realist interpretation of fundamentality talk, and that they demand merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.
12.
The view that the fundamental kind properties are intrinsic properties enjoys reflexive endorsement by most metaphysicians of science. But ontic structural realists deny that there are any fundamental intrinsic properties at all. Given that structuralists distrust intuition as a guide to truth, and given that we currently lack a fundamental physical theory that we could consult instead in order to settle the issue, it might seem as if there is simply nowhere for this debate to go at present. However, I will argue that there exists an as-yet untapped resource for arguing for ontic structuralism – namely, the way that fundamentality is conceptualized in our most fundamental physical frameworks. By arguing that physical objects must be subject to the ‘Goldilocks principle’ if they are to count as fundamental at all, I argue that we can no longer view the majority of properties defining them as intrinsic. As such, ontic structural realism can be regarded as the most promising metaphysics for fundamental physics, and this is so even though we do not yet claim to know precisely what that fundamental physics is.
13.
We outline a framework for analyzing episodes from the history of science in which the application of mathematics plays a constitutive role in the conceptual development of empirical sciences. Our starting point is the inferential conception of the application of mathematics, recently advanced by Bueno and Colyvan (2011). We identify and discuss some systematic problems of this approach. We propose refinements of the inferential conception based on theoretical considerations and on a historical case study. We demonstrate the usefulness of the refined, dynamical inferential conception using the well-researched example of the genesis of general relativity. Specifically, we look at the collaboration of the physicist Einstein and the mathematician Grossmann in the years 1912–1913, which resulted in the jointly published “Outline of a Generalized Theory of Relativity and a Theory of Gravitation,” a precursor theory of the final theory of general relativity. In this episode, independently developed mathematical theories, the theory of differential invariants and the absolute differential calculus, were applied in the process of finding a relativistic theory of gravitation. The dynamical inferential conception not only provides a natural framework to describe and analyze this episode, but it also generates new questions and insights. We comment on the mathematical tradition on which Grossmann drew, and on his own contributions to mathematical theorizing. The dynamical inferential conception allows us to identify both the role of heuristics and of mathematical resources as well as the systematic role of problems and mistakes in the reconstruction of episodes of conceptual innovation and theory change.
14.
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is any guide, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful, however, in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, and (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance, and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.
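As background to the horizon problem mentioned here (standard cosmology, not drawn from the paper): the comoving particle horizon at time t is

$$ \chi_{\mathrm{hor}}(t) = \int_0^t \frac{dt'}{a(t')}, $$

and in decelerating FRW expansion this integral converges, so regions of the last-scattering surface separated by more than about a degree on the sky were never in causal contact. Yet the CMB temperature is uniform across the whole sky to roughly one part in 10^5, which is what makes the uniformity theoretically puzzling.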
15.
This article traces the origins of Kenneth Wilson's conception of effective field theories (EFTs) in the 1960s. I argue that what really made the difference in Wilson's path to his first prototype of EFT were his long-standing pragmatic aspirations and methodological commitments. Wilson's primary interest was to work on mathematically interesting physical problems, and he thought that progress could be made by treating them as if they could be analyzed in principle by a sufficiently powerful computer. The first point explains why he had no qualms about twisting the structure of field theories; the second why he divided the state-space of a toy model field theory into continuous slices by following a standard divide-and-conquer algorithmic strategy instead of working directly with a fully discretized and finite theory. I also show how Wilson's prototype bears the mark of these aspirations and commitments, and clear up a few striking ironies along the way.
16.
Theodore Christidis, Yorgos Goudaroulis, Maria Mikou, Archive for History of Exact Sciences, 1987, 37(2): 183–191
During the first twenty-four years after the discovery of superconductivity, many attempts to derive an adequate theory failed, mainly because the problem was not formulated quite correctly. In this paper we investigate certain questions related to the heuristic role of mathematics in the appropriate formulation of the problem that had to be solved, and in the development of a theory that was hindered by theoretical superstitions.
17.
Mary Domski, Studies in History and Philosophy of Science, 2009, 40(2): 119–130
I argue for an interpretation of the connection between Descartes’ early mathematics and metaphysics that centers on the standard of geometrical intelligibility that characterizes Descartes’ mathematical work during the period 1619 to 1637. This approach remains sensitive to the innovations of Descartes’ system of geometry and, I claim, sheds important light on the relationship between his landmark Geometry (1637) and his first metaphysics of nature, which is presented in Le monde (1633). In particular, I argue that the same standard of clear and distinct motions for construction that allows Descartes to distinguish ‘geometric’ from ‘imaginary’ curves in the domain of mathematics is adopted in Le monde as Descartes details God’s construction of nature. I also show how, on this interpretation, the metaphysics of Le monde can fruitfully be brought to bear on Descartes’ attempted solution to the Pappus problem, which he presents in Book I of the Geometry. My general goal is to show that attention to the standard of intelligibility Descartes invokes in these different areas of inquiry grants us a richer view of the connection between his early mathematics and philosophy than an approach that assumes a common method is what binds his work in these domains together.
18.
This paper criticizes the traditional philosophical account of the quantization of gauge theories and offers an alternative. On the received view, gauge theories resist quantization because they feature distinct mathematical representatives of the same physical state of affairs. This resistance is overcome by a sequence of ad hoc modifications, justified in part by reference to semiclassical electrodynamics. Among other things, these modifications introduce “ghosts”: particles with unphysical properties which do not appear in asymptotic states and which are said to be purely a notational convenience. I argue that this sequence of modifications is unjustified and inadequate, making it a poor basis for the interpretation of ghosts. I then argue that gauge theories can be quantized by the same method as any other theory. On this account, ghosts are not purely notation: they are coordinates on the classical configuration space of the theory—specifically, on its gauge structure. This interpretation does not fall prey to the standard philosophical arguments against the significance of ghosts, due to Weingard. Weingard’s argumentative strategy, properly applied, in fact tells in favor of ghosts’ physical significance.
19.
The paper seeks to make progress from stating primitive ontology theories of quantum physics—notably Bohmian mechanics, the GRW matter density theory and the GRW flash theory—to assessing these theories. Four criteria are set out: (a) internal coherence; (b) empirical adequacy; (c) relationship to other theories; and (d) explanatory value. The paper argues that the stock objections against these theories do not withstand scrutiny. Its focus then is on their explanatory value: they pursue different strategies to ground the textbook formalism of quantum mechanics, and they develop different explanations of quantum non-locality. In conclusion, it is argued that Bohmian mechanics offers a better prospect for making quantum non-locality intelligible than the GRW matter density theory and the GRW flash theory.
20.
The paper takes up Bell's (1987) “Everett (?) theory” and develops it further. The resulting theory is about the system of all particles in the universe, each located in ordinary, 3-dimensional space. This many-particle system as a whole performs random jumps through 3N-dimensional configuration space – hence “Tychistic Bohmian Mechanics” (TBM). The distribution of its spontaneous localisations in configuration space is given by the Born Rule probability measure for the universal wavefunction. Contra Bell, the theory is argued to satisfy the minimal desiderata for a Bohmian theory within the Primitive Ontology framework (for which we offer a metaphysically more perspicuous formulation than is customary). TBM's formalism is that of ordinary Bohmian Mechanics (BM), without the postulate of continuous particle trajectories and their deterministic dynamics. This “rump formalism” receives, however, a different interpretation. We defend TBM as an empirically adequate and coherent quantum theory. Objections voiced by Bell and Maudlin are rebutted. The “for all practical purposes”-classical, Everettian worlds (i.e. quasi-classical histories) exist sequentially in TBM (rather than simultaneously, as in the Everett interpretation). In a temporally coarse-grained sense, they quasi-persist. By contrast, the individual particles themselves cease to persist.
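The distributional postulate the abstract describes can be written compactly (notation ours, following the description above): at each time the configuration Q_t of all N particles, a point in configuration space, is distributed according to

$$ \mathbb{P}\left(Q_t \in dq\right) = \left|\Psi_t(q)\right|^2\, dq, \qquad q \in \mathbb{R}^{3N}, $$

with Ψ_t the universal wavefunction. TBM retains this Born-rule measure of ordinary Bohmian Mechanics while dropping the guidance equation that would string successive configurations into continuous, deterministic trajectories.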