Similar literature
20 similar documents retrieved
1.
The view that the fundamental kind properties are intrinsic properties enjoys reflexive endorsement by most metaphysicians of science. But ontic structural realists deny that there are any fundamental intrinsic properties at all. Given that structuralists distrust intuition as a guide to truth, and given that we currently lack a fundamental physical theory that we could consult instead in order to settle the issue, it might seem as if there is simply nowhere for this debate to go at present. However, I will argue that there exists an as-yet untapped resource for arguing for ontic structuralism – namely, the way that fundamentality is conceptualized in our most fundamental physical frameworks. By arguing that physical objects must be subject to the ‘Goldilocks principle’ if they are to count as fundamental at all, I argue that we can no longer view the majority of properties defining them as intrinsic. As such, ontic structural realism can be regarded as the most promising metaphysics for fundamental physics, and this is so even though we do not yet claim to know precisely what that fundamental physics is.

2.
The Higgs naturalness principle served as the basis for the so far failed prediction that signatures of physics beyond the Standard Model (SM) would be discovered at the LHC. One influential formulation of the principle, which prohibits fine tuning of bare SM parameters, rests on the assumption that a particular set of values for these parameters constitutes the “fundamental parameters” of the theory, and serves to mathematically define the theory. On the other hand, an old argument by Wetterich suggests that fine tuning of bare parameters merely reflects an arbitrary, inconvenient choice of expansion parameters and that the choice of parameters in an EFT is therefore arbitrary. We argue that these two interpretations of Higgs fine tuning reflect distinct ways of formulating and interpreting effective field theories (EFTs) within the Wilsonian framework: the first takes an EFT to be defined by a single set of physical, fundamental bare parameters, while the second takes a Wilsonian EFT to be defined instead by a whole Wilsonian renormalization group (RG) trajectory, associated with a one-parameter class of physically equivalent parametrizations. From this latter perspective, no single parametrization constitutes the physically correct, fundamental parametrization of the theory, and the delicate cancellation between bare Higgs mass and quantum corrections appears as an eliminable artifact of the arbitrary, unphysical reference scale with respect to which the physical amplitudes of the theory are parametrized. While the notion of fundamental parameters is well motivated in the context of condensed matter field theory, we explain why it may be superfluous in the context of high energy physics.
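The “delicate cancellation” at stake can be displayed schematically (a textbook illustration, not taken from the paper itself; the coefficient c below is a placeholder): in a Wilsonian EFT with cutoff \Lambda, the physical Higgs mass arises as

\[
  m_{H,\mathrm{phys}}^{2} \;\simeq\; m_{0}^{2}(\Lambda) \;+\; \frac{c}{16\pi^{2}}\,\Lambda^{2},
\]

so that for \Lambda far above the electroweak scale the bare term and the quantum correction must cancel to many decimal places. The dispute summarized above concerns whether this cancellation is a physical fine tuning of fundamental parameters or an artifact of an arbitrary choice of parametrization along the RG trajectory.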

3.
In this paper I elicit a prediction from structural realism and compare it, not to a historical case, but to a contemporary scientific theory. If structural realism is correct, then we should expect physics to develop theories that fail to provide an ontology of the sort sought by traditional realists. If structure alone is responsible for instrumental success, we should expect surplus ontology to be eliminated. Quantum field theory (QFT) provides the framework for some of the best confirmed theories in science, but debates over its ontology are vexed. Rather than taking a stand on these matters, the structural realist can embrace QFT as an example of just the kind of theory structural realism should lead us to expect. Yet, it is not clear that QFT meets the structuralist's positive expectation by providing a structure for the world. In particular, the problem of unitarily inequivalent representations threatens to undermine the possibility of QFT providing a unique structure for the world. In response to this problem, I suggest that the structuralist should endorse pluralism about structure.

4.
One of the key philosophical questions regarding quantum field theory is whether it should be given a particle or field interpretation. The particle interpretation of QFT is commonly viewed as being undermined by the well-known no-go results, such as the Malament, Reeh-Schlieder and Hegerfeldt theorems. These theorems all focus on the localizability problem within the relativistic framework. In this paper I would like to go back to the basics and ask the simple-minded question of how the notion of quanta appears in the standard procedure of field quantization, starting with the elementary case of a finite number of harmonic oscillators, and proceeding to the more realistic scenario of continuous fields with infinitely many degrees of freedom. I will try to argue that the way the standard formalism introduces the talk of field quanta does not justify treating them as particle-like objects with well-defined properties.
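For orientation, the standard procedure alluded to here can be summarized in one line (a textbook sketch, not the paper's own analysis): each field mode \mathbf{k} is quantized as a harmonic oscillator with ladder operators \hat a_{\mathbf k}, \hat a^{\dagger}_{\mathbf k},

\[
  \hat H=\sum_{\mathbf k}\hbar\omega_{\mathbf k}\Big(\hat a^{\dagger}_{\mathbf k}\hat a_{\mathbf k}+\tfrac12\Big),
  \qquad
  [\hat a_{\mathbf k},\hat a^{\dagger}_{\mathbf k'}]=\delta_{\mathbf k\mathbf k'},
\]

and the integer eigenvalues of the number operators \hat a^{\dagger}_{\mathbf k}\hat a_{\mathbf k} are what get described as “quanta”. The paper's claim is that this bookkeeping does not by itself license treating those quanta as particle-like objects with well-defined properties.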

5.
The application of analytic continuation in quantum field theory (QFT) is juxtaposed to T-duality and mirror symmetry in string theory. Analytic continuation—a mathematical transformation that takes the time variable t to negative imaginary time, −it—was initially used as a mathematical technique for evaluating perturbative Feynman diagrams, and was subsequently the basis for the Euclidean approaches within mainstream QFT (e.g., Wilsonian renormalization group methods, lattice gauge theories) and the Euclidean field theory program for rigorously constructing non-perturbative models of interacting QFTs. A crucial difference between theories related by duality transformations and those related by analytic continuation is that the former are judged to be physically equivalent while the latter are regarded as physically inequivalent. There are other similarities between the two cases that make comparing and contrasting them a useful exercise for clarifying the type of argument that is needed to support the conclusion that dual theories are physically equivalent. In particular, T-duality and analytic continuation in QFT share the criterion for predictive equivalence that two theories agree on the complete set of expectation values and the mass spectra and the criterion for formal equivalence that there is a “translation manual” between the physically significant algebras of observables and sets of states in the two theories. The analytic continuation case study illustrates how predictive and formal equivalence are compatible with physical inequivalence, but not in the manner of standard underdetermination cases. Arguments for the physical equivalence of dual theories must cite considerations beyond predictive and formal equivalence. The analytic continuation case study is an instance of the strategy of developing a physical theory by extending the formal or mathematical equivalence with another physical theory as far as possible. That this strategy has resulted in developments in pure mathematics as well as theoretical physics is another feature that this case study has in common with dualities in string theory.
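As a reminder of the transformation in question (the standard Wick rotation, stated here for orientation rather than taken from the abstract): continuing t \mapsto −i\tau turns the oscillatory Minkowski path-integral weight into a damped Euclidean one,

\[
  e^{\,iS_{M}[\phi]}\;\longrightarrow\;e^{-S_{E}[\phi]},
  \qquad
  S_{E}[\phi]=\int d\tau\,d^{3}x\,\Big[\tfrac12(\partial_{\tau}\phi)^{2}+\tfrac12(\nabla\phi)^{2}+V(\phi)\Big],
\]

which is why the Euclidean formulation is so convenient for renormalization group and lattice methods. The philosophical point above is that the Euclidean and Minkowski theories so related can be predictively and formally linked while remaining physically inequivalent.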

6.
With the Higgs boson discovery and no new physics found at the LHC, confidence in Naturalness as a guiding principle for particle physics is under increased pressure. We wait to see if it proves its mettle in the LHC upgrades ahead, and beyond. In the meantime, I present an a posteriori justification of the Naturalness criterion by suggesting that uncompromising application of the principle to Quantum Electrodynamics leads toward the Standard Model and Higgs boson without additional experimental input. Potential lessons for today and future theory building are commented upon.

7.
S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

8.
In this paper I argue against John Henry's claim that Newton embraced unmediated action at a distance as an explanation of gravity (Henry, 1994, 1999, 2011, 2014). In particular, I take issue with his apparent suggestion that the fact, as he sees it, that two of Newton's prominent followers, namely, Richard Bentley and Samuel Clarke, embraced unmediated action at a distance as an explanation of gravity provides significant supporting evidence that Newton did as well (see Henry, 1994 and 1999). Instead, I argue that while Bentley did ultimately defend the notion of unmediated action at a distance as an explanation of gravity, Newton himself accepted that notion neither in his correspondence with Bentley, as Henry has maintained, nor in any of his later works. I also provide evidence that suggests that Newton did, in fact, accept both the principle of local causation and the passivity of matter. Finally, I argue that whatever the case may be with respect to Newton on the matter, it is clear from his correspondence with Leibniz, as well as from his Boyle lectures, that contrary to what Henry has maintained, Clarke was a stalwart opponent of unmediated action at a distance due to his strong commitment to both the principle of local causation and the passivity of matter.

9.
10.
This paper criticizes the traditional philosophical account of the quantization of gauge theories and offers an alternative. On the received view, gauge theories resist quantization because they feature distinct mathematical representatives of the same physical state of affairs. This resistance is overcome by a sequence of ad hoc modifications, justified in part by reference to semiclassical electrodynamics. Among other things, these modifications introduce “ghosts”: particles with unphysical properties which do not appear in asymptotic states and which are said to be purely a notational convenience. I argue that this sequence of modifications is unjustified and inadequate, making it a poor basis for the interpretation of ghosts. I then argue that gauge theories can be quantized by the same method as any other theory. On this account, ghosts are not purely notation: they are coordinates on the classical configuration space of the theory—specifically, on its gauge structure. This interpretation does not fall prey to the standard philosophical arguments against the significance of ghosts, due to Weingard. Weingard’s argumentative strategy, properly applied, in fact tells in favor of ghosts’ physical significance.

11.
This article traces the origins of Kenneth Wilson's conception of effective field theories (EFTs) in the 1960s. I argue that what really made the difference in Wilson's path to his first prototype of EFT were his long-standing pragmatic aspirations and methodological commitments. Wilson's primary interest was to work on mathematically interesting physical problems and he thought that progress could be made by treating them as if they could be analyzed in principle by a sufficiently powerful computer. The first point explains why he had no qualms about twisting the structure of field theories; the second why he divided the state-space of a toy model field theory into continuous slices by following a standard divide-and-conquer algorithmic strategy instead of working directly with a fully discretized and finite theory. I also show how Wilson's prototype bears the mark of these aspirations and commitments and clear up a few striking ironies along the way.

12.
Alison Gopnik and Andrew Meltzoff have argued for a view they call the ‘theory theory’: theory change in science and children are similar. While their version of the theory theory has been criticized for depending on a number of disputed claims, we argue that there is a fundamental problem which is much more basic: the theory theory is multiply ambiguous. We show that it might be claiming that a similarity holds between theory change in children and (i) individual scientists, (ii) a rational reconstruction of a Superscientist, or (iii) the scientific community. We argue that (i) is false, (ii) is non-empirical (which is problematic since the theory theory is supposed to be a bold empirical hypothesis), and (iii) is either false or doesn't make enough sense to have a truth-value. We conclude that the theory theory is an interesting failure. Its failure points the way to a full, empirical picture of scientific development, one that marries a concern with the social dynamics of science to a psychological theory of scientific cognition.

13.
The question of the existence of gravitational stress-energy in general relativity has exercised investigators in the field since the inception of the theory. Folklore has it that no adequate definition of a localized gravitational stress-energetic quantity can be given. Most arguments to that effect invoke one version or another of the Principle of Equivalence. I argue that such arguments are not only of necessity vague and hand-waving but, worse, beside the point, and do not address the heart of the issue. Based on a novel analysis of what it may mean for one tensor to depend in the proper way on another, which, en passant, provides a precise characterization of the idea of a “geometric object”, I prove that, under certain natural conditions, there can be no tensor whose interpretation could be that it represents gravitational stress-energy in general relativity. It follows that gravitational energy, such as it is in general relativity, is necessarily non-local. Along the way, I prove a result of some interest in its own right about the structure of the associated jet bundles of the bundle of Lorentz metrics over spacetime. I conclude by showing that my results also imply that, under a few natural conditions, the Einstein field equation is the unique equation relating gravitational phenomena to spatiotemporal structure, and discuss how this relates to the non-localizability of gravitational stress-energy. The main theorem proven here, which underlies all the arguments, is considerably stronger than the standard result in the literature used for the same purposes (Lovelock's theorem of 1972): it holds in all dimensions (not only in four); it does not require an assumption about the differential order of the desired concomitant of the metric; and it has a more natural physical interpretation.
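For reference, the standard result mentioned at the end (Lovelock's theorem, quoted here only for orientation) singles out the Einstein field equation, in geometrized units,

\[
  G_{ab}+\Lambda g_{ab}=8\pi\,T_{ab},
  \qquad
  G_{ab}=R_{ab}-\tfrac12 R\,g_{ab},
\]

as built from the only divergence-free, symmetric, second-order concomitant of the metric available in four dimensions. The abstract's theorem is said to strengthen this by holding in all dimensions and dropping the assumption on differential order.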

14.
This paper is concerned with Friedman's recent revival of the notion of the relativized a priori. It is particularly concerned with addressing the question as to how Friedman's understanding of the constitutive function of the a priori has changed since his defence of the idea in his Dynamics of Reason. Friedman's understanding of the a priori remains influenced by Reichenbach's initial defence of the idea; I argue that this notion of the a priori does not naturally lend itself to describing the historical development of space-time physics. Friedman's analysis of the role of the rotating frame thought experiment in the development of general relativity – which he suggests made the mathematical possibility of four-dimensional space-time a genuine physical possibility – has a central role in his argument. I analyse this thought experiment and argue that it is better understood by following Cassirer and placing emphasis on regulative principles. Furthermore, I argue that Cassirer's Kantian framework enables us to capture Friedman's key insights into the nature of the constitutive a priori.

15.
In this paper I critically review attempts to formulate and derive the geodesic principle, which claims that free massive bodies follow geodesic paths in general relativity theory. I argue that if the principle is (canonically) interpreted as a law of motion describing the actual evolution of gravitating bodies, then it is impossible to generically apply the law to massive bodies in a way that is coherent with Einstein's field equations. Rejecting the canonical interpretation, I propose an alternative interpretation of the geodesic principle as a type of universality thesis analogous to the universal behavior exhibited by thermal systems during phase transitions.
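In its canonical form the principle says (standard formulation, included only for orientation) that a free test body's worldline x^{a}(\tau), parametrized by proper time, satisfies the geodesic equation

\[
  \frac{d^{2}x^{a}}{d\tau^{2}}+\Gamma^{a}{}_{bc}\,\frac{dx^{b}}{d\tau}\frac{dx^{c}}{d\tau}=0 .
\]

The worry summarized above is that reading this as a literal law of motion for extended, gravitating bodies sits uneasily with Einstein's field equations, which motivates the universality-thesis reading instead.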

16.
In this paper I consider the structures that chemists and physicists attribute at the molecular scale to substances and materials of various kinds, and how they relate to structures and processes at other scales. I argue that the structure of a substance is the set of properties and relations which are preserved across all the conditions in which it can be said to exist. In short, structure is abstraction. On the basis of this view, and using concrete examples, I argue that structures, and therefore the chemical substances and other materials to which they are essential, are emergent. Firstly, structures themselves are scale-dependent because they can only exist within certain physical conditions, and a single substance may have different structures at different scales (of length, time and energy). Secondly, the distinctness of both substances and structures is a scale-dependent relationship: above a certain point, two distinct possibilities may become one. Thirdly, the necessary conditions for composition, for both substances and molecular species, are scale-dependent. To know whether a group of nuclei and electrons form a molecule it is not enough to consider energy alone: one also has to know about their environment and the lifetime over which the group robustly hangs together.

17.
In 2006, this journal addressed the problem of technological artefacts, and through a series of articles aimed at tackling the ‘dual nature of technical artefacts’, posited an understanding of these as constituted by both a structural (physical) and a functional (intentional) component. This attempt to conceptualise artefacts established a series of important questions, concerning such aspects of material technologies as mechanisms, functions, human intentionality, and normativity. However, I believe that in establishing the ‘dual nature’ thesis, the authors within this issue focused too strongly on technological function. By positing function as the analytic axis of the ‘dual nature’ framework, the theorists did not sufficiently problematise what is ultimately a social phenomenon. Here I posit a complementary analytic approach to this problem; namely, I argue that by using the Strong Programme’s performative theory of social institutions, we can better understand the nature of material technologies. Drawing particularly from Martin Kusch’s work, I here argue that by conceptualising artefacts as artificial kinds, we can better examine technological ontology, functions, and normativity. Ultimately, a Strong Programme approach, constructivist and collectivist in nature, offers a useful elaboration upon the important question raised by the ‘dual nature’ theorists.

18.
In this paper, I try to decipher the role of internal symmetries in the ontological maze of particle physics. The relationship between internal symmetries and laws of nature is discussed within the framework of “Platonic realism.” The notion of physical “structure” is introduced as representing a deeper ontological layer behind the observable world. I argue that an internal symmetry is a structure encompassing laws of nature. The application of internal symmetry groups to particle physics came about in two revolutionary steps. The first was the introduction of the internal symmetries of hadrons in the early 1960s. These global and approximate symmetries served as means of bypassing the dynamics. I argue that the realist could interpret these symmetries as ontologically prior to the hadrons. The second step was the gauge revolution in the 1970s, where symmetries became local and exact and were integrated with the dynamics. I argue that the symmetries of the second generation are fundamental in the following two respects: (1) According to the so-called “gauge argument,” gauge symmetry dictates the existence of gauge bosons, which determine the nature of the forces. This view, which has been recently criticized by some philosophers, is widely accepted in particle physics at least as a heuristic principle. (2) In view of grand unified theories, the new symmetries can be interpreted as ontologically prior to baryon matter.
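The “gauge argument” invoked in (1) is usually presented along the following lines (a textbook sketch, not the author's own derivation): demanding that a matter Lagrangian be invariant under local phase transformations \psi \to e^{i\alpha(x)}\psi forces the replacement of the ordinary derivative by a covariant one,

\[
  \partial_{\mu}\psi \;\longrightarrow\; D_{\mu}\psi=(\partial_{\mu}-ieA_{\mu})\psi,
  \qquad
  A_{\mu}\;\to\;A_{\mu}+\tfrac{1}{e}\,\partial_{\mu}\alpha(x),
\]

so that a gauge field A_{\mu}, and with it the corresponding gauge boson, appears to be dictated by the symmetry. This is the heuristic whose status, as the abstract notes, some philosophers have contested.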

19.
Recent insights into the conceptual structure of localization in QFT (modular localization) led to clarifications of old unsolved problems. The oldest one is the Einstein–Jordan conundrum which led Jordan in 1925 to the discovery of quantum field theory. This comparison of fluctuations in subsystems of heat bath systems (Einstein) with those resulting from the restriction of the QFT vacuum state to an open subvolume (Jordan) leads to a perfect analogy; the globally pure vacuum state becomes upon local restriction a strongly impure KMS state. This phenomenon of localization-caused thermal behavior as well as the vacuum-polarization clouds at the causal boundary of the localization region places localization in QFT in sharp contrast with quantum mechanics and justifies the attribute “holistic”. In fact it positions the E–J Gedankenexperiment in the same conceptual category as the cosmological constant problem and the Unruh Gedankenexperiment. The holistic structure of QFT resulting from “modular localization” also leads to a revision of the conceptual origin of the crucial crossing property which entered particle theory at the time of the bootstrap S-matrix approach but suffered from incorrect use in the S-matrix settings of the dual model and string theory. The new holistic point of view, which strengthens the autonomous aspect of QFT, also comes with new messages for gauge theory by exposing the clash between Hilbert space structure and localization and presenting alternative solutions based on the use of string-local fields in Hilbert space. Among other things this leads to a reformulation of the Englert–Higgs symmetry breaking mechanism.
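For readers unfamiliar with the term: a state \omega on an algebra of observables is a KMS (thermal equilibrium) state at inverse temperature \beta with respect to the time evolution \alpha_{t} when, roughly, its correlation functions satisfy (the standard definition, stated for orientation rather than taken from the paper)

\[
  \omega\big(B\,A\big)=\omega\big(A\,\alpha_{i\beta}(B)\big)
\]

for a suitable dense set of observables A, B. The claim above is that restricting the global vacuum to an open subvolume yields precisely such a thermal state, which is what places localization in QFT in sharp contrast with ordinary quantum mechanics.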

20.
In a paper published in 1939, Ernest Nagel described the role that projective duality had played in the reformulation of mathematical understanding through the turn of the nineteenth century, claiming that the discovery of the principle of duality had freed mathematicians from the belief that their task was to describe intuitive elements. While instances of duality in mathematics have increased enormously through the twentieth century, philosophers since Nagel have paid little attention to the phenomenon. In this paper I will argue that a reassessment is overdue. Something beyond doubt is that category theory has an enormous amount to say on the subject, for example, in terms of arrow reversal, dualising objects and adjunctions. These developments have coincided with changes in our understanding of identity and structure within mathematics. While it transpires that physicists have employed the term ‘duality’ in ways which do not always coincide with those of mathematicians, analysis of the latter should still prove very useful to philosophers of physics. Consequently, category theory presents itself as an extremely important language for the philosophy of physics.
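A minimal illustration of the “arrow reversal” mentioned above (standard category theory, not drawn from the paper): every category \mathcal{C} has an opposite category \mathcal{C}^{\mathrm{op}} with the same objects and reversed arrows,

\[
  \mathrm{Hom}_{\mathcal{C}^{\mathrm{op}}}(A,B)=\mathrm{Hom}_{\mathcal{C}}(B,A),
\]

and a statement is dualized by reading it in \mathcal{C}^{\mathrm{op}}: a product in \mathcal{C} is a coproduct in \mathcal{C}^{\mathrm{op}}, an initial object becomes a terminal one, and so on. Projective duality's exchange of points and lines is an early instance of this pattern.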
