Similar Articles
 17 similar articles found (search time: 0 ms)
1.
The early history of the attempts to unify quantum theory with the general theory of relativity is depicted through the work of the Italian physicist Gleb Wataghin, who, in the context of quantum electrodynamics, anticipated some of the ideas that the quantum gravity community is entertaining today.

2.
The main topics of this second part of a two-part essay are some consequences of the phenomenon of vacuum polarization as the most important physical manifestation of modular localization. Besides philosophically unexpected consequences, it has led to a new constructive “outside-inwards approach” in which the pointlike fields and the compactly localized operator algebras they generate appear only from intersecting much simpler algebras localized in noncompact wedge regions, whose generators have extremely mild, almost free-field behavior. Another consequence of vacuum polarization presented in this essay is the localization entropy near a causal horizon, which follows a logarithmically modified area law in which a dimensionless area (the area divided by the square of dR, where dR is the thickness of a light-sheet) appears. There are arguments that this logarithmically modified area law corresponds to the volume law of standard heat-bath thermal behavior. We also explain the symmetry-enhancing effect of holographic projections onto the causal horizon of a region and show that the resulting infinite-dimensional symmetry groups contain the Bondi–Metzner–Sachs group. This essay is the second part of a partitioned longer paper.

3.
4.
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology, and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is to be any guide, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful, however, in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, and (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance, and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.
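
A brief illustrative gloss on the horizon problem invoked above (the notation is ours, not the paper's): in a Friedmann cosmology with scale factor a(t), the comoving particle horizon at time t is
\[ \chi_{\mathrm{hor}}(t) = \int_{0}^{t} \frac{c\,dt'}{a(t')}, \]
and patches of the last-scattering surface whose comoving separation exceeds 2\chi_{\mathrm{hor}}(t_{\mathrm{rec}}) (roughly a couple of degrees on today's sky in a standard Big Bang model without inflation) could never have been in causal contact, which is why the observed uniformity of the CMB is regarded as theoretically problematic.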

5.
6.
In this discussion paper, I seek to challenge Hylarie Kochiras’ recent claims on Newton’s attitude towards action at a distance, which will be presented in Section 1. In doing so, I shall include the positions of Andrew Janiak and John Henry in my discussion and present my own take on the matter (Section 2). Additionally, I seek to strengthen Kochiras’ argument that Newton sought to explain the cause of gravity in terms of secondary causation (Section 3). I also further specify what Kochiras calls ‘Newton’s substance counting problem’ (Section 4). In conclusion, I suggest a historical correction (Section 5).

7.
Philosophers of science have paid little attention, positive or negative, to Lyotard’s book The Postmodern Condition, even though it has been popular in other fields. We set out some of the reasons for this neglect. Lyotard thought that sciences could be justified by non-scientific narratives (a position he later abandoned). We show why this is unacceptable, and why many of Lyotard’s characterisations of science are either implausible or narrowly positivist. One of Lyotard’s themes is that the nature of knowledge has changed and thereby so has society itself. However, much of what Lyotard says muddles epistemological matters about the definition of ‘knowledge’ with sociological claims about how information circulates in modern society. We distinguish two kinds of legitimation of science: epistemic and socio-political. In proclaiming ‘incredulity towards metanarratives’, Lyotard has nothing to say about how epistemic and methodological principles are to be justified (legitimated). He also gives a bad argument as to why there can be no epistemic legitimation, which is based on an act/content confusion, and a confusion between making an agreement and the content of what is agreed to. As for socio-political legitimation, Lyotard’s discussion remains at the abstract level of science as a whole rather than at the level of the particular applications of sciences. Moreover, his positive points can be accepted without taking on board any of his postmodernist account of science. Finally, we argue that Lyotard’s account of paralogy, which is meant to provide a ‘postmodern’ style of justification, is a failure.

8.
I attempt a reconstruction of Kant’s version of the causal theory of time that makes it appear coherent. Two problems are at issue. The first concerns Kant’s reference to reciprocal causal influence for characterizing simultaneity. This approach is criticized by pointing out that Kant’s procedure involves simultaneous counterdirected processes—which seems to run into circularity. The problem can be defused by drawing on instantaneous processes such as the propagation of gravitation in Newtonian mechanics. Another charge of circularity against Kant’s causal theory was leveled by Schopenhauer. His objection was that Kant’s approach is invalidated by the failure to deliver non-temporal criteria for distinguishing between causes and effects. I try to show that the modern causal account has made important progress toward a successful resolution of this difficulty. The fork asymmetry, as based on Reichenbach’s principle of the common cause, provides a means for the distinction between cause and effect that is not based on temporal order (if some preconditions are realized).
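
For readers unfamiliar with the probabilistic formulation alluded to in the last sentence, here is a minimal sketch of Reichenbach's common cause conditions (our illustration; the paper's own preconditions may differ): a correlation \( P(A \wedge B) > P(A)\,P(B) \) between events A and B that are not directly causally linked is to be explained by a common cause C satisfying
\[ P(A \wedge B \mid C) = P(A \mid C)\,P(B \mid C), \qquad P(A \wedge B \mid \neg C) = P(A \mid \neg C)\,P(B \mid \neg C), \]
\[ P(A \mid C) > P(A \mid \neg C), \qquad P(B \mid C) > P(B \mid \neg C). \]
Because such conjunctive forks are typically open toward the future (a common cause screens off its joint effects, while a common effect does not do the same for its joint causes), they yield a cause/effect asymmetry that does not presuppose temporal order.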

9.
Building on Norton's “material theory of induction,” this paper shows through careful historical analysis that analogy can act as a methodological principle or stratagem, providing experimentalists with a useful framework to assess data and devise novel experiments. Although this particular case study focuses on late eighteenth- and early nineteenth-century experiments on the properties and composition of acids, the results of this investigation may be extended and applied to other research programs. Occupying a stage in between what Steinle calls “exploratory experimentation” and robust theory, analogy, I argue, encouraged research to substantiate why the likenesses should outweigh the differences (or vice versa) when evaluating results and designing experiments.

10.
John D. Norton is responsible for a number of influential views in contemporary philosophy of science. This paper will discuss two of them. The material theory of induction claims that inductive arguments are ultimately justified by their material features, not their formal features. Thus, while a deductive argument can be valid irrespective of the content of the propositions that make up the argument, an inductive argument about, say, apples, will be justified (or not) depending on facts about apples. The argument view of thought experiments claims that thought experiments are arguments, and that they function epistemically however arguments do. These two views have generated a great deal of discussion, although there hasn't been much written about their combination. I argue that despite some interesting harmonies, there is a serious tension between them. I consider several options for easing this tension, before suggesting a set of changes to the argument view that I take to be consistent with Norton's fundamental philosophical commitments, and which retain what seems intuitively correct about the argument view. These changes require that we move away from a unitary epistemology of thought experiments and towards a more pluralist position.

11.
The question of the existence of gravitational stress-energy in general relativity has exercised investigators in the field since the inception of the theory. Folklore has it that no adequate definition of a localized gravitational stress-energetic quantity can be given. Most arguments to that effect invoke one version or another of the Principle of Equivalence. I argue that not only are such arguments of necessity vague and hand-waving but, worse, they are beside the point and do not address the heart of the issue. Based on a novel analysis of what it may mean for one tensor to depend in the proper way on another, which, en passant, provides a precise characterization of the idea of a “geometric object”, I prove that, under certain natural conditions, there can be no tensor whose interpretation could be that it represents gravitational stress-energy in general relativity. It follows that gravitational energy, such as it is in general relativity, is necessarily non-local. Along the way, I prove a result of some interest in its own right about the structure of the associated jet bundles of the bundle of Lorentz metrics over spacetime. I conclude by showing that my results also imply that, under a few natural conditions, the Einstein field equation is the unique equation relating gravitational phenomena to spatiotemporal structure, and discuss how this relates to the non-localizability of gravitational stress-energy. The main theorem proven here, which underlies all the arguments, is considerably stronger than the standard result in the literature used for the same purposes (Lovelock's theorem of 1972): it holds in all dimensions (not only in four); it does not require an assumption about the differential order of the desired concomitant of the metric; and it has a more natural physical interpretation.
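
For comparison, the standard result mentioned in the last sentence (Lovelock's theorem of 1972) can be paraphrased as follows (our wording, not the paper's): in four spacetime dimensions, any symmetric, divergence-free (0,2)-tensor concomitant of the metric and its first two derivatives has the form
\[ E_{ab} = \lambda\, G_{ab} + \mu\, g_{ab}, \qquad \lambda, \mu \in \mathbb{R}, \]
where G_{ab} is the Einstein tensor; the uniqueness of the Einstein field equation in four dimensions follows. The theorem announced in the abstract dispenses with both the restriction to four dimensions and the assumption on the differential order of the concomitant.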

12.
This paper explores various metaphysical aspects of Leibniz's concepts of space, motion, and matter, with the intention of demonstrating how the distinctive role of force in Leibnizian physics can be used to develop a theory of relational motion using privileged reference frames. Although numerous problems will remain for a consistent Leibnizian relationist account, the version developed within our investigation will advance the work of previous commentators by more accurately reflecting the specific details of Leibniz's own natural philosophy, especially his handling of the dynamical interactions of plenum bodies.

13.
The history of modern economics abounds with pleas for more pluralism as well as pleas for more unification. These seem to be contradictory goals, suggesting that pluralism and unification are mutually exclusive, or at least that they involve trade-offs, with more of one necessarily being traded off against less of the other. This paper will use the example of Paul Samuelson's Foundations of Economic Analysis (1947) to argue that the relationship between pluralism and unification is often more complex than this simple dichotomy suggests. In particular, Samuelson's Foundations is invariably presented as a key text in the unification of modern economics during the middle of the twentieth century, and in many ways that is entirely correct. But Samuelson's unification was not at the theoretical (causal and explanatory) level, but rather at the purely mathematical derivational level. Although this fact is recognized in the literature on Samuelson, what seems to be less recognized is that for Samuelson, much of the motivation for this unification was pluralist in spirit: not to narrow scientific economics into one single theory, but rather to allow for more than one theory to co-exist under a single unified derivational technique. This hidden pluralism will be discussed in detail. The paper concludes with a discussion of the implications for more recent developments in economics.

14.
Šešelja and Straßer’s critique fails to hit its target for two main reasons. First, the argument is not that Kuhn is a rationalist because he is a coherentist. Although Kuhn can be taken as a rationalist because of his commitment to epistemic values, coherence analysis provides a more comprehensive characterisation of the cognitive process in scientific change than any of these values alone can offer. Further, we should understand Kuhn as characterising science as the best form of rationality we have outside logic, which rules out algorithmic rationality and allows non-cognitive factors to play a role in theory change. Second, Šešelja and Straßer overemphasise the importance of a priori reasoning in Kuhn, which was only an alternative to his earlier historical-empirical approach. My suggestion is that Kuhn’s neo-Kantian historical cognitivism integrates the earlier empirical and the later a-prioristic orientations. According to it, any understanding of the world is preconditioned by some kind of mental module that is liable to change, a change detected as a discontinuity in the historical record of science.

15.
In 2006, this journal addressed the problem of technological artefacts, and through a series of articles aimed at tackling the ‘dual nature of technical artefacts’, posited an understanding of these as constituted by both a structural (physical) and a functional (intentional) component. This attempt to conceptualise artefacts established a series of important questions, concerning such aspects of material technologies as mechanisms, functions, human intentionality, and normativity. However, I believe that in establishing the ‘dual nature’ thesis, the authors within this issue focused too strongly on technological function. By positing function as the analytic axis of the ‘dual nature’ framework, the theorists did not sufficiently problematise what is ultimately a social phenomenon. Here I posit a complementary analytic approach to this problem; namely, I argue that by using the Strong Programme’s performative theory of social institutions, we can better understand the nature of material technologies. Drawing particularly from Martin Kusch’s work, I here argue that by conceptualising artefacts as artificial kinds, we can better examine technological ontology, functions, and normativity. Ultimately, a Strong Programme approach, constructivist and collectivist in nature, offers a useful elaboration upon the important question raised by the ‘dual nature’ theorists.

16.
String theorists are certain that they are practicing physicists. Yet, some of their recent critics deny this. This paper argues that this conflict is really about who holds authority in making rational judgment in theoretical physics. At bottom, the conflict centers on the question: who is a proper physicist? To illustrate and understand the differing opinions about proper practice and identity, we discuss different appreciations of epistemic virtues and explanation among string theorists and their critics, and how these have been sourced in accounts of Einstein's biography. Just as Einstein is claimed by both sides, historiography offers examples of both successful and unsuccessful non-empirical science. History of science also teaches that times of conflict are often times of innovation, in which novel scholarly identities may come into being. At the same time, since the contributions of Thomas Kuhn, historians have developed a critical attitude towards formal attempts and methodological recipes for epistemic demarcation and justification of scientific practice. These are now, however, being considered in the debate on non-empirical physics.

17.