Similar Articles (20 results)
1.
I present in detail the case for regarding black hole thermodynamics as having a statistical-mechanical explanation in exact parallel with the statistical-mechanical explanation believed to underlie the thermodynamics of other systems. (Here I presume that black holes are indeed thermodynamic systems in the fullest sense; I review the evidence for that conclusion in the prequel to this paper.) I focus on three lines of argument: (i) zero-loop and one-loop calculations in quantum general relativity understood as a quantum field theory, using the path-integral formalism; (ii) calculations in string theory of the leading-order terms, higher-derivative corrections, and quantum corrections, in the black hole entropy formula for extremal and near-extremal black holes; (iii) recovery of the qualitative and (in some cases) quantitative structure of black hole statistical mechanics via the AdS/CFT correspondence. In each case I briefly review the content of, and arguments for, the form of quantum gravity being used (effective field theory; string theory; AdS/CFT) at a (relatively) introductory level: the paper is aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but does not presume advanced knowledge of quantum gravity. My conclusion is that the evidence for black hole statistical mechanics is as solid as we could reasonably expect it to be in the absence of a directly-empirically-verified theory of quantum gravity.
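For reference, the black hole entropy formula targeted by the calculations in (i) and (ii) is the Bekenstein-Hawking area law; the formulation below is the standard textbook statement, with conventional notation not taken from the paper:

```latex
% Bekenstein-Hawking entropy: one quarter of the horizon area A in Planck units
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}
% with the Hawking temperature of a Schwarzschild black hole of mass M:
T_H = \frac{\hbar c^3}{8 \pi G k_B M}
```

The string-theoretic microstate counts and AdS/CFT arguments the abstract cites are standardly judged by whether they reproduce this area law at leading order.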

2.
I give a fairly systematic and thorough presentation of the case for regarding black holes as thermodynamic systems in the fullest sense, aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but not presuming advanced knowledge of quantum gravity. I pay particular attention to (i) the availability in classical black hole thermodynamics of a well-defined notion of adiabatic intervention; (ii) the power of the membrane paradigm to make black hole thermodynamics precise and to extend it to local-equilibrium contexts; (iii) the central role of Hawking radiation in permitting black holes to be in thermal contact with one another; (iv) the wide range of routes by which Hawking radiation can be derived and its back-reaction on the black hole calculated; (v) the interpretation of Hawking radiation close to the black hole as a gravitationally bound thermal atmosphere. In an appendix I discuss recent criticisms of black hole thermodynamics by Dougherty and Callender. This paper confines its attention to the thermodynamics of black holes; a sequel will consider their statistical mechanics.

3.
This paper presents a philosophical analysis of the 2017 Event Horizon Telescope (EHT) black hole experiment, whose published results include the first image of the supermassive black hole at the center of galaxy M87. I first present Hacking's philosophy of experimentation: Hacking gives a taxonomy of the elements of laboratory science, and I show that the EHT experiment conforms to major elements from his list. I then describe, with the help of Galison's Philosophy of the Shadow, how the EHT Collaboration created the famous black hole image. Galison outlines three stages for the reconstruction of the black hole image: Socio-Epistemology, Mechanical Objectivity, and then a further Socio-Epistemology stage. I subsequently present my own interpretation of the reconstruction of the black hole image and discuss model fitting to data. I suggest that the main method used by the EHT Collaboration to assure trust in the results of the EHT experiment is what philosophers call the Argument from Coincidence, and I show that using this method for that purpose is problematic. I present two versions of the Argument from Coincidence, Hacking's Coincidence and Cartwright's Reproducibility, by which I analyse the EHT experiment. The same estimation of the mass of the black hole is reproduced in four different procedures; the EHT Collaboration concludes that the value converged upon is robust. I analyse the mass measurements of the black hole with the help of Cartwright's notion of robustness. I show that the EHT Collaboration construes Coincidence/Reproducibility as Technological Agnosticism, and I contrast this interpretation with van Fraassen's scientific agnosticism.

4.
It is generally accepted, following Landauer and Bennett, that the process of measurement involves no minimum entropy cost, but the erasure of information in resetting the memory register of a computer to zero requires dissipating heat into the environment. This thesis has been challenged recently in a two-part article by Earman and Norton. I review some relevant observations in the thermodynamics of computation and argue that Earman and Norton are mistaken: there is in principle no entropy cost to the acquisition of information, but the destruction of information does involve an irreducible entropy cost.
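The irreducible entropy cost of erasure defended here is standardly quantified by Landauer's bound; the formulation below is the textbook statement, not a quotation from the article:

```latex
% Landauer bound: erasing one bit of information in an environment at
% temperature T dissipates at least k_B T ln 2 of heat,
\Delta Q \ge k_B T \ln 2,
% equivalently, the entropy of the environment increases by at least
\Delta S_{\mathrm{env}} \ge k_B \ln 2 \quad \text{per bit erased.}
```

Bennett's resolution of the Maxwell's demon puzzle, which the Landauer-Bennett thesis underwrites, charges exactly this cost to the demon's memory reset rather than to its measurement.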

5.
This paper analyzes the experiment presented in 2019 by the Event Horizon Telescope (EHT) Collaboration that unveiled the first image of the supermassive black hole at the center of galaxy M87. The intended aim of the paper is to assess whether the EHT Collaboration has made an “inference to the best explanation” (IBE) to conclude that the data effectively confirm the hypothesis that the object at the center of M87 is in fact a supermassive Kerr rotating black hole. I demonstrate that the EHT Collaboration has applied an IBE. It is shown that the hypothesis that at the center of M87 there is a supermassive Kerr rotating black hole was already the best explanation at the time in which the 2017 EHT experiment was conducted. My analysis is intertwined with considerations on realist and empiricist interpretations of IBE, which are used to assess whether the conclusion that the object at the center of M87 is a Kerr rotating black hole implies holding a realist commitment with respect to such object.

6.
Black hole complementarity has been proposed as a way to reconcile the result of Hawking, that black holes evaporate, with fundamental unitary quantum theories of gravity, such as string theory. Hawking's semi-classical analysis suggests that the evaporation of black holes is a non-unitary process, yet black hole complementarity gives a perspective on the semi-classical black hole which retains unitarity. We outline this proposal and address a number of methodological criticisms that have been made with regard to this proposal.

7.
Landauer's Principle asserts that there is an unavoidable cost in thermodynamic entropy creation when data is erased. It is usually derived from incorrect assumptions, most notably, that erasure must compress the phase space of a memory device or that thermodynamic entropy arises from the probabilistic uncertainty of random data. Recent work seeks to prove Landauer's Principle without using these assumptions. I show that the processes assumed in the proof, and in the thermodynamics of computation more generally, can be combined to produce devices that both violate the second law and erase data without entropy cost, indicating an inconsistency in the theoretical system. Worse, the standard repertoire of processes selectively neglects thermal fluctuations. Concrete proposals for how we might measure dissipationlessly and expand single molecule gases reversibly are shown to be fatally disrupted by fluctuations. Reversible, isothermal processes on molecular scales are shown to be disrupted by fluctuations that can only be overcome by introducing entropy creating, dissipative processes.

8.
The microscopic explanation of the physical phenomena represented by a macroscopic theory is often cast in terms of the reduction of the latter to a more fundamental theory, which represents the same phenomena at the microscopic level, albeit in an idealized way. In particular, the reduction of thermodynamics to statistical mechanics is a much discussed case-study in philosophy of physics. Based on the Generalized Nagel–Schaffner model, the alleged reductive explanation would be accomplished if one finds a corrected version of classical thermodynamics that can be strictly derived from statistical mechanics. That is the sense in which, according to Callender (1999, 2001), one should not take thermodynamics too seriously. Arguably, the sought-after revision is given by statistical thermodynamics, intended as a macroscopic theory equipped with a probabilistic law of equilibrium fluctuations. The present paper aims to evaluate this proposal. The upshot is that, while statistical thermodynamics enables one to re-define equilibrium so as to agree with Boltzmann entropy, it does not provide a definitive solution to the problem of explaining macroscopic irreversibility from a microscopic point of view.

9.
When considering controversial thermodynamic scenarios such as Maxwell's demon, it is often necessary to consider probabilistic mixtures of macrostates. This raises the question of how, if at all, to assign entropy to them. The information-theoretic entropy is often used in such cases; however, no general proof of the soundness of doing so has been given, and indeed some arguments against doing so have been presented. We offer a general proof of the applicability of the information-theoretic entropy to probabilistic mixtures of macrostates that is based upon a probabilistic generalisation of the Kelvin statement of the second law. We defend the latter and make clear the other assumptions on which our main result depends. We also briefly discuss the interpretation of our result.
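The information-theoretic entropy assignment at issue is standardly written as follows, a conventional formulation rather than a quotation from the paper, with p_i the probability of macrostate i and S_i its thermodynamic entropy:

```latex
% Entropy assigned to a probabilistic mixture of macrostates:
% the mean macrostate entropy plus a Shannon-type mixing term
S_{\mathrm{mix}} = \sum_i p_i S_i \;-\; k_B \sum_i p_i \ln p_i
```

The contested question is whether the second, information-theoretic term carries genuine thermodynamic significance; the Kelvin-statement argument sketched in the abstract is offered as a proof that it does.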

10.
Thermodynamics has a clear arrow of time, characterized by the irreversible approach to equilibrium. This stands in contrast to the laws of microscopic theories, which are invariant under time-reversal. Foundational discussions of this “problem of irreversibility” often focus on historical considerations, and therefore do not take results of modern physical research on this topic into account. In this article, I will close this gap by studying the implications of dynamical density functional theory (DDFT), a central method of modern nonequilibrium statistical mechanics not previously considered in philosophy of physics, for this debate. For this purpose, the philosophical discussion of irreversibility is structured into five problems, concerned with the source of irreversibility in thermodynamics, the definition of equilibrium and entropy, the justification of coarse-graining, the approach to equilibrium, and the arrow of time. For each of these problems, it is shown that DDFT provides novel insights that are of importance for both physicists and philosophers of physics.

11.
12.
I give a brief account of the way in which thermodynamics and statistical mechanics actually work as contemporary scientific theories, and in particular of what statistical mechanics contributes to thermodynamics over and above any supposed underpinning of the latter's general principles. In doing so, I attempt to illustrate that statistical mechanics should not be thought of wholly or even primarily as itself a foundational project for thermodynamics, and that conceiving of it this way potentially distorts the foundational study of statistical mechanics itself.

13.
Can we explain the laws of thermodynamics, in particular the irreversible increase of entropy, from the underlying quantum mechanical dynamics? Attempts based on classical dynamics have all failed. Albert (1994a,b; 2000) proposed a way to recover thermodynamics on a purely dynamical basis, using the quantum theory of the collapse of the wavefunction of Ghirardi, Rimini and Weber (1986). In this paper we propose an alternative way to explain thermodynamics within no-collapse interpretations of quantum mechanics. Our approach relies on the standard quantum mechanical models of environmental decoherence of open systems, e.g. Joos and Zeh (1985) and Zurek and Paz (1994).

14.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way they are usually conceived. In particular, Maxwell agrees with the currently accepted view that the second law of thermodynamics, as originally conceived, cannot be strictly true; but the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker: even this probabilistic version, on his view, is limited in scope to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.

15.
I began this study with Laudan's argument from the pessimistic induction and I promised to show that the caloric theory of heat cannot be used to support the premisses of the meta-induction on past scientific theories. I tried to show that the laws of experimental calorimetry, adiabatic change and Carnot's theory of the motive power of heat were (i) independent of the assumption that heat is a material substance, (ii) approximately true, and (iii) deducible and accounted for within thermodynamics.

I stressed that results (i) and (ii) were known to most theorists of the caloric theory and that result (iii) was put forward by the founders of the new thermodynamics. In other words, the truth-content of the caloric theory was located, selected carefully, and preserved by the founders of thermodynamics.

However, the reader might think that even if I have succeeded in showing that Laudan is wrong about the caloric theory, I have not shown how the strategy followed in this paper can be generalised against the pessimistic meta-induction. I think that the general strategy against Laudan's argument suggested in this paper is this: the empirical success of a mature scientific theory suggests that there are respects and degrees in which this theory is true. The difficulty for, and the real challenge to, philosophers of science is to suggest ways in which this truth-content can be located and shown to be preserved, if at all, in subsequent theories. In particular, the empirical success of a theory does not automatically suggest that all theoretical terms of the theory refer. On the contrary, judgments of referential success depend on which theoretical claims are well supported by the evidence. This is a matter of specific investigation. Generally, one would expect that claims about theoretical entities which are not strongly supported by the evidence, or which turn out to be independent of the evidence at hand, are not compelling.

Put simply, if the evidence does not make it likely that our beliefs about putative theoretical entities are approximately correct, a belief in those entities would be ill-founded and unjustified. Theoretical extrapolations in science are indispensable, but they are not arbitrary. If the evidence does not warrant them, I do not see why someone should commit herself to them. In a sense, the problem with empiricist philosophers is not that they demand that theoretical beliefs must be warranted by evidence; rather, it is that they claim that no evidence can warrant theoretical beliefs. A realist philosopher of science would not disagree with the first, but she has good grounds to deny the second.

I argued that claims about theoretical entities which are not strongly supported by the evidence must not be taken as belief-worthy. But can one sustain the more ambitious view that loosely supported parts of a theory tend to be just those that include non-referring terms? There is an obvious risk in such a generalisation, for there are well-known cases in which a theoretical claim was initially weakly supported by the evidence

16.
It is often held by philosophers of science that special, idealized situations are prior to complex cases in several senses: equations for complex cases are derived from those for special cases by “composing” special case equations; behavior in complex cases is explained in terms of behavior in special cases; one learns the true nature of a property in the special case where it is allowed to work in isolation. In this paper, I argue that a strand of non-equilibrium thermodynamics which attempts to go beyond the limitations of classical non-equilibrium thermodynamics adheres to something that is the reverse of this picture. Thus, the legitimacy (or lack thereof) of this picture lies very near to the heart of foundational issues in non-equilibrium thermodynamics.

17.
‘Holographic’ relations between theories have become an important theme in quantum gravity research. These relations entail that a theory without gravity is equivalent to a gravitational theory with an extra spatial dimension. The idea of holography was first proposed in 1993 by Gerard 't Hooft on the basis of his studies of evaporating black holes. Soon afterwards the holographic ‘AdS/CFT’ duality was introduced, which since has been intensively studied in the string theory community and beyond. Recently, Erik Verlinde has proposed that even Newton's law of gravitation can be related holographically to the ‘thermodynamics of information’ on screens. We discuss these scenarios, with special attention to the status of the holographic relation in them and to the question of whether they make gravity and spacetime emergent. We conclude that only Verlinde's scheme straightforwardly instantiates emergence. However, assuming a non-standard interpretation of AdS/CFT may create room for the emergence of spacetime and gravity there as well.

18.
In 1904 Joachim published an influential paper dealing with ‘Aristotle's Conception of Chemical Combination’ [1] which has provided the basis of much more recent studies [2]. About the same time, Duhem [3] developed what he regarded as an essentially Aristotelian view of chemistry, based on his understanding of phenomenological thermodynamics. He does not present a detailed textual analysis, but rather emphasises certain general ideas. Joachim's classic paper contains obscurities which I have been unable to fathom and theses which do not seem to be fully explained, or which at least seem difficult for the modern reader to understand. An attempt is made here to provide a systematic account of the Aristotelian theory of the generation of substances by the mixing of elements, by reconsidering Joachim's treatment in the light of the sort of points which most interested Duhem.

The work described in this paper was undertaken with a view to providing a basis for presenting, evaluating and criticising Duhem's understanding of what was for him modern (i.e. 19th-century) chemistry. This latter project will be taken up on another occasion. I hope the present paper will be of some value to a broader philosophical readership in so far as it provides a fairly clear conception of matter which might be called Aristotelian, even if it is not precisely Aristotle's, and raises certain clear problems of interpretation. It may also be of interest to historians of chemistry in suggesting an analysis of the old chemical notion of a mixt independent of atomic theories.

19.
The aim of this paper is to survey and discuss some key connections between information and confirmation within a broadly Bayesian framework. We mean to show that treating information and confirmation in a unified fashion is an intuitive and fruitful approach, fostering insights and prospects in the analysis of a variety of related notions such as belief change, partial entailment, entropy, the value of experiments, and more besides. To this end, we recapitulate established theoretical achievements, disclose a number of underlying links, and provide a few novel results.

20.
As is well known from Einstein (1905) the choice of a criterion for distant simultaneity is equivalent to stipulating one-way speeds for the transit of light. It is shown that any choice of non-standard synchrony is equivalent to a Lorentz local time boost. From this and considerations from the hole argument, it follows that there is a non-trivial sense in which distant simultaneity is conventional, at least to the extent that the “gauge freedom” arising in the hole argument is non-trivial.
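The choice of synchrony criterion at issue is standardly parametrised by Reichenbach's ε; the formulation below is the textbook one, not quoted from the abstract:

```latex
% Reichenbach epsilon-synchrony: a light signal leaves clock A at t_1,
% is reflected at B, and returns to A at t_3; the distant reflection
% event is assigned the time
t_2 = t_1 + \varepsilon\,(t_3 - t_1), \qquad 0 < \varepsilon < 1.
% Standard (Einstein) synchrony is the choice epsilon = 1/2, which makes
% the stipulated one-way speed of light equal in both directions.
```

A non-standard choice ε ≠ 1/2 is what the abstract shows to be equivalent to a Lorentz local time boost.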
