Similar Documents
20 similar documents found.
1.
2.
I give a fairly systematic and thorough presentation of the case for regarding black holes as thermodynamic systems in the fullest sense, aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but not presuming advanced knowledge of quantum gravity. I pay particular attention to (i) the availability in classical black hole thermodynamics of a well-defined notion of adiabatic intervention; (ii) the power of the membrane paradigm to make black hole thermodynamics precise and to extend it to local-equilibrium contexts; (iii) the central role of Hawking radiation in permitting black holes to be in thermal contact with one another; (iv) the wide range of routes by which Hawking radiation can be derived and its back-reaction on the black hole calculated; (v) the interpretation of Hawking radiation close to the black hole as a gravitationally bound thermal atmosphere. In an appendix I discuss recent criticisms of black hole thermodynamics by Dougherty and Callender. This paper confines its attention to the thermodynamics of black holes; a sequel will consider their statistical mechanics.
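For orientation, the standard textbook formulas at the heart of this case (stated here for reference, not quoted from the abstract) are the Hawking temperature and Bekenstein-Hawking entropy of a Schwarzschild black hole of mass M with horizon area A:

```latex
T_H = \frac{\hbar c^3}{8\pi G k_B M},
\qquad
S_{BH} = \frac{k_B c^3 A}{4 G \hbar},
\qquad
A = \frac{16\pi G^2 M^2}{c^4}.
```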

3.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker; the validity of even this probabilistic version is confined to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.

4.
5.
6.
I present in detail the case for regarding black hole thermodynamics as having a statistical-mechanical explanation in exact parallel with the statistical-mechanical explanation believed to underlie the thermodynamics of other systems. (Here I presume that black holes are indeed thermodynamic systems in the fullest sense; I review the evidence for that conclusion in the prequel to this paper.) I focus on three lines of argument: (i) zero-loop and one-loop calculations in quantum general relativity understood as a quantum field theory, using the path-integral formalism; (ii) calculations in string theory of the leading-order terms, higher-derivative corrections, and quantum corrections, in the black hole entropy formula for extremal and near-extremal black holes; (iii) recovery of the qualitative and (in some cases) quantitative structure of black hole statistical mechanics via the AdS/CFT correspondence. In each case I briefly review the content of, and arguments for, the form of quantum gravity being used (effective field theory; string theory; AdS/CFT) at a (relatively) introductory level: the paper is aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but does not presume advanced knowledge of quantum gravity. My conclusion is that the evidence for black hole statistical mechanics is as solid as we could reasonably expect it to be in the absence of a directly-empirically-verified theory of quantum gravity.

7.
The aim of this study is twofold: to explore, first, the influence of intellectual and social conditions on the transfer of thermodynamics to chemistry and thereby the making of chemical thermodynamics, and second, the way that this knowledge was transferred from Europe to America. Consequently, it is of interest to examine the methodological approaches used by physicists and chemists to transfer thermodynamics to chemistry, to evaluate the potential of this science to offer solutions to existing chemical problems, and to discuss the attitude of the scientific community towards these new ideas. The development of chemical thermodynamics in America followed a different route compared to the European experience. Although it was transferred from Europe, it had distinctive characteristics imposed by a different intellectual and social milieu and tradition. This study focuses on the content of the knowledge transferred to America and the direction that this knowledge took at the hands of American scientists. As a paradigm, the chemical thermodynamics of Gilbert Newton Lewis will be considered.

8.
The three basic modelling approaches used to explain forest fire behaviour are theoretically, laboratory-, or empirically based. Results of all three approaches are reviewed, but it is noted that only the laboratory- and empirically based models have led to forecasting techniques that are in widespread use. These are the Rothermel model and the McArthur meters, respectively. Field tests designed to test the performance of these operational models were carried out in tropical grasslands. A preliminary analysis indicated that the Rothermel model overpredicted spread rates while the McArthur model underpredicted. To improve the forecast of bushfire rate of spread available to operational firefighting crews it is suggested that a time-variable parameter (TVP) recursive least squares algorithm can be used to assign weights to the respective models, with the weights recursively updated as information on fire-front location becomes available. Results of this methodology when applied to US grasslands fire experiment data indicate that the quality of the input combined with a priori knowledge of the performance of the candidate models plays an important role in the performance of the TVP algorithm. With high-quality input data, the Rothermel model on its own outperformed the TVP algorithm, but with slightly inferior data both approaches were comparable. Though the use of all available data in a multiple linear regression produces a lower sum of squared errors than the recursive, time-variable weighting approach, or that of any single model, the uncertainties of data input and consequent changes in weighting coefficients during operational conditions suggest the use of the TVP algorithm approach.
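The recursive weighting scheme described above can be sketched in a few lines. This is an illustrative time-variable parameter (TVP) recursive least squares combiner for two candidate forecasts; the function name, forgetting factor, and toy data are assumptions for illustration, not the operational implementation or the paper's field data.

```python
# Illustrative TVP recursive least squares: combine two candidate
# rate-of-spread forecasts (e.g. Rothermel and McArthur predictions),
# updating the weights as each fire-front observation arrives.
# lam < 1 is a forgetting factor that discounts old data, letting the
# weights vary over time.

def tvp_rls(forecast_pairs, observed, lam=0.95):
    """Return (final_weights, combined_forecasts) for two models."""
    w = [0.5, 0.5]                      # initial equal weights
    P = [[10.0, 0.0], [0.0, 10.0]]      # large initial covariance
    combined = []
    for (x1, x2), y in zip(forecast_pairs, observed):
        x = [x1, x2]
        combined.append(w[0] * x1 + w[1] * x2)   # forecast before update
        # gain k = P x / (lam + x' P x)
        Px = [P[0][0] * x[0] + P[0][1] * x[1],
              P[1][0] * x[0] + P[1][1] * x[1]]
        denom = lam + x[0] * Px[0] + x[1] * Px[1]
        k = [Px[0] / denom, Px[1] / denom]
        err = y - (w[0] * x[0] + w[1] * x[1])
        w = [w[0] + k[0] * err, w[1] + k[1] * err]
        # P <- (P - k (Px)') / lam
        P = [[(P[0][0] - k[0] * Px[0]) / lam, (P[0][1] - k[0] * Px[1]) / lam],
             [(P[1][0] - k[1] * Px[0]) / lam, (P[1][1] - k[1] * Px[1]) / lam]]
    return w, combined
```

With consistent data the weights settle so that the combined forecast tracks the observations; the forgetting factor lets the weighting drift when one model's relative performance changes during a fire.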

9.
In this paper, I examine William Whewell’s (1794–1866) ‘Discoverer’s Induction’, and argue that it supplies a strikingly accurate characterization of the logic behind many statistical methods, exploratory data analysis (EDA) in particular. Such methods are additionally well-suited as a point of evaluation of Whewell’s philosophy since the central techniques of EDA were not invented until after Whewell’s death, and so couldn’t have influenced his views. The fact that the quantitative details of some very general methods designed to suggest hypotheses would so closely resemble Whewell’s views of how theories are formed is, I suggest, a strongly positive comment on his views.

10.
C S Tsao 《Experientia》1984,40(2):168-170
The combination of calcium and ascorbic acid in water at 25 °C has been examined by measuring the change of free calcium ion concentration as ascorbate was added in small increments to a solution of calcium. The data show clearly that complex formation between calcium ion and ascorbate ion occurred. At ionic strength μ = 0.1–0.2, the equilibrium constant of Ca²⁺ and the singly-charged ascorbate ion has been measured to be 2.1 M⁻¹. The precision of the result is better than 5% and the accuracy is estimated to be better than 20%. The application of the equilibrium constants is discussed.
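Given the reported constant, the bound fraction of calcium at a given free ascorbate concentration follows directly from the 1:1 mass-action expression K = [complex]/([Ca][Asc]); a minimal sketch, with an illustrative function name and toy concentrations:

```python
# Fraction of total calcium present as the 1:1 calcium-ascorbate
# complex, from K = [CaAsc]/([Ca][Asc]):
#   fraction = K*[Asc] / (1 + K*[Asc])
# K defaults to the 2.1 M^-1 value reported in the abstract.

def fraction_complexed(asc_free, K=2.1):
    """asc_free: free ascorbate concentration in mol/L."""
    return K * asc_free / (1.0 + K * asc_free)
```

At 0.1 M free ascorbate this gives roughly 17% of the calcium bound, consistent with a weak complex of this magnitude.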

11.

12.
13.
This paper argues in favour of a closer link between the decision and the forecast evaluation problems. Although the idea of using decision theory for forecast evaluation appears early in the dynamic stochastic programming literature, and has continued to be used with meteorological forecasts, it is hardly mentioned in standard academic texts on economic forecasting. Some of the main issues involved are illustrated in the context of a two‐state, two‐action decision problem as well as in a more general setting. Relationships between statistical and economic methods of forecast evaluation are discussed and links between the Kuipers score used as a measure of forecast accuracy in the meteorology literature and the market timing tests used in finance are established. An empirical application to the problem of stock market predictability is also provided, and the conditions under which such predictability could be explained in the presence of transaction costs are discussed. Copyright © 2000 John Wiley & Sons, Ltd.
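The Kuipers score mentioned above is the hit rate minus the false-alarm rate computed from the 2x2 contingency table of binary forecasts and outcomes; a minimal sketch, with illustrative variable names:

```python
# Kuipers score (also known as the true skill statistic):
#   score = hits/(hits + misses) - false_alarms/(false_alarms + correct_rej)
# A perfect forecaster scores 1; a forecaster whose signals carry no
# information about the outcome (e.g. always predicting 'up') scores 0.

def kuipers_score(forecasts, outcomes):
    """forecasts, outcomes: equal-length sequences of 0/1 indicators."""
    hits = sum(1 for f, y in zip(forecasts, outcomes) if f == 1 and y == 1)
    misses = sum(1 for f, y in zip(forecasts, outcomes) if f == 0 and y == 1)
    false_alarms = sum(1 for f, y in zip(forecasts, outcomes) if f == 1 and y == 0)
    correct_rej = sum(1 for f, y in zip(forecasts, outcomes) if f == 0 and y == 0)
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_rej)
    return hit_rate - false_alarm_rate
```

This separation of hit rate from false-alarm rate is what links the score to market timing tests: a trader who is always in the market has a hit rate of one but also a false-alarm rate of one, and so no skill.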

14.
Recent evidence indicates that cell death can be induced through multiple mechanisms. Strikingly, the same death signal can often induce apoptotic as well as non-apoptotic cell death. For instance, inhibition of caspases often converts an apoptotic stimulus to one that causes necrosis. Because a dedicated molecular circuitry distinct from that controlling apoptosis is required for necrotic cell injury, terms such as “programmed necrosis” or “necroptosis” have been used to distinguish stimulus-dependent necrosis from those induced by non-specific traumas (e.g., heat shock) or secondary necrosis induced as a consequence of apoptosis. In several experimental models, programmed necrosis/necroptosis has been shown to be a crucial control point for pathogen- or injury-induced inflammation. In this review, we will discuss the molecular mechanisms that regulate programmed necrosis/necroptosis and its biological significance in pathogen infections, drug-induced cell injury, and trauma-induced tissue damage.

15.
The distribution function associated with a classical gas at equilibrium is considered. We prove that apart from a factorisable multiplier, the distribution function is fully determined by the correlations among local momentum fluctuations. Using this result we discuss the conditions which enable idealised local observers, who are immersed in the gas and form a part of it, to determine the distribution ‘from within’. This analysis sheds light on two views on thermodynamic equilibrium, the ‘ergodic’ and the ‘thermodynamic limit’ schools, and the relations between them. It also provides an outline for a new definition of equilibrium that is weaker than full ergodicity. Finally, we briefly discuss the possibility that the distribution can be determined by external observers.
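The equilibrium distribution at issue is the Maxwell-Boltzmann form; for a single particle of mass m at temperature T the normalized momentum distribution reads (a standard result, stated here for orientation rather than quoted from the paper):

```latex
f(\mathbf{p}) = (2\pi m k_B T)^{-3/2}
\exp\!\left(-\frac{\mathbf{p}^2}{2 m k_B T}\right).
```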

16.
This paper is concerned with modelling time series by single hidden layer feedforward neural network models. A coherent modelling strategy based on statistical inference is presented. Variable selection is carried out using simple existing techniques. The problem of selecting the number of hidden units is solved by sequentially applying Lagrange multiplier type tests, with the aim of avoiding the estimation of unidentified models. Misspecification tests are derived for evaluating an estimated neural network model. All the tests are entirely based on auxiliary regressions and are easily implemented. A small‐sample simulation experiment is carried out to show how the proposed modelling strategy works and how the misspecification tests behave in small samples. Two applications to real time series, one univariate and the other multivariate, are considered as well. Sets of one‐step‐ahead forecasts are constructed and forecast accuracy is compared with that of other nonlinear models applied to the same series. Copyright © 2006 John Wiley & Sons, Ltd.
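A single hidden layer feedforward model of the kind discussed can be sketched as follows. This toy version (one lag, logistic hidden units, plain gradient descent, and an assumed logistic-map series in the usage) is purely illustrative; the paper's own strategy rests on Lagrange multiplier tests and auxiliary regressions, which are not reproduced here.

```python
# Toy single-hidden-layer feedforward AR model:
#   y_t ~ c + sum_j v_j * sigmoid(W_j * y_{t-1} + b_j)
# trained by per-observation gradient descent on squared error.
import math
import random

def forecast(y_lag, params):
    """One-step-ahead forecast from a single lag."""
    W, b, v, c = params
    hidden = [1.0 / (1.0 + math.exp(-(w * y_lag + bi)))
              for w, bi in zip(W, b)]
    return c + sum(vi * h for vi, h in zip(v, hidden))

def train(series, q=3, lr=0.05, epochs=300, seed=0):
    """Fit q hidden units; return (params, per-epoch training MSE)."""
    rng = random.Random(seed)
    W = [rng.uniform(-1, 1) for _ in range(q)]
    b = [0.0] * q
    v = [rng.uniform(-1, 1) for _ in range(q)]
    c = 0.0
    losses = []
    for _ in range(epochs):
        sq = 0.0
        for t in range(1, len(series)):
            x, y = series[t - 1], series[t]
            hidden = [1.0 / (1.0 + math.exp(-(w * x + bi)))
                      for w, bi in zip(W, b)]
            err = (c + sum(vi * h for vi, h in zip(v, hidden))) - y
            sq += err * err
            c -= lr * err
            for j in range(q):
                gh = err * v[j] * hidden[j] * (1.0 - hidden[j])
                v[j] -= lr * err * hidden[j]
                W[j] -= lr * gh * x
                b[j] -= lr * gh
        losses.append(sq / (len(series) - 1))
    return (W, b, v, c), losses
```

On a deterministic nonlinear series (for instance one generated by a logistic map), the training loss falls steadily, which is all this sketch is meant to show.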

17.
Summary A method is described for determining and statistically comparing Michaelis-Menten kinetic parameters. This work was supported in part by a U.S.P.H.S. Grant No. AM 15745 from the National Institute of Arthritis, Metabolism and Digestive Diseases and by the Maine State Agricultural Experiment Station.
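The abstract does not spell out the method, so the following is only a generic sketch of Michaelis-Menten parameter estimation via the Lineweaver-Burk linearization (ordinary least squares on 1/v versus 1/S), not necessarily the procedure of the paper:

```python
# Michaelis-Menten kinetics: v = Vmax * S / (Km + S).
# Lineweaver-Burk linearization: 1/v = (Km/Vmax)*(1/S) + 1/Vmax,
# so a straight-line fit of 1/v on 1/S yields both parameters.

def michaelis_menten_fit(S, v):
    """Return (Vmax, Km) from substrate concentrations S and rates v."""
    xs = [1.0 / s for s in S]
    ys = [1.0 / r for r in v]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    Vmax = 1.0 / intercept
    Km = slope * Vmax
    return Vmax, Km
```

The double-reciprocal transform amplifies error at low rates, which is one reason a statistical comparison of fitted parameters, as the paper proposes, matters in practice.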

18.
In order to investigate the influence of various environmental parameters on melatonin excretion, the night-time urinary melatonin excretion of 16 healthy volunteers was measured in samples collected monthly over a period of one year. No significant interindividual differences were detected in the monthly rate of change of melatonin excretion. A seasonal bimodal pattern did, however, emerge. Peak values were observed in June and November. In these months a combination of high daylength stability and low values of the vertical component of the geomagnetic field was recorded. Trough values were found in April and August–October when low daylength stability was combined with high values of the vertical component of the geomagnetic field. We propose that the daylength variation rate, and the fluctuations of the vertical component of the geomagnetic field, interact to induce the changes in melatonin secretion which signal the different seasons in humans.

19.
Ulrich Meyer’s book The Nature of Time uses tense logic to argue for a ‘modal’ view of time, which replaces substantial times (as in Newton’s Absolute Time) with ‘ersatz times’ constructed using conceptually basic tense operators. He also argues against Bertrand Russell’s relationist theory, in which times are classes of events, and against the idea that relativity compels the integration of time and space (called by Meyer the Inseparability Argument). I find fault with each of these negative arguments, as well as with Meyer’s purported reconstruction of empty spacetime from tense operators and substantial spatial points. I suggest that Meyer’s positive project is best conceived as an elimination of time in the mode of Julian Barbour's The End of Time.

20.