Similar Articles
20 similar articles found.
1.
In this second part of our two-part paper we review and analyse attempts since 1950 to use information-theoretic notions to exorcise Maxwell’s Demon. We argue through a simple dilemma that these attempted exorcisms are ineffective, whether they follow Szilard in seeking a compensating entropy cost in information acquisition or Landauer in seeking that cost in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon.

2.
When considering controversial thermodynamic scenarios such as Maxwell's demon, it is often necessary to consider probabilistic mixtures of macrostates. This raises the question of how, if at all, to assign entropy to them. The information-theoretic entropy is often used in such cases; however, no general proof of the soundness of doing so has been given, and indeed some arguments against doing so have been presented. We offer a general proof of the applicability of the information-theoretic entropy to probabilistic mixtures of macrostates that is based upon a probabilistic generalisation of the Kelvin statement of the second law. We defend the latter and make clear the other assumptions on which our main result depends. We also briefly discuss the interpretation of our result.
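For orientation, the assignment at issue is, in most of this literature, the standard one (the abstract does not display a formula, so take this as background rather than the paper's own statement): a preparation that yields macrostate $M_i$ with probability $p_i$ is assigned the averaged thermodynamic entropies of the macrostates plus a Shannon term for the uncertainty over which macrostate obtains,

$$ S\Big(\sum_i p_i M_i\Big) \;=\; \sum_i p_i\, S(M_i) \;-\; k \sum_i p_i \ln p_i . $$

The question the paper addresses is whether this information-theoretic term has genuine thermodynamic standing, which its probabilistic generalisation of the Kelvin statement is meant to secure.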

3.
The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with certain compelling consistency requirements. This paper reviews these consistency arguments and the surrounding controversy. It is shown that the uniqueness proofs are flawed, or rest on unreasonably strong assumptions. A more general class of inference rules, maximizing the so-called Rényi entropies, is exhibited, all of which fulfill the reasonable part of the consistency assumptions.
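For reference, the Rényi entropies mentioned at the end form a one-parameter family containing the Shannon entropy of Jaynes’s original principle as a limiting case:

$$ H_\alpha(p) \;=\; \frac{1}{1-\alpha}\,\log \sum_i p_i^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1, $$

with $H_\alpha(p) \to -\sum_i p_i \log p_i$ (the Shannon entropy) as $\alpha \to 1$. An entropy-maximizing inference rule then selects, from the set $C$ of distributions compatible with the partial information, $p^{*} = \arg\max_{p \in C} H_\alpha(p)$.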

4.
This paper attempts to argue for the theory-ladenness of evidence. It does so by employing and analysing an episode from the history of eighteenth-century chemistry. It delineates attempts by Joseph Priestley and Antoine Lavoisier to construct entirely different kinds of evidence for and against a particular hypothesis from a set of agreed-upon observations or (raw) data. Based on an augmented version of a distinction, drawn by J. Bogen and J. Woodward, between data and phenomena, it is shown that the role of theoretical auxiliary assumptions is very important in constructing evidence for (or against) a theory from observation or (raw) data. In revolutionary situations, rival groups hold radically different theories and theoretical auxiliary assumptions. These are employed to construct very different evidence from the agreed-upon set of observations or (raw) data. Hence, theory resolution becomes difficult. It is argued that evidence construction is a multi-layered exercise and can be disputed at any level. What counts as unproblematic observation or (raw) data at one level may become problematic at another level. The contingency of these constructions and the (un)problematic nature of evidence are shown to be partially dependent upon the scientific knowledge that the scientific community possesses.

5.
6.
It is generally accepted, following Landauer and Bennett, that the process of measurement involves no minimum entropy cost, but the erasure of information in resetting the memory register of a computer to zero requires dissipating heat into the environment. This thesis has been challenged recently in a two-part article by Earman and Norton. I review some relevant observations in the thermodynamics of computation and argue that Earman and Norton are mistaken: there is in principle no entropy cost to the acquisition of information, but the destruction of information does involve an irreducible entropy cost.
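The quantitative core of the Landauer–Bennett position is the Landauer bound: resetting one bit of memory in an environment at temperature $T$ requires dissipating at least

$$ \Delta E \;\geq\; k_B T \ln 2 $$

of heat, i.e. an entropy increase of at least $k_B \ln 2$ in the environment, whereas measurement, on Bennett’s analysis, can in principle be performed reversibly at zero entropy cost.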

7.
The distribution function associated with a classical gas at equilibrium is considered. We prove that apart from a factorisable multiplier, the distribution function is fully determined by the correlations among local momenta fluctuations. Using this result we discuss the conditions which enable idealised local observers, who are immersed in the gas and form a part of it, to determine the distribution ‘from within’. This analysis sheds light on two views on thermodynamic equilibrium, the ‘ergodic’ and the ‘thermodynamic limit’ schools, and the relations between them. It also provides an outline for a new definition of equilibrium that is weaker than full ergodicity. Finally, we briefly discuss the possibility that the distribution can be determined by external observers.

8.
One finds, in Maxwell's writings on thermodynamics and statistical physics, a conception of the nature of these subjects that differs in interesting ways from the way they are usually conceived. In particular, though—in agreement with the currently accepted view—Maxwell maintains that the second law of thermodynamics, as originally conceived, cannot be strictly true, the replacement he proposes is different from the version accepted by most physicists today. The modification of the second law accepted by most physicists is a probabilistic one: although statistical fluctuations will result in occasional spontaneous differences in temperature or pressure, there is no way to predictably and reliably harness these to produce large violations of the original version of the second law. Maxwell advocates a version of the second law that is strictly weaker; the validity of even this probabilistic version is of limited scope, limited to situations in which we are dealing with large numbers of molecules en masse and have no ability to manipulate individual molecules. Connected with this is his conception of the thermodynamic concepts of heat, work, and entropy; on the Maxwellian view, these are concepts that must be relativized to the means we have available for gathering information about and manipulating physical systems. The Maxwellian view is one that deserves serious consideration in discussions of the foundations of statistical mechanics. It has relevance for the project of recovering thermodynamics from statistical mechanics because, in such a project, it matters which version of the second law we are trying to recover.

9.
Computer simulations at the atomic level have arrived at a stage where they provide realistic modeling of flexibility in proteins (and the mobility of their associated solvent) that is important in understanding the nature of molecular motions. This can now be extended to the molecular and atomic motions that are associated with protein mechanisms. Moreover, the derived data agree reasonably accurately with experimental measurements of several kinetic and thermodynamic parameters. Fundamental insights emerge on the roles that this intrinsic flexibility plays in the thermodynamic characteristics of macromolecules in solution; these equip the investigator to probe the consequences of cognate interactions and ligand binding on entropy and enthalpy. Thus simulations can now provide a powerful tool for investigating protein mechanisms that complements the existing and the emerging experimental techniques.

10.
The thermodynamic study of systems in which stationary (non-equilibrium) states were possible led one of us (I. P.) to a number of general conclusions. In the present paper these conclusions are summarized and briefly discussed from a biological standpoint. It appears that the evolution of such systems is towards states with the least production of entropy (per mass unit) compatible with the conditions imposed. In the case of living matter this corresponds approximately to states of minimum metabolism. During this evolution the entropy contained in the system may decrease whilst the heterogeneity increases. But this increase in heterogeneity can only take place when there is a decrease in the entropy production, that is, an evolution of the metabolism. We are thus led to suggest a physicochemical interpretation of Lamarckism. Finally we call attention to the fact that the moderation principle of Le Chatelier–Braun is not limited to equilibrium states.
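In modern notation, the conclusion summarized here is Prigogine’s minimum entropy production theorem. The entropy balance of an open system splits into an exchange term and a non-negative internal production term,

$$ \frac{dS}{dt} \;=\; \frac{d_e S}{dt} + \frac{d_i S}{dt}, \qquad \frac{d_i S}{dt} \;\geq\; 0, $$

and, in the linear near-equilibrium regime, the stationary states compatible with the imposed constraints are those minimizing the production $d_i S/dt$. The entropy contained in the system can therefore decrease (through a sufficiently negative exchange term) even while internal production remains non-negative, which is how heterogeneity can grow.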

11.
This paper is concerned with one-day-ahead hourly predictions of electricity demand for Puget Power, a local electricity utility for the Seattle area. Standard modelling techniques, including neural networks, will fail when the assumptions of the model are violated. It is demonstrated that typical modelling assumptions such as no outliers or level shifts are incorrect for electric power demand time series. A filter which removes or lessens the significance of outliers and level shifts is demonstrated. This filter produces ‘clean data’, which is used as the basis for future robust predictions. The robust predictions are shown to be better than their non-robust counterparts on electricity load data. The outliers identified by the filter are shown to correspond with suspicious data. Finally, the estimated level shifts are in agreement with the belief that load growth is taking place year to year.
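As an illustration of the kind of cleaning filter described (a minimal sketch, not the authors’ actual algorithm; the window length and threshold are arbitrary choices), the following flags hourly load values that deviate from a local median by more than a few robust standard deviations and replaces them:

```python
import numpy as np

def clean_series(y, window=24, k=4.0):
    """Replace outliers in an hourly load series with a local median.

    A point is flagged when it deviates from the rolling median by more
    than k robust standard deviations, estimated via the median absolute
    deviation (MAD). Illustrative stand-in only, not the paper's filter.
    """
    y = np.asarray(y, dtype=float)
    cleaned = y.copy()
    half = window // 2
    for t in range(len(y)):
        lo, hi = max(0, t - half), min(len(y), t + half + 1)
        local = y[lo:hi]
        med = np.median(local)
        mad = np.median(np.abs(local - med))
        scale = 1.4826 * mad if mad > 0 else 1e-9  # MAD -> std dev under normality
        if abs(y[t] - med) > k * scale:
            cleaned[t] = med  # suspicious value: fall back to the local median
    return cleaned

# Two days of synthetic hourly demand with one injected spike
rng = np.random.default_rng(0)
load = 1000 + 200 * np.sin(np.arange(48) * 2 * np.pi / 24) + rng.normal(0, 10, 48)
load[30] += 800
print(clean_series(load)[30])  # close to the local median; spike removed
```

Robust location and scale estimates (median, MAD) are used precisely so that the outliers to be removed do not contaminate the detection threshold itself.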

12.
Two complementary debates of the turn of the nineteenth and twentieth centuries are examined here: the debate on the legitimacy of hypotheses in the natural sciences and the debate on intentionality and ‘representations without object’ in philosophy. Both are shown to rest on two core issues: the attitude of the subject and the mode of presentation chosen to display a domain of phenomena. An orientation other than the one which contributed to shape twentieth-century philosophy of science is explored through the analysis of the role given to assumptions in Boltzmann’s research strategy, where assumptions are contrasted with hypotheses, axioms, and principles, and in Meinong’s criticism of the privileged status attributed to representations in mental activities. Boltzmann’s computational style in mathematics and Meinong’s criticism of the confusion between representation and judgment give prominence to an indirect mode of presentation, adopted in a state of suspended belief which is characteristic of assumptions and which enables one to grasp objects that cannot be reached through direct representation or even analogies. The discussion shows how assumptions and the movement to fiction can be essential steps in the quest for objectivity. The conclusion restates the issues of the two debates in a contemporary perspective and shows how recent developments in philosophy of science and philosophy of language and mind can be brought together by arguing for a twofold conception of reference.

13.
In The Theory of Relativity and A Priori Knowledge (1920b), Reichenbach developed an original account of cognition as coordination of formal structures to empirical ones. One of the most salient features of this account is that it is explicitly not a top-down type of coordination, and in fact it is crucially “directed” by the empirical side. Reichenbach called this feature “the mutuality of coordination” but, in that work, did not elaborate sufficiently on how this is supposed to work. In a paper that he wrote less than two years afterwards (but that he published only in 1932), “The Principle of Causality and the Possibility of its Empirical Confirmation” (1923/1932), he described what seems to be a model for this idea, now within an analysis of causality that results in an account of scientific inference. Recent reassessments of his early proposal do not seem to capture the extent of Reichenbach's original worries. The present paper analyses Reichenbach's early account and suggests a new way to look at his early work. According to it, we perform measurements, individuate parameters, collect and analyse data, by using a “constructive” approach, such as the one with which we formulate and test hypotheses, which paradigmatically requires some simplicity assumptions. Reichenbach's attempt to account for all these aspects in 1923 was obviously limited and naive in many ways, but it shows that, in his view, there were multiple ways in which the idea of “constitution” is embodied in scientific practice.

14.
The microscopic explanation of the physical phenomena represented by a macroscopic theory is often cast in terms of the reduction of the latter to a more fundamental theory, which represents the same phenomena at the microscopic level, albeit in an idealized way. In particular, the reduction of thermodynamics to statistical mechanics is a much discussed case-study in philosophy of physics. Based on the Generalized Nagel–Schaffner model, the alleged reductive explanation would be accomplished if one finds a corrected version of classical thermodynamics that can be strictly derived from statistical mechanics. That is the sense in which, according to Callender (1999, 2001), one should not take thermodynamics too seriously. Arguably, the sought-after revision is given by statistical thermodynamics, intended as a macroscopic theory equipped with a probabilistic law of equilibrium fluctuations. The present paper aims to evaluate this proposal. The upshot is that, while statistical thermodynamics enables one to re-define equilibrium so as to agree with Boltzmann entropy, it does not provide a definitive solution to the problem of explaining macroscopic irreversibility from a microscopic point of view.
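The Boltzmann entropy invoked here is the standard one: for a system whose microstate lies in the phase-space region $\Gamma_M$ associated with macrostate $M$,

$$ S_B(M) \;=\; k_B \log \left|\Gamma_M\right| , $$

where $\left|\Gamma_M\right|$ is the Liouville volume of that region. Re-defining equilibrium within statistical thermodynamics is meant to bring the thermodynamic entropy into agreement with this quantity, equilibrium fluctuations included.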

15.
Curie’s Principle says that any symmetry property of a cause must be found in its effect. In this article, I consider Curie’s Principle from the point of view of graphical causal models, and demonstrate that, under one definition of a symmetry transformation, the causal modeling framework does not require anything like Curie’s Principle to be true. On another definition of a symmetry transformation, the graphical causal modeling formalism does imply a version of Curie’s Principle. These results yield a better understanding of the logical landscape with respect to the relationship between Curie’s Principle and graphical causal modeling.

16.
The economy–resource–environment (ERE) system is a typical open, complex, self-organizing system. This paper introduces the concept of generalized information entropy into complex ERE systems and establishes a universal evolution equation for their structural self-organization, yielding a new method for modeling complex systems that can reveal the multi-agent interaction dynamics and the structural self-organization process within complex ERE systems. Taking the ERE development model of China's national sustainable development experimental zones as an example, and using indicator data for 1994–2005, the development and evolution process is simulated in detail; the results show that the improvement and refinement of the sustainable development system brought about by implementing the experimental zone plans is itself a complex self-organizing process.

17.
Many studies have shown that, in general, a combination of forecasts often outperforms the forecasts of a single model or expert. In this paper we postulate that obtaining forecasts is costly, and provide models for optimally selecting them. Based on normality assumptions, we derive a dynamic programming procedure for maximizing precision net of cost. We examine the solution for cases where the forecasters are independent, correlated and biased. We provide illustrative examples for each case.
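As a hedged sketch of the selection problem in the simplest case the paper treats, independent and unbiased forecasters (the paper derives a dynamic-programming procedure under normality; the exhaustive search and the numbers below are purely illustrative):

```python
from itertools import combinations

def best_subset(variances, costs, lam=1.0):
    """Select forecasters maximizing combined precision net of cost.

    For independent, unbiased forecasts combined by inverse-variance
    weighting, the combined precision is the sum of the individual
    precisions 1/sigma_i^2. Each subset is scored by that precision
    minus lam times its total cost. Illustrative only: the paper uses
    dynamic programming; brute force suffices for small n.
    """
    n = len(variances)
    best, best_score = (), float("-inf")
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            precision = sum(1.0 / variances[i] for i in subset)
            score = precision - lam * sum(costs[i] for i in subset)
            if score > best_score:
                best, best_score = subset, score
    return best, best_score

print(best_subset(variances=[1.0, 4.0, 0.5], costs=[0.3, 0.1, 1.5]))
```

Because precisions add in the independent case, each forecaster’s net contribution $1/\sigma_i^2 - \lambda c_i$ can be judged separately; correlation or bias breaks this additivity, which is what makes the paper’s harder cases require a genuine optimization procedure.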

18.
In the Second Analogy, Kant argues that every event has a cause. It remains disputed what this conclusion amounts to. Does Kant argue only for the Weak Causal Principle that every event has some cause, or for the Strong Causal Principle that every event is produced according to a universal causal law? Existing interpretations have assumed that, by Kant’s lights, there is a substantive difference between the two. I argue that this is false. Kant holds that the concept of cause contains the notion of lawful connection, so it is analytic that causes operate according to universal laws. He is explicit about this commitment, not least in his derivation of the Categorical Imperative in Groundwork III. Consequently, Kant’s move from causal rules to universal laws is much simpler than previously assumed. Given his commitments, establishing the Strong Causal Principle requires no more argument than establishing the Weak Causal Principle.

19.
The development and maturation of an oligodendroglial cell comprises three intimately related processes: proliferation, differentiation, and myelination. Here we review how proliferation and differentiation are controlled by distinct molecular mechanisms and discuss whether differentiation is merely a default of inhibited proliferation. We then address whether differentiation and myelination can be uncoupled in a similar manner. This task is particularly challenging because an oligodendrocyte cannot myelinate without first differentiating, and these processes are therefore not mutually exclusive. Is it solely the presence of the axon that distinguishes a differentiated oligodendrocyte from a myelinating one? Uncoupling these two processes requires identifying specific signals that regulate myelination without affecting the differentiation process. We will review current understanding of the relationship between differentiation and myelination and discuss whether these two processes can truly be uncoupled.

20.
Given the complexity of the factors that influence typhoon rainfall and the incompleteness of existing reservoir typhoon telemetry data, this study applies an attribute-reduction method for incomplete information systems, built on information entropy and conditional information entropy, to mine the factors influencing typhoon-induced rainfall in reservoir areas. Based on typhoon characteristic parameters and real-time rainfall monitoring data for a typical reservoir, and leaving environmental factors aside, it focuses on the influence on reservoir-area rainfall of typical typhoon characteristics, including wind speed, wind force, the radii of the force-7 and force-10 wind circles, central pressure, and the distance between the typhoon center and the reservoir area. The information reduction identifies wind speed, central pressure, and distance to the reservoir area as the main factors influencing reservoir-area rainfall. The results offer guidance for reservoir typhoon monitoring and for analysing rainfall intensity in reservoir areas.
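A minimal sketch of the entropy calculation that underlies this style of attribute reduction (the records below are hypothetical and already discretized; the study itself uses a reduction method adapted to incomplete information systems):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(X) of a sequence of discrete labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def conditional_entropy(attr, decision):
    """H(D | A): uncertainty about the decision D left after seeing attribute A."""
    n = len(attr)
    groups = {}
    for a, d in zip(attr, decision):
        groups.setdefault(a, []).append(d)
    return sum(len(g) / n * entropy(g) for g in groups.values())

# Hypothetical discretized records: typhoon attributes vs. reservoir rainfall class
wind  = ["high", "high", "low", "low", "high", "low"]
press = ["low", "low", "high", "high", "low", "high"]
force = ["7", "10", "7", "10", "7", "10"]
rain  = ["heavy", "heavy", "light", "light", "heavy", "light"]

for name, attr in [("wind_speed", wind), ("central_pressure", press), ("wind_force", force)]:
    # Lower H(D|A) means the attribute explains more of the rainfall variation
    print(name, round(conditional_entropy(attr, rain), 3))
```

Attributes whose removal leaves the conditional entropy of the decision unchanged are redundant and can be dropped from the reduct; those that remain are the "main influencing factors" in the sense used above.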
