Similar Documents
 20 similar documents found.
1.
It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are riddled with problems. This paper considers the theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose, in some form, limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. In conclusion, the paper suggests that the irreversibility and stochasticity of statistical mechanics should be taken as a real property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wavelength, which behave quantum mechanically below this scale but classically sufficiently far beyond it. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.
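For reference, the thermal de Broglie wavelength that sets the size scale of these wave packets is the standard textbook expression (our notation, not drawn from the paper):

\[
\lambda_{\mathrm{th}} = \frac{h}{\sqrt{2\pi m k_{B} T}},
\]

where \(m\) is the atomic mass, \(T\) the temperature, \(k_{B}\) Boltzmann's constant and \(h\) Planck's constant. For a gas of atoms in thermal equilibrium, this is the scale below which the proposed wave packets behave quantum mechanically and sufficiently far above which they behave classically.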

2.
I propose a general geometric framework in which to discuss the existence of time observables. This framework allows one to describe a local sense in which time observables always exist, and a global sense in which they can sometimes exist subject to a restriction on the vector fields that they generate. Pauli's prohibition on quantum time observables is derived as a corollary to this result. I will then discuss how time observables can be regained in modest extensions of quantum theory beyond its standard formulation.
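For context, the textbook version of Pauli's prohibition alluded to here runs roughly as follows (standard argument and notation, not taken from the paper). A self-adjoint time operator \(T\) canonically conjugate to the Hamiltonian would generate translations in energy,

\[
[T, H] = i\hbar \;\Longrightarrow\; e^{i\varepsilon T/\hbar}\, H\, e^{-i\varepsilon T/\hbar} = H - \varepsilon \quad \text{for all } \varepsilon \in \mathbb{R},
\]

so the spectrum of \(H\) would have to be the entire real line, contradicting the requirement that the energy be bounded from below.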

3.
Summary The explosion of new techniques, made available by the rapid advance in molecular biology, has provided a battery of novel approaches and technology which can be applied to more practical issues such as the epidemiology of parasites. In this review, we discuss the ways in which this new field of molecular epidemiology has contributed to and corroborated our existing knowledge of parasite epidemiology. Similar epidemiological questions can be asked about many different types of parasites and, using detailed examples such as the African trypanosomes and the Leishmania parasites, we discuss the techniques and the methodologies that have been or could be employed to solve many of these epidemiological problems.

4.
Our paper discusses the epistemic attitudes of particle physicists on the discovery of the Higgs boson at the Large Hadron Collider (LHC). It is based on questionnaires and interviews made shortly before and shortly after the discovery in 2012. We show, to begin with, that the discovery of a Standard Model (SM) Higgs boson was less expected than is sometimes assumed. Once the new particle was shown to have properties consistent with SM expectations – albeit with significant experimental uncertainties – there was broad agreement that ‘a’ Higgs boson had been found. Physicists adopted a two-pronged strategy. On the one hand, they treated the particle as a SM Higgs boson and tried to establish its properties with higher precision; on the other hand, they searched for any hints of physics beyond the SM. This motivates our first philosophical thesis: the Higgs discovery, being of fundamental importance and establishing a new kind of particle, represented a crucial experiment if one interprets this notion in an appropriate sense. Duhemian underdetermination is kept at bay by embedding the LHC into the tradition of previous precision experiments and the experimental strategies thus established. Second, our case study suggests that criteria of theory (or model) preference should be understood as epistemic and pragmatic values that have to be weighed in factual research practice. The Higgs discovery led to a shift from pragmatic to epistemic values in physicists’ assessment of the mechanisms of electroweak symmetry breaking. Complex criteria, such as naturalness, combine epistemic and pragmatic values, but are coherently applied by the community.

5.
Curie’s Principle says that any symmetry property of a cause must be found in its effect. In this article, I consider Curie’s Principle from the point of view of graphical causal models, and demonstrate that, under one definition of a symmetry transformation, the causal modeling framework does not require anything like Curie’s Principle to be true. On another definition of a symmetry transformation, the graphical causal modeling formalism does imply a version of Curie’s Principle. These results yield a better understanding of the logical landscape with respect to the relationship between Curie’s Principle and graphical causal modeling.

6.
G. Hide, A. Tait, Experientia 1991, 47(2): 128–142
The explosion of new techniques, made available by the rapid advance in molecular biology, has provided a battery of novel approaches and technology which can be applied to more practical issues such as the epidemiology of parasites. In this review, we discuss the ways in which this new field of molecular epidemiology has contributed to and corroborated our existing knowledge of parasite epidemiology. Similar epidemiological questions can be asked about many different types of parasites and, using detailed examples such as the African trypanosomes and the Leishmania parasites, we discuss the techniques and the methodologies that have been or could be employed to solve many of these epidemiological problems.

7.
During the 1960s and 1970s population geneticists pushed beyond models of single genes to grapple with the effect on evolution of multiple genes associated by linkage. The resulting models of multiple interacting loci suggested that blocks of genes, maybe even entire chromosomes or the genome itself, should be treated as a unit. In this context, Richard Lewontin wrote his famous 1974 book The Genetic Basis of Evolutionary Change, which concludes with an argument for considering the entire genome as the unit of selection as a result of linkage. Why did Lewontin and others devote so much intellectual energy to the “complications of linkage” in the 1960s and 1970s? We argue that this attention to linkage should be understood in the context of research on chromosomal inversions and co-adapted gene complexes that occupied mid-century evolutionary genetics. For Lewontin, the complications of linkage were an extension of this chromosomal focus expressed in the new language of models for linkage disequilibrium.

8.
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful however in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. 
However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.

9.
Quine is routinely perceived as having changed his mind about the scope of the Duhem-Quine thesis, shifting from what has been called an 'extreme holism' to a more moderate view. Where the Quine of 'Two Dogmas of Empiricism' argues that “the unit of empirical significance is the whole of science” (1951, 42), the later Quine seems to back away from this “needlessly strong statement of holism” (1991, 393). In this paper, I show that the received view is incorrect. I distinguish three ways in which Quine's early holism can be said to be wide-scoped and show that he has never changed his mind about any one of these aspects of his early view. Instead, I argue that Quine's apparent change of mind can be explained away as a mere shift of emphasis.

10.
Brand choice decisions with multiple alternatives have been successfully modelled for more than a decade using the Multinomial Logit model. Recently, neural network modelling has received increasing attention and has been applied to an array of marketing problems such as market response or segmentation. We show that a Feedforward Neural Network with Softmax output units and shared weights can be viewed as a generalization of the Multinomial Logit model. The main difference between the two approaches lies in the ability of neural networks to model non‐linear preferences with few (if any) a priori assumptions about the nature of the underlying utility function, while the Multinomial Logit can suffer from a specification bias. Being complementary, these approaches are combined into a single framework. The neural network is used as a diagnostic and specification tool for the Logit model, which will provide interpretable coefficients and significance statistics. The method is illustrated on an artificial dataset where the market is heterogeneous. We then apply the approach to panel scanner data of purchase records, using the Logit to analyse the non‐linearities detected by the neural network. Copyright © 2000 John Wiley & Sons, Ltd.
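To make the stated relationship concrete, here is a minimal sketch (our own illustration; the layer sizes, attribute dimensions and variable names are assumptions, not taken from the paper). In the Multinomial Logit, choice probabilities are a softmax of utilities that are linear in the alternatives' attributes; a feedforward network with softmax outputs and weights shared across alternatives replaces that linear utility with a learned non-linear one.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multinomial_logit(X, beta):
    """Multinomial Logit: probabilities are a softmax of linear utilities.

    X    : (n_obs, n_alternatives, n_features) attributes of each alternative
    beta : (n_features,) coefficients shared across alternatives
    """
    return softmax(X @ beta)

def shallow_softmax_net(X, W1, b1, W2, b2):
    """One-hidden-layer feedforward net with softmax output; the weights are
    shared across alternatives, so every alternative's utility is the same
    (non-linear) function of its own attributes."""
    h = np.tanh(X @ W1 + b1)                  # non-linear transform of attributes
    utilities = (h @ W2 + b2).squeeze(-1)     # one scalar utility per alternative
    return softmax(utilities)

# Toy data: 5 choice situations, 3 brands, 4 attributes per brand.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3, 4))

print(multinomial_logit(X, rng.normal(size=4)).round(3))

W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
print(shallow_softmax_net(X, W1, b1, W2, b2).round(3))
```

With the hidden layer removed, the network's utilities are again linear in the attributes and it collapses to the Multinomial Logit, which is the sense in which the former generalizes the latter.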

11.
The linear multiregression dynamic model (LMDM) is a Bayesian dynamic model which preserves any conditional independence and causal structure across a multivariate time series. The conditional independence structure is used to model the multivariate series by separate (conditional) univariate dynamic linear models, where each series has contemporaneous variables as regressors in its model. Calculating the forecast covariance matrix (which is required for calculating forecast variances in the LMDM) is not always straightforward in its current formulation. In this paper we introduce a simple algebraic form for calculating LMDM forecast covariances. Calculation of the covariance between model regression components can also be useful and we shall present a simple algebraic method for calculating these component covariances. In the LMDM formulation, certain pairs of series are constrained to have zero forecast covariance. We shall also introduce a possible method to relax this restriction. Copyright © 2008 John Wiley & Sons, Ltd.

12.
Structural symmetry is observed in the majority of fundamental protein folds, and gene duplication and fusion evolutionary processes are postulated to be responsible. However, convergent evolution leading to structural symmetry has also been proposed; additionally, there is debate regarding the extent to which exact primary structure symmetry is compatible with efficient protein folding. Issues of symmetry in protein evolution directly impact strategies for de novo protein design, as symmetry can substantially simplify the design process. Additionally, when considering gene duplication and fusion in protein evolution, there are two competing models: “emergent architecture” and “conserved architecture”. Recent experimental work has shed light on both the evolutionary process leading to symmetric protein folds and the ability of symmetric primary structure to fold efficiently. Such studies largely support a “conserved architecture” evolutionary model, suggesting that complex protein architecture was an early evolutionary achievement involving oligomerization of smaller polypeptides.

13.
In this paper I argue that the Strong Programme’s aim to provide robust explanations of belief acquisition is limited by its commitment to the symmetry principle. For Bloor and Barnes, the symmetry principle is intended to drive home the fact that epistemic norms are socially constituted. My argument here is that even if our epistemic standards are fully naturalized—even relativized—they nevertheless can play a pivotal role in why individuals adopt the beliefs that they do. Indeed, sometimes the fact that a belief is locally endorsed as rational is the only reason why an individual holds it. In this way, norms of rationality have a powerful and unique role in belief formation. But if this is true then the symmetry principle’s emphasis on ‘sameness of type’ is misguided. It has the undesirable effect of not just naturalizing our cognitive commitments, but trivializing them. Indeed, if the notion of ‘similarity’ is to have any content, then we are not going to classify beliefs formed in accordance with deeply entrenched epistemic norms as ‘the same’ as beliefs formed without reflection on these norms, or formed in spite of them. My suggestion here is that we give up the symmetry principle in favor of a more sophisticated principle, one that allows for a taxonomy of causes rich enough to allow us to delineate the unique impact epistemic norms have on those individuals who subscribe to them.

14.
Everett's interpretation of quantum mechanics was proposed to avoid problems inherent in the prevailing interpretational frame. It assumes that quantum mechanics can be applied to any system and that the state vector always evolves unitarily. It then claims that whenever an observable is measured, all possible results of the measurement exist. This notion of multiplicity has been understood in different ways by proponents of Everett's theory. In fact the spectrum of opinions on various ontological questions raised by Everett's approach is rather large, as we attempt to document in this critical review. We conclude that much remains to be done to clarify and specify Everett's approach.

15.
It is well understood that the standard formulation for the variance of a regression‐model forecast produces interval estimates that are too narrow, principally because it ignores regressor forecast error. While the theoretical problem has been addressed, there has not been an adequate explanation of the effect of regressor forecast error, and the empirical literature has supplied a disparate variety of bits and pieces of evidence. Most business‐forecasting software programs continue to supply only the standard formulation. This paper extends existing analysis to derive and evaluate large‐sample approximations for the forecast error variance in a single‐equation regression model. We show how these approximations substantially clarify the expected effects of regressor forecast error. We then present a case study, which (a) demonstrates how rolling out‐of‐sample evaluations can be applied to obtain empirical estimates of the forecast error variance, (b) shows that these estimates are consistent with our large‐sample approximations and (c) illustrates, for ‘typical’ data, how seriously the standard formulation can understate the forecast error variance. Copyright © 2000 John Wiley & Sons, Ltd.
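For orientation, the 'standard formulation' criticized here is the familiar conditional forecast error variance of the single-equation regression model (our notation, not the paper's):

\[
\operatorname{Var}\!\left(y_f - \hat{y}_f \mid x_f\right) = \sigma^2 \left(1 + x_f^{\top}\left(X^{\top} X\right)^{-1} x_f\right),
\]

which conditions on the future regressor vector \(x_f\) being known exactly. When \(x_f\) must itself be forecast, its forecast error adds a further component that this expression omits, which is the effect the paper's large-sample approximations are intended to capture.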

16.
It is argued that de Vries did not see Mendel's paper until 1900, and that, while his own theory of inheritance may have incorporated the notion of independent units, this pre-Mendelian formulation was not the same as Mendel's since it did not apply to paired hereditary units. Moreover, the way in which the term ‘segregation’ has been applied in the secondary literature has blurred the distinction between what is explained and the law which facilitates explanation.

17.
18.
Testing for the existence of a unit root and/or a level change is necessary in order to understand the underlying processes of a time series. Many studies so far have focused on only one of these two aspects, which limits a full assessment of the problem. Our study addresses this by testing the two hypotheses simultaneously. We derive the likelihood ratio test statistic based on a state space model, and its null distribution is generated by simulation. The performance of the proposed method is validated on simulated time series, and the method is also applied to two Korean macroeconomic time series to confirm its practical applicability. This analysis can help determine the underlying structure of disputed time series. Copyright © 2009 John Wiley & Sons, Ltd.
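The simulation step described here (obtaining the null distribution of a test statistic by Monte Carlo rather than from tabulated critical values) can be sketched generically as follows; the random-walk null and the Dickey-Fuller-style statistic below are illustrative placeholders, not the authors' state-space likelihood ratio statistic.

```python
import numpy as np

def simulate_null_distribution(statistic, simulate_null_series, n_obs,
                               n_reps=2000, seed=0):
    """Monte Carlo draws of a test statistic under a user-supplied null model."""
    rng = np.random.default_rng(seed)
    return np.array([statistic(simulate_null_series(n_obs, rng))
                     for _ in range(n_reps)])

# Placeholder null model: a pure random walk (unit root, no level change).
def random_walk(n, rng):
    return np.cumsum(rng.normal(size=n))

# Placeholder statistic: squared t-ratio of the lagged level in a
# Dickey-Fuller-style regression of the differences on the lagged level.
def df_like_stat(y):
    dy, lag = np.diff(y), y[:-1]
    beta = (lag @ dy) / (lag @ lag)
    resid = dy - beta * lag
    se = np.sqrt(resid.var(ddof=1) / (lag @ lag))
    return (beta / se) ** 2

null_draws = simulate_null_distribution(df_like_stat, random_walk, n_obs=200)
critical_value = np.quantile(null_draws, 0.95)  # simulated 5% critical value
print(round(float(critical_value), 3))
```

An observed statistic is then compared against the simulated critical value (or a p-value is read off the simulated distribution), which is the same logic the paper applies to its joint unit-root and level-change test.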

19.
Despite remarkable efforts, it remains notoriously difficult to equip quantum theory with a coherent ontology. Hence, Healey (2017, 12) has recently suggested that “quantum theory has no physical ontology and states no facts about physical objects or events”, and Fuchs et al. (2014, 752) similarly hold that “quantum mechanics itself does not deal directly with the objective world”. While intriguing, these positions either raise the question of how talk of ‘physical reality’ can even remain meaningful, or they must ultimately embrace a hidden-variables view, in tension with their original project. I here offer a neo-Kantian alternative. In particular, I will show how constitutive elements in the sense of Reichenbach (1920) and Friedman (1999, 2001) can be identified within quantum theory, through considerations of symmetries that allow the constitution of a ‘quantum reality’, without invoking any notion of a radically mind-independent reality. The resulting conception will inherit elements from pragmatist and ‘QBist’ approaches, but also differ from them in crucial respects. Furthermore, going beyond the Friedmanian program, I will show how non-fundamental and approximate symmetries can be relevant for identifying constitutive principles.

20.
Can stable regularities be explained without appealing to governing laws or any other modal notion? In this paper, I consider what I will call a ‘Humean system’—a generic dynamical system without guiding laws—and assess whether it could display stable regularities. First, I present what can be interpreted as an account of the rise of stable regularities, following from Strevens (2003), which has been applied to explain the patterns of complex systems (such as those from meteorology and statistical mechanics). Second, since this account presupposes that the underlying dynamics displays deterministic chaos, I assess whether it can be adapted to cases where the underlying dynamics is not chaotic but truly random—that is, cases where there is no dynamics guiding the time evolution of the system. If this is so, the resulting stable, apparently non-accidental regularities are the fruit of what can be called statistical necessity rather than of a primitive physical necessity.
