Similar Documents
20 similar documents found (search time: 390 ms)
1.
Structural symmetry is observed in the majority of fundamental protein folds, and evolutionary processes of gene duplication and fusion are postulated to be responsible. However, convergent evolution leading to structural symmetry has also been proposed; additionally, there is debate regarding the extent to which exact primary-structure symmetry is compatible with efficient protein folding. Issues of symmetry in protein evolution directly impact strategies for de novo protein design, as symmetry can substantially simplify the design process. Additionally, when considering gene duplication and fusion in protein evolution, there are two competing models: “emergent architecture” and “conserved architecture”. Recent experimental work has shed light both on the evolutionary process leading to symmetric protein folds and on the ability of symmetric primary structure to fold efficiently. Such studies largely support a “conserved architecture” evolutionary model, suggesting that complex protein architecture was an early evolutionary achievement involving oligomerization of smaller polypeptides.

2.
In the published version of Hugh Everett III's doctoral dissertation, he inserted what has become a famous footnote, the “note added in proof”. This footnote is often cited as the strongest evidence for various interpretations of Everett (the many-worlds, many-minds, many-histories and many-threads interpretations). In this paper I propose a new interpretation of the footnote, one supported by evidence found in letters written to and by Everett, and one suggested by a new reading of Everett that takes seriously the central position of relative states in his pure wave mechanics: the relative-facts interpretation. Of central interest in this paper is how to make sense of Everett's claim in the “note added in proof” that “all elements of a superposition (all ‘branches’) are ‘actual,’ none any more ‘real’ than the rest.”
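For orientation, the relative-state structure that the footnote glosses can be written schematically. This is the standard textbook decomposition of pure wave mechanics, not notation taken from the paper itself; the labels S and O (system and observer) are illustrative:

    \[ |\Psi\rangle \;=\; \sum_i c_i \, |s_i\rangle_S \otimes |o_i\rangle_O \]

Each |o_i⟩ is the observer state relative to the system state |s_i⟩; every term in the sum is a “branch”, and since the linear dynamics singles out no coefficient c_i, each branch is on this reading equally “actual”.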

3.
4.
Naturalized metaphysics remains the default presupposition of much contemporary philosophy of physics. As metaphysics is supposed to concern the general structure of reality, so scientific naturalism draws upon our best physical theories to attempt to answer the foundational question par excellence, viz., “how could the world possibly be the way this theory says it is?” A particular case study, Hilbert's attempt to analyze and explain a seeming “pre-established harmony” between mind and nature, is offered as a salutary reminder that naturalism's ready inference from physical theory to ontology may be too quick.

5.
Inferentialists about scientific representation hold that an apparatus's representing a target system consists in the apparatus allowing “surrogative inferences” about the target. I argue that a serious problem for inferentialism arises from the fact that many scientific theories and models contain internal inconsistencies. Inferentialism, left unamended, implies that inconsistent scientific models have unlimited representational power, since an inconsistency permits any conclusion to be inferred. I consider a number of ways that inferentialists can respond to this challenge before suggesting my own solution. I develop an analogy to exploitable glitches in a game. Even though inconsistent representational apparatuses may in some sense allow for contradictions to be generated within them, doing so violates the intended function of the apparatus's parts and hence violates representational “gameplay”.

6.
This paper takes up Huw Price's challenge to develop a retrocausal toy model of the Bell-EPR experiment. I develop three such models, which show that a consistent, local, hidden-variables interpretation of the EPR experiment is indeed possible, and which give a feel for the kind of retrocausation involved. The first of the models also makes clear a problematic feature of retrocausation: it seems that we cannot interpret the hidden elements of reality in a retrocausal model as possessing determinate dispositions to affect the outcomes of experiments. Price has embraced this feature, but Gordon Belot has argued that it renders retrocausal interpretations “unsuitable for formal development”, and the lack of determinate dispositions threatens to undermine the motivation for hidden-variables interpretations in the first place. Price and Belot are, however, both too quick in their assessment. I show that determinate dispositions are indeed consistent with retrocausation. What is more, I show that the ontological economy allowed by retrocausation holds out the promise of a classical understanding of spin and polarization.
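As background to what any such toy model must reproduce (standard textbook material, not drawn from the paper itself): for the spin singlet, quantum mechanics predicts the correlation

    \[ E(\mathbf{a},\mathbf{b}) \;=\; -\,\mathbf{a}\cdot\mathbf{b} \;=\; -\cos\theta, \]

whereas any local hidden-variables model in which the distribution of hidden states λ is independent of the detector settings satisfies the CHSH bound

    \[ S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2, \]

which quantum mechanics violates (|S| reaches 2\sqrt{2}). Retrocausal models evade the bound by dropping the settings-independence assumption: the hidden state is allowed to depend on the (future) measurement settings, which is what makes a local hidden-variables account of the EPR correlations possible.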

7.
A number of scholars have recently drawn attention to the importance of iteration in scientific research. This paper builds on these previous discussions by drawing a distinction between epistemic and methodological forms of iteration and by clarifying the relationships between them. As defined here, epistemic iteration involves progressive alterations to scientific knowledge claims, whereas methodological iteration refers to an interplay between different modes of research practice. While distinct, these two forms of iteration are related in important ways. Contemporary research on the biological effects of nanomaterials illustrates that methodological iteration can help to “initiate,” “equip,” and “stimulate” epistemic iteration.

8.
During the 1960s and 1970s population geneticists pushed beyond models of single genes to grapple with the effect on evolution of multiple genes associated by linkage. The resulting models of multiple interacting loci suggested that blocks of genes, maybe even entire chromosomes or the genome itself, should be treated as a unit. In this context, Richard Lewontin wrote his famous 1974 book The Genetic Basis of Evolutionary Change, which concludes by arguing that, as a result of linkage, the entire genome should be considered the unit of selection. Why did Lewontin and others devote so much intellectual energy to the “complications of linkage” in the 1960s and 1970s? We argue that this attention to linkage should be understood in the context of the research on chromosomal inversions and co-adapted gene complexes that occupied mid-century evolutionary genetics. For Lewontin, the complications of linkage were an extension of this chromosomal focus, expressed in the new language of models for linkage disequilibrium.
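For reference, the central quantity in these models, in standard population-genetics notation (not specific to Lewontin's own presentation): for two loci with alleles A and B, linkage disequilibrium is the deviation of the haplotype frequency from the product of the allele frequencies,

    \[ D \;=\; p_{AB} - p_A\,p_B, \]

and under random mating it decays each generation by the recombination fraction r:

    \[ D_t \;=\; D_0\,(1-r)^t. \]

Tight linkage (small r) therefore lets allelic associations persist for many generations, which is what makes blocks of linked genes behave as units rather than as independent loci.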

9.
The term “analogy” stands for a variety of methodological practices all related in one way or another to the idea of proportionality. We claim that in his first substantial contribution to electromagnetism, James Clerk Maxwell developed a methodology of analogy that was completely new at the time; to borrow John North's expression, it was a “newly contrived analogue”. In his initial response to Michael Faraday's experimental researches in electromagnetism, Maxwell did not seek an analogy with some physical system in a domain different from electromagnetism, as advocated by William Thomson; rather, he constructed an entirely artificial one to suit his needs. Following North, we claim that the modification which Maxwell introduced to the methodology of analogy has not been properly appreciated. In view of our examination of the evidence, we argue that Maxwell gave a new meaning to analogy; in fact, it comes close to modeling in current usage.

10.
Building on Norton's “material theory of induction,” this paper shows through careful historical analysis that analogy can act as a methodological principle or stratagem, providing experimentalists with a useful framework to assess data and devise novel experiments. Although this particular case study focuses on late eighteenth- and early nineteenth-century experiments on the properties and composition of acids, the results of this investigation may be extended and applied to other research programs. I argue that analogy, occupying a stage in between what Steinle calls “exploratory experimentation” and robust theory, encouraged research to substantiate why the likenesses should outweigh the differences (or vice versa) when evaluating results and designing experiments.

11.
Projections of future climate change cannot rely on a single model. It has become common to rely on multiple simulations generated by Multi-Model Ensembles (MMEs), especially to quantify the uncertainty about what would constitute an adequate model structure. But, as Parker (2018) points out, one of the remaining philosophically interesting questions is: “How can ensemble studies be designed so that they probe uncertainty in desired ways?” This paper offers two interpretations of what General Circulation Models (GCMs) are and how MMEs made of GCMs should be designed. In the first interpretation, models are combinations of modules and parameterisations; an MME is obtained by “plugging and playing” with interchangeable modules and parameterisations. In the second interpretation, models are aggregations of expert judgements that result from a history of epistemic decisions made by scientists about the choice of representations; an MME is a sampling of expert judgements from modelling teams. We argue that, while the two interpretations involve distinct domains from philosophy of science and social epistemology, both could be used in a complementary manner to explore ways of designing better MMEs.
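The first, “plug and play” interpretation has a simple combinatorial reading, sketched below. The sketch is purely illustrative: the component names and the build_model helper are hypothetical stand-ins, not drawn from any actual GCM codebase or from the paper.

    from itertools import product

    # Hypothetical interchangeable components of a GCM (illustrative names only).
    atmosphere_modules = ["spectral_dynamics", "finite_volume_dynamics"]
    ocean_modules = ["rigid_lid", "free_surface"]
    convection_schemes = ["mass_flux", "adjustment"]  # parameterisations

    def build_model(atmosphere, ocean, convection):
        """Stand-in for assembling one model variant from chosen components."""
        return {"atmosphere": atmosphere, "ocean": ocean, "convection": convection}

    # A "plug and play" MME: one model per combination of components,
    # probing structural uncertainty by exhaustive recombination.
    ensemble = [build_model(a, o, c)
                for a, o, c in product(atmosphere_modules, ocean_modules,
                                       convection_schemes)]

    print(len(ensemble))  # 2 * 2 * 2 = 8 model variants

On the second interpretation, by contrast, ensemble members are not free recombinations but whole models sampled as units, each embodying its modelling team's accumulated epistemic decisions.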

12.
Work throughout the history and philosophy of biology frequently employs ‘chance’, ‘unpredictability’, ‘probability’, and many similar terms. One common way of understanding how these concepts were introduced in evolution focuses on two central issues: the first use of statistical methods in evolution (Galton), and the first use of the concept of “objective chance” in evolution (Wright). I argue that while this approach has merit, it fails to fully capture interesting philosophical reflections on the role of chance expounded by two of Galton's students, Karl Pearson and W.F.R. Weldon. Considering a question more familiar from contemporary philosophy of biology—the relationship between our statistical theories of evolution and the processes in the world those theories describe—is, I claim, a more fruitful way to approach both these two historical actors and the broader development of chance in evolution.

13.
Understanding complex physical systems through the use of simulations often takes on a narrative character. That is, scientists using simulations seek an understanding of processes occurring in time by generating them from a dynamic model, thereby producing something like a historical narrative. This paper focuses on simulations of the Diels-Alder reaction, which is widely used in organic chemistry. It calls on several well-known works on historical narrative to draw out the ways in which use of these simulations mirrors aspects of narrative understanding: Gallie for “followability” and “contingency”; Mink for “synoptic judgment”; Ricoeur for “temporal dialectic”; and Hawthorn for a related dialectic of the “actual and the possible”. Through these reflections on narrative, the paper aims for a better grasp of the role that temporal development sometimes plays in understanding physical processes and of how considerations of possibility enhance that understanding.

14.
The Higgs naturalness principle served as the basis for the so-far failed prediction that signatures of physics beyond the Standard Model (SM) would be discovered at the LHC. One influential formulation of the principle, which prohibits fine tuning of bare SM parameters, rests on the assumption that a particular set of values for these parameters constitutes the “fundamental parameters” of the theory and serves to mathematically define the theory. On the other hand, an old argument by Wetterich suggests that fine tuning of bare parameters merely reflects an arbitrary, inconvenient choice of expansion parameters, and that the choice of parameters in an EFT is therefore arbitrary. We argue that these two interpretations of Higgs fine tuning reflect distinct ways of formulating and interpreting effective field theories (EFTs) within the Wilsonian framework: the first takes an EFT to be defined by a single set of physical, fundamental bare parameters, while the second takes a Wilsonian EFT to be defined instead by a whole Wilsonian renormalization group (RG) trajectory, associated with a one-parameter class of physically equivalent parametrizations. From this latter perspective, no single parametrization constitutes the physically correct, fundamental parametrization of the theory, and the delicate cancellation between the bare Higgs mass and quantum corrections appears as an eliminable artifact of the arbitrary, unphysical reference scale with respect to which the physical amplitudes of the theory are parametrized. While the notion of fundamental parameters is well motivated in the context of condensed matter field theory, we explain why it may be superfluous in the context of high energy physics.
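Schematically (a standard textbook rendering of the naturalness problem, not the paper's own formulation), the physical Higgs mass in a Wilsonian EFT with cutoff Λ is

    \[ m_{\text{phys}}^2 \;=\; m_0^2(\Lambda) + \delta m^2(\Lambda), \qquad \delta m^2(\Lambda) \sim \frac{c\,\Lambda^2}{16\pi^2}, \]

so for Λ far above the electroweak scale the bare parameter m_0^2(Λ) must cancel the quantum correction to roughly one part in Λ²/m_phys². On the first interpretation above, this cancellation is a physical coincidence among fundamental parameters; on the second, m_0^2(Λ) varies with the arbitrary reference scale Λ along the RG trajectory, and the cancellation is an artifact of the choice of parametrization.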

15.
Quine is routinely perceived as having changed his mind about the scope of the Duhem-Quine thesis, shifting from what has been called an 'extreme holism' to a more moderate view. Where the Quine of 'Two Dogmas of Empiricism' argues that “the unit of empirical significance is the whole of science” (1951, 42), the later Quine seems to back away from this “needlessly strong statement of holism” (1991, 393). In this paper, I show that the received view is incorrect. I distinguish three ways in which Quine's early holism can be said to be wide-scoped and show that he has never changed his mind about any one of these aspects of his early view. Instead, I argue that Quine's apparent change of mind can be explained away as a mere shift of emphasis.

16.
In his book, The Material Theory of Induction, Norton argues that the quest for a universal formal theory or ‘schema’ for analogical inference should be abandoned. In its place, he offers the “material theory of analogy”: each analogical inference is “powered” by a local fact of analogy rather than by any formal schema. His minimalist model promises a straightforward, fact-based approach to the evaluation and justification of analogical inferences. This paper argues that although the rejection of universal schemas is justified, Norton's positive theory is limited in scope: it works well only for a restricted class of analogical inferences. Both facts and quasi-formal criteria have roles to play in a theory of analogical reasoning.

17.
In this paper I address Descartes' use of analogy in physics. First, I introduce Descartes' hypothetical reasoning, distinguishing between analogy and hypothesis. Second, I examine in detail Descartes' use of analogy both to discover causes and to add plausibility to his hypotheses; even though it is not always explicitly stated, Descartes' practice assumes a unified view of the subject matter of physics as the extension of bodies in terms of their size, shape and the motion of their parts. Third, I present Descartes' unique “philosophy of analogy”, in which the absence of analogy serves as a criterion for falsifying proposed explanations in physics. I conclude by defending Descartes' philosophy of analogy by appeal to the value scientists assign to simplicity in their explanations.

18.
The evolutionary process of interspecific hybridization in cladocerans is reviewed based on ecological and population genetic data. The evolutionary consequences of hybridization, biogeographic patterns and fitness comparisons are analyzed within the conceptual framework of theories on hybridization. Among species of the D. longispina complex no interpopulational transition zones (hybrid zones) have been detected, but rather patchy distributions of hybrids and parentals have been found. Hybrids occur across broad geographic ranges and can be more abundant than parental species. Due to asexual reproduction (ameiotic parthenogenesis), hybrid breakdown can be avoided, and hybrids can even (temporarily) combine advantageous traits of both parental species. Evolutionary consequences may arise from repeated backcrossing, which in some cases results in introgression and patterns of reticulate evolution.

“Those forms which possess in some considerable degree the character of species, but which are so closely similar to some other forms, or are so closely linked to them by intermediate gradations, that naturalists do not like to rank them as distinct species, are in several respects the most important to us.” (Charles Robert Darwin, 1859)

19.
Theories are composed of multiple interacting components. I argue that some theories have narratives as essential components, and that narratives function as integrative devices for the mathematical components of theories. Narratives represent complex processes unfolding in time as a sequence of stages, and hold the mathematical elements together as pieces in the investigation of a given process. I present two case studies from population genetics: R. A. Fisher's “mass selection” theory, and Sewall Wright's shifting balance theory. I apply my analysis to an early episode of the “R. A. Fisher – Sewall Wright controversy.”
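As an example of the kind of mathematical component such narratives are said to integrate, the standard one-locus viability-selection recursion (given here for orientation, not quoted from the paper) reads

    \[ \Delta p \;=\; \frac{p\,q\,\big[\,p\,(w_{AA}-w_{Aa}) + q\,(w_{Aa}-w_{aa})\,\big]}{\bar{w}}, \qquad \bar{w} = p^2 w_{AA} + 2pq\,w_{Aa} + q^2 w_{aa}, \]

where p and q = 1 − p are allele frequencies and the w's are genotype fitnesses. Fisher's and Wright's theories deploy elements of this kind within very different narratives of the adaptive process.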

20.
One of the central problems of Kant's account of the empirical laws of nature is: What grounds their necessity? In this article I discuss the three most important lines of interpretation and suggest a novel version of one of them. While the first interpretation takes the transcendental principles as the only sources of the empirical laws' necessity, the second interpretation takes the systematicity of the laws to guarantee their necessity. It is shown that both views involve serious problems. The third interpretation, the “causal powers interpretation”, locates the source of the laws' necessity in the properties of natural objects. Although the second and third interpretations seem incompatible, I analyse why Kant held both views and I argue that they can be reconciled, because the metaphysical grounding project of the laws' necessity is accounted for by Kant's causal powers account, while his best system account explains our epistemic access to the empirical laws. If, however, causal powers are supposed to fulfil the grounding function for the laws' natural modality, then I suggest that a novel reading of the causal powers interpretation should be formulated along the lines of a genuine dispositionalist conception of the laws of nature.
