Similar Documents
20 similar documents found (search time: 46 ms)
1.
Almost all solid malignancies exhibit complex cytological and architectural abnormalities, which vary from cell to cell and area to area within the same tumour, and between tumours of the same type. The degrees of these abnormalities do not correlate perfectly with the biological behaviour (especially growth rate and metastatic potential) among the various tumour types. These features of tumours have long been considered to invalidate simple mutational or 'abnormal gene expression' (epigenetic) theories of carcinogenesis. The 'mutator phenotype/clonal selection' hypothesis is based on the now well-established phenomenon of genetic instability of cancer cells, and proposes that this instability is an essential requirement for the development of tumours, and not an irrelevant side-effect of some other process. This paper argues that this hypothesis can provide a satisfactory explanation for the diverse histological and biological features of solid malignancies. Further, because virtually all solid tumours are histologically abnormal, genetic instability is likely to be essential for the malignant process. The concepts of mutator phenotype and clonal selection are therefore supported. Received 8 April 2002; accepted 25 April 2002

2.
Baker (2011) argues that broken symmetries pose a number of puzzles for the interpretation of quantum theories—puzzles which he claims do not arise in classical theories. I provide examples of classical cases of symmetry breaking and show that they have precisely the same features that Baker finds puzzling in quantum theories. To the extent that Baker is correct that the classical cases pose no puzzles, the features of the quantum case that Baker highlights should not be puzzling either.

3.
Dark matter (DM) is an essential ingredient of the present Standard Cosmological Model, according to which only 5% of the mass/energy content of our universe is made of ordinary matter. In recent times, it has been argued that certain cases of gravitational lensing represent a new type of evidence for the existence of DM. In a recent paper, Peter Kosso attempts to substantiate that claim. His argument is that, although in such cases DM is only detected by its gravitational effects, gravitational lensing is a direct consequence of Einstein's Equivalence Principle (EEP) and therefore the complete gravitational theory is not needed in order to derive such lensing effects. In this paper I critically examine Kosso's argument: I confront the notion of empirical evidence involved in the discussion and argue that EEP does not have enough power by itself to sustain the claim that gravitational lensing in the Bullet Cluster constitutes evidence for the DM Hypothesis. As a consequence of this, it is necessary to examine the details of alternative theories of gravity to decide whether certain empirical situations are indeed evidence for the existence of DM. It may well be correct that gravitational lensing does constitute evidence for the DM Hypothesis—at present it is controversial whether the proposed modifications of gravitation all need DM to account for the phenomenon of gravitational lensing and if so, of which kind—but this will not be a direct consequence of EEP.

4.
In his 1966 paper “The Strategy of Model-Building in Population Biology”, Richard Levins argues that no single model in population biology can be maximally realistic, precise and general at the same time. This is because these desirable model properties trade off against one another. Recently, philosophers have developed Levins’ claims, arguing that trade-offs between these desiderata are generated by practical limitations on scientists, or by formal aspects of models and how they represent the world. However, this project is not complete. The trade-offs discussed by Levins had a noticeable effect on modelling in population biology, but not on other sciences. This raises the question of why such a difference holds. I claim that in order to explain this finding, we must pay due attention to the properties of the systems, or targets, modelled by the different branches of science.

5.
It is well-known that Newtonian gravity, commonly held to describe a gravitational force, can be recast in a form that incorporates gravity into the geometry of the theory: Newton–Cartan theory. It is less well-known that general relativity, a geometrical theory of gravity, can be reformulated in such a way that it resembles a force theory of gravity; teleparallel gravity does just this. This raises questions. One of these concerns theoretical underdetermination. I argue that these theories do not, in fact, represent cases of worrying underdetermination. On close examination, the alternative formulations are best interpreted as postulating the same spacetime ontology. In accepting this, we see that the ontological commitments of these theories cannot be directly deduced from their mathematical form. The spacetime geometry involved in a gravitational theory is not a straightforward consequence of anything internal to that theory as a theory of gravity. Rather, it essentially relies on the rest of nature (the non-gravitational interactions) conspiring to choose the appropriate set of inertial frames.

6.
Social epistemologists have argued that high risk, high reward science has an important role to play in scientific communities. Recently, though, it has also been argued that various scientific fields seem to be trending towards conservatism—the increasing production of what Kuhn (1962) might have called ‘normal science’. This paper will explore a possible explanation for this sort of trend: that the process by which scientific research groups form, grow, and dissolve might be inherently hostile to such science. In particular, I employ a paradigm developed by Smaldino and McElreath (2016) that treats a scientific community as a population undergoing selection. As will become clear, perhaps counter-intuitively, this sort of process in some ways promotes high risk, high reward science. But, as I will point out, risky science is, in general, the sort of thing that is hard to repeat. While more conservative scientists will be able to train students capable of continuing their successful projects, and so create thriving lineages, successful risky science may not be the sort of thing one can easily pass on. In such cases, the structure of scientific communities selects against high risk, high reward projects. More generally, this project makes clear that there are at least two processes to consider in thinking about how incentives shape scientific communities—the process by which individual scientists make choices about their careers and research, and the selective process governing the formation of new research groups.

7.
Particular natural environmental conditions give rise to corresponding modes of resource use, and these different modes of resource use often profoundly shape the social, economic, and cultural activities of people in different regions; at the same time, modes of resource use frequently exert an important influence on the preservation and transmission of ethnic minority cultures. This phenomenon is especially pronounced in the ethnic minority areas of northwestern Sichuan (such as the Aba Tibetan and Qiang Autonomous Prefecture). Through household surveys of five villages belonging to three different ethnic groups (Tibetan, Qiang, and Hui), this paper documents this pattern.

8.
This paper investigates the significance of T-duality in string theory: the indistinguishability with respect to all observables, of models attributing radically different radii to space—larger than the observable universe, or far smaller than the Planck length, say. Two interpretational branch points are identified and discussed. First, whether duals are physically equivalent or not: by considering a duality of the familiar simple harmonic oscillator, I argue that they are. Unlike the oscillator, there are no measurements ‘outside’ string theory that could distinguish the duals. Second, whether duals agree or disagree on the radius of ‘target space’, the space in which strings evolve according to string theory. I argue for the latter position, because the alternative leaves it unknown what the radius is. Since duals are physically equivalent yet disagree on the radius of target space, it follows that the radius is indeterminate between them. Using an analysis of Brandenberger and Vafa (1989), I explain why—even so—space is observed to have a determinate, large radius. The conclusion is that observed, ‘phenomenal’ space is not target space, since a space cannot have both a determinate and indeterminate radius: instead phenomenal space must be a higher-level phenomenon, not fundamental.
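The duality the abstract appeals to has a compact textbook statement. The following fragment is standard material, not drawn from the paper itself: it shows why a closed string on a circle of radius R has the same spectrum as one on a circle of radius α′/R once momentum and winding numbers are exchanged.

```latex
% Mass spectrum of a closed string on a circle of radius R
% (n = momentum number, w = winding number, N, \tilde{N} = oscillator levels):
\[
  M^{2} \;=\; \left(\frac{n}{R}\right)^{2}
        \;+\; \left(\frac{wR}{\alpha'}\right)^{2}
        \;+\; \frac{2}{\alpha'}\left(N + \tilde{N} - 2\right),
  \qquad
  R \;\longmapsto\; \frac{\alpha'}{R}, \quad n \leftrightarrow w .
\]
% The spectrum, and with it every observable of the theory, is unchanged
% under the T-duality map on the right, so no measurement internal to the
% string theory distinguishes the two radii.
```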

9.
Experimental modeling is the construction of theoretical models hand in hand with experimental activity. As explained in Section 1, experimental modeling starts with claims about phenomena that use abstract concepts, concepts whose conditions of realization are not yet specified; and it ends with a concrete model of the phenomenon, a model that can be tested against data. This paper argues that this process from abstract concepts to concrete models involves judgments of relevance, which are irreducibly normative. In Section 2, we show, on the basis of several case studies, how these judgments contribute to the determination of the conditions of realization of the abstract concepts and, at the same time, of the quantities that characterize the phenomenon under study. Then, in Section 3, we compare this view on modeling with other approaches that also have acknowledged the role of relevance judgments in science. To conclude, in Section 4, we discuss the possibility of a plurality of relevance judgments and introduce a distinction between locally and generally relevant factors.

10.
Calibration procedures establish a reliable relation between the final states (‘indications’) of a measurement process and features of the objects being measured (‘outcomes’). This article analyzes the inferential structure of calibration procedures. I show that calibration is a modelling activity, namely the activity of constructing, deriving predictions from, and testing theoretical and statistical models of a measurement process. Measurement outcomes are parameter value ranges that maximize the predictive accuracy and mutual coherence of such models, among other desiderata. This model-based view of calibration clarifies the source of objectivity of measurement outcomes, the nature of measurement accuracy, and the close relationship between measurement and prediction. Contrary to commonly held views, I argue that measurement standards are not necessary for calibration, although they are useful in maintaining coherence across large networks of measurement procedures.

11.
In this paper I take a sceptical view of the standard cosmological model and its variants, mainly on the following grounds: (i) The method of mathematical modelling that characterises modern natural philosophy—as opposed to Aristotle's—goes well with the analytic, piecemeal approach to physical phenomena adopted by Galileo, Newton and their followers, but it is hardly suited for application to the whole world. (ii) Einstein's first cosmological model (1917) was not prompted by the intimations of experience but by a desire to satisfy Mach's Principle. (iii) The standard cosmological model—a Friedmann–Lemaître–Robertson–Walker spacetime expanding with or without end from an initial singularity—is supported by the phenomena of redshifted light from distant sources and very nearly isotropic thermal background radiation provided that two mutually inconsistent physical theories are jointly brought to bear on these phenomena, viz. the quantum theory of elementary particles and Einstein's theory of gravity. (iv) While the former is certainly corroborated by high-energy experiments conducted under conditions allegedly similar to those prevailing in the early world, precise tests of the latter involve applications of the Schwarzschild solution or the PPN formalism for which there is no room in a Friedmann–Lemaître–Robertson–Walker spacetime.

12.
In his book Thing Knowledge Davis Baird argues that our accustomed understanding of knowledge as justified true belief is not enough to understand progress in science and technology. To be more accurate, he argues that scientific instruments are to be seen as a form of “objective knowledge” in the sense of Karl Popper. I want to examine whether this idea is plausible. In a first step I want to show that this proposal implies that nearly all man-made artifacts are materialized objective knowledge. I argue that this radical change in our concept of knowledge demands strong reasons and that Baird does not give them. I take a look at the strongest strand of arguments in Baird's book—the arguments from cognitive autonomy—and conclude that they do not suffice to make Baird's view of scientific instruments tenable.

13.
14.
Aging—defined as the progressive impairment of an organism’s functional capacity, resulting from deleterious changes in cells, organs, and biological systems—is one of the most fundamental features of eukaryotes, from humans to the unicellular budding yeast Saccharomyces cerevisiae. It has recently been reported that this may also be the case for certain (if not all) types of bacteria. In this paper, the current view of the mechanistic background and evolutionary significance of this bacterial kind of aging is presented, with particular emphasis on the role of asymmetric cell division, the characteristics of the stationary growth phase, and the role of oxidative protein damage.

15.
Feature-based adaptive texture rendering of vector fields
To address the occlusion problem faced by texture-based visualization of 3D flow fields, an adaptive sparse texture rendering method based on fuzzy extraction of flow-field features is proposed. The method is built on an extensible algorithm for describing and extracting fuzzy flow-field feature regions: feature regions are first described using fuzzy theory, measurement rules for flow-field features are established and the corresponding feature vectors are obtained, and a procedure for determining fuzzy membership degrees under a least-sum-of-squares error criterion is then derived from the deviations of the feature vectors. From the resulting fuzzy membership field, adaptive Gaussian noise that highlights the flow-field features is generated and used as input to line integral convolution (LIC). In addition, to overcome the difficulty texture methods have in conveying the specific direction of a 3D flow, two cool-warm shading schemes are proposed; applying cool-warm shading during the LIC convolution emphasizes the flow direction and effectively resolves this problem. Experiments show that, compared with traditional feature-extraction methods, the proposed method is more extensible; the adaptive technique alleviates the occlusion and visual clutter present in 3D flow visualization, and the cool-warm shading remedies the weakness of texture methods in depicting flow direction.
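The abstract describes the pipeline only at a high level and gives no implementation. As a rough orientation, here is a minimal sketch of plain line integral convolution over a 2D vector field, the core operation the adaptive method builds on, written in Python with NumPy. All names and parameters are illustrative assumptions, not the authors' code; the adaptive noise, fuzzy membership field, and cool-warm shading steps are omitted.

```python
import numpy as np

def lic_2d(vx, vy, noise, steps=20):
    """Minimal line integral convolution (LIC): each output pixel is the
    average of the input noise sampled along the streamline traced through
    that pixel, forward and backward, in the vector field (vx, vy)."""
    h, w = noise.shape
    out = np.zeros_like(noise, dtype=float)
    for y in range(h):
        for x in range(w):
            total, count = float(noise[y, x]), 1   # sample the seed pixel once
            for sign in (1.0, -1.0):               # forward and backward traces
                px, py = float(x), float(y)
                for _ in range(steps):
                    u = vx[int(py), int(px)]
                    v = vy[int(py), int(px)]
                    speed = np.hypot(u, v)
                    if speed < 1e-9:
                        break                      # stagnation point, stop trace
                    px += sign * u / speed         # advance ~one pixel along the field
                    py += sign * v / speed
                    if not (0.0 <= px < w and 0.0 <= py < h):
                        break                      # streamline left the image
                    total += noise[int(py), int(px)]
                    count += 1
            out[y, x] = total / count
    return out

# Tiny usage example: white noise convolved along a solid-body rotation field.
if __name__ == "__main__":
    n = 128
    ys, xs = np.mgrid[0:n, 0:n] - n / 2.0
    vx, vy = -ys, xs                               # circular (rotational) flow
    rng = np.random.default_rng(0)
    image = lic_2d(vx, vy, rng.random((n, n)))     # 2D array; plot or save as desired
```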

16.
SUMMARY

Machine Translation (MT) is now ubiquitous in discussions of translation. The roots of this phenomenon—first publicly unveiled in the so-called ‘Georgetown-IBM Experiment’ on 9 January 1954—not only displayed the technological utopianism still associated with dreams of a universal computer translator, but were also deeply enmeshed in the political pressures of the Cold War and in a dominating conception of scientific writing as both the goal of machine translation and its method. Machine translation was created, in part, as a solution to a perceived crisis sparked by the massive expansion of Soviet science. Scientific prose was also perceived as linguistically simpler, and so served as the model for how to turn a language into a series of algorithms. This paper follows the rise of the Georgetown program—the largest single MT program in the world—from 1954 to the (as it turns out, temporary) collapse of MT in 1964.

17.
Bronchial asthma is a chronic inflammatory disease in which bronchial wall remodelling plays a significant role. This phenomenon is related to enhanced proliferation of airway smooth muscle cells, elevated extracellular matrix protein secretion and an increased number of myofibroblasts. Phenotypic fibroblast-to-myofibroblast transition represents one of the primary mechanisms by which myofibroblasts arise in fibrotic lung tissue. Fibroblast-to-myofibroblast transition requires a combination of several types of factors, the most important of which are divided into humoural and mechanical factors, as well as certain extracellular matrix proteins. Despite intensive research on the nature of this process, its underlying mechanisms during bronchial airway wall remodelling in asthma are not yet fully clarified. This review focuses on what is known about the nature of fibroblast-to-myofibroblast transition in asthma. We aim to consider possible mechanisms and conditions that may play an important role in fibroblast-to-myofibroblast transition but have not yet been discussed in this context. Recent studies have shown that some inherent and previously undescribed features of fibroblasts can also play a significant role in fibroblast-to-myofibroblast transition. Differences observed between asthmatic and non-asthmatic bronchial fibroblasts (e.g., response to transforming growth factor β, cell shape, elasticity, and protein expression profile) may have a crucial influence on this phenomenon. An accurate understanding and recognition of all factors affecting fibroblast-to-myofibroblast transition might provide an opportunity to discover efficient methods of counteracting this phenomenon.

18.
Forecasting advice from human advisors is often utilized more than advice from automation. There is little understanding of why “algorithm aversion” occurs, or specific conditions that may exaggerate it. This paper first reviews literature from two fields—interpersonal advice and human–automation trust—that can inform our understanding of the underlying causes of the phenomenon. Then, an experiment is conducted to search for these underlying causes. We do not replicate the finding that human advice is generally utilized more than automated advice. However, after receiving bad advice, utilization of automated advice decreased significantly more than advice from humans. We also find that decision makers describe themselves as having much more in common with human than automated advisors despite there being no interpersonal relationship in our study. Results are discussed in relation to other findings from the forecasting and human–automation trust fields and provide a new perspective on what causes and exaggerates algorithm aversion.

19.
Two inter-linked theses are defended in this paper. One is the Duhemian theme that a rigid distinction between physical and chemical properties cannot be upheld. Duhem maintained this view not because the latter are reducible to the former, but because if physics is to remain consistent with chemistry it must prove possible to expand it to accommodate new features, and a rigid distinction would be a barrier to this process. The second theme is that naturally occurring isotopic variants of water are in fact distinct substances, and naturally occurring samples of water are mixtures of these substances. For most practical purposes it is convenient to treat protium oxide, deuterium oxide, and so on, as the same chemical substance, but to insist on this as a matter of principle would stand in conflict with the first thesis.

20.
After preparing the way with comments on evanescent quantities and then on Newton’s interpretation of his second law, this study of Proposition II (Book I)—“Every body that moves in some curved line described in a plane and, by a radius drawn to a point, either unmoving or moving uniformly forward with a rectilinear motion, describes areas around that point proportional to the times, is urged by a centripetal force tending toward that same point”—asks and answers the following questions: When does a version of Proposition II first appear in Newton’s work? What revisions bring that initial version to the final form in the 1726 Principia? What, exactly, does this proposition assert? In particular, what does Newton mean by the motion of a body “urged by a centripetal force”? Does it assert a true mathematical claim? If not, what revision makes it true? Does the demonstration of Proposition II persuade? Is it as convincing, for example, as the most convincing arguments of the Principia? If not, what revisions would make the demonstration more persuasive? And what is the importance of Proposition II to the physics of Book III and the mathematics of Book I?
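For readers who want the content of Proposition II in modern notation, the standard reconstruction runs as follows; this is textbook material, not Newton's own argument and not a claim about the paper's analysis.

```latex
% Position vector r(t) of the body relative to the fixed point O.
% Areal velocity (rate at which the radius sweeps out area):
\[
  \frac{dA}{dt} \;=\; \tfrac{1}{2}\,\bigl|\mathbf{r}\times\dot{\mathbf{r}}\bigr|.
\]
% "Areas proportional to the times" means that r x rdot is constant, so
\[
  \frac{d}{dt}\bigl(\mathbf{r}\times\dot{\mathbf{r}}\bigr)
  \;=\; \dot{\mathbf{r}}\times\dot{\mathbf{r}} + \mathbf{r}\times\ddot{\mathbf{r}}
  \;=\; \mathbf{r}\times\ddot{\mathbf{r}} \;=\; \mathbf{0},
\]
% i.e. the acceleration, and hence the force, always lies along the line
% joining the body to O, which is what Proposition II concludes.
```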
