Similar Documents
Found 20 similar documents (search time: 0 ms)
2.
In this paper I assess whether the recently proposed “No De-Coupling” (NDC) theory of constitutive relevance in mechanisms is a useful tool to reconstruct constitutive relevance investigations in scientific practice. The NDC theory has been advanced as a framework theoretically superior to the mutual manipulability (MM) account of constitutive relevance in mechanisms but, in contrast to the MM account, has not yet been applied to detailed case studies. I argue that the NDC account is also applicable to empirical practice and that it fares better than the MM account on both theoretical and empirical grounds. I elaborate these claims in terms of applications of the NDC theory to two case studies of cognitive science research on the role of eye movements in mechanisms for cognitive capacities.

3.
This paper analyses the importance of Giordano Bruno's belief in many worlds, including the Moon, the planets and the stars, in the context of his trial by the Inquisitions in Venice and Rome. Historians have claimed that this belief was not heretical and therefore was not a major factor in Bruno's trial or execution. On the contrary, by examining neglected treatises on theology, heresies and Catholic canon law, I show that the belief in many worlds was formally heretical. Multiple Christian authorities denounced it. A systematic analysis of the extant primary sources shows that Bruno's belief in many worlds was, surprisingly, of primary importance in his trial and execution. The evidence includes recent and newly discovered primary sources.

4.
It is claimed that the ‘problem of the arrow of time in classical dynamics’ has been solved. Since all classical particles have a self-field (gravitational and in some cases also electromagnetic), their dynamics must include self-interaction. This fact, together with the observation that the domain of validity of classical physics is restricted to distances not less than of the order of a Compton wavelength (thus excluding point particles), leads to the conclusion that the fundamental classical equations of motion are not invariant under time reversal: retarded self-interactions lead to different equations than advanced ones. Since causality (the time order of cause and effect) requires retarded rather than advanced self-interaction, it is causality which is ultimately responsible for the arrow of time. Classical motions described by equations with advanced self-interactions differ from retarded ones and do not occur in nature.

5.
Constitutive mechanistic explanations are said to refer to mechanisms that constitute the phenomenon-to-be-explained. The most prominent account of this relation is Carl Craver's mutual manipulability approach (MM) to constitutive relevance. Recently, MM has come under attack (Baumgartner and Casini 2017; Baumgartner and Gebharter 2015; Harinen 2014; Kästner 2017; Leuridan 2012; Romero 2015). It is argued that MM is inconsistent because, roughly, it is spelled out in terms of interventionism (an approach to causation), whereas constitutive relevance is said to be a non-causal relation. In this paper, I discuss one strategy for resolving this inconsistency: so-called fat-handedness approaches (Baumgartner and Casini 2017; Baumgartner and Gebharter 2015; Romero 2015). I argue that these approaches are problematic. I then present a novel suggestion for consistently defining constitutive relevance in terms of interventionism. My approach is based on a causal interpretation of manipulability in terms of causal relations between the mechanism's components and what I call temporal EIO-parts of the phenomenon. Still, this interpretation accounts for the fundamental difference between constitutive relevance and causal relevance.

6.
The work of Thomas Kuhn has been very influential in Anglo-American philosophy of science, and it is often credited with initiating the historical turn. Although this might be the case for English-speaking countries, in France a historical approach has always been the rule. This article investigates the similarities and differences between Kuhn and French philosophy of science, or ‘French epistemology’. The first part argues that he was influenced by French epistemologists, but by lesser-known authors than is often thought. The second part focuses on the reactions of French epistemologists to Kuhn’s work, which were often very critical. It is argued that behind some superficial similarities lie deep disagreements between Kuhn and French epistemology. This is finally shown by a brief comparison with the reactions of more recent French philosophers of science, who distance themselves from French epistemology and are more positive about Kuhn. Based on these diverse appreciations of Kuhn, a typology of the different positions within the philosophy of science is suggested.

7.
Summary  Normal human spermatozoa were demonstrated by dot immunoblot analysis and immunohistochemistry to possess transglutaminase (TGase). The immunological identification of spermatozoal TGase is consistent with others' reports of its biochemical identification and suggested role in sperm motility. In view of the immunoregulatory properties of seminal plasma TGase, it also provides presumptive identification of a means whereby spermatozoa, under normal physiological conditions, may be protected from immunological attack within the female reproductive tract.

8.
Summary  An empirical and mathematical model for self-organization is proposed, based on elemental properties, on unique interaction and on the combination of hierarchical elements. In the model, higher elements are stabilized by the cognitive (strong) interaction of subelements, disregarding intermediate elements. This is called elementary reductionism and is illustrated by the sequence quarks-elementary particles-atoms-molecules-cells-organisms-societies. Optimal dynamic interaction of nonidentical elements is called cognitive stability. This is compared with thermodynamic equilibrium. The principal differences are outlined.

9.
The paper examines philosophical issues that arise in contexts where one has many different models for treating the same system. I show why in some cases this appears relatively unproblematic (models of turbulence) while others represent genuine difficulties when attempting to interpret the information that models provide (nuclear models). What the examples show is that while complementary models needn’t be a hindrance to knowledge acquisition, the kind of inconsistency present in nuclear cases is, since it is indicative of a lack of genuine theoretical understanding. It is important to note that the differences in modeling do not result directly from the status of our knowledge of turbulent flows as opposed to nuclear dynamics—both face fundamental theoretical problems in the construction and application of models. However, as we shall see, the ‘problem context(s)’ in which the modeling takes place plays a decisive role in evaluating the epistemic merit of the models themselves. Moreover, the theoretical difficulties that give rise to inconsistent as opposed to complementary models (in the cases I discuss) impose epistemic and methodological burdens that cannot be overcome by invoking philosophical strategies like perspectivism, paraconsistency or partial structures.

10.
I display, by explicit construction, an account of the Aharonov–Bohm effect that employs only locally operative electrodynamical field strengths. The terms in the account are the components of the magnetic field of the solenoid at the location of the electron, and even though the total field vanishes there, the components do not. That such a construction can be carried out demonstrates at least that, whatever virtues they have for understanding and constructing new field theories, gauge fields in general make no metaphysical demands, and commit us to no novel ontology. I reflect on the significance of this for our understanding of quantum time-evolution and conclude that we should think of quantized matter as interacting individually with the other matter in the systems of which it is a part.

11.
The majority of human cancers are initiated when a single cell in an epithelial sheet becomes transformed. Cell transformation arises from the activation of oncoproteins and/or inactivation of tumor suppressor proteins. Recent studies have independently revealed that interaction and communication between transformed cells and their normal neighbors have a significant impact on the fate of the transformed cell. Several reports have shown that various phenomena occur at the interface between normal and transformed epithelial cells following the initial transformation event. In epithelia of Drosophila melanogaster, transformed and normal cells compete for survival in a process termed cell competition. This review will summarize current research and discuss the impact of these studies on our understanding of how primary tumors emerge and develop within a normal epithelium.

12.
The ancient philosopher Theophrastus (c. 371-285 BC) described a gemstone called lyngurium, purported to be solidified lynx urine, in his work De lapidibus ('On Stones'). Knowledge of the stone passed from him to other classical authors and into the medieval lapidary tradition, but there it was almost always linked to the 'learned master Theophrastus'. Although no physical example of the stone appears to have been seen or touched in ancient, medieval, or early modern times, its physical and medicinal properties were continually reiterated and elaborated as if it did 'exist'. By the seventeenth century, it began to disappear from lapidaries, but with no attempt to explain previous authors' errors since it had never 'existed' anyway. In tracing the career of lyngurium, this study sheds some light on the transmission of knowledge from the classical world to the Renaissance and the changing criteria by which such knowledge was judged.

13.
Objective probability in quantum mechanics is often thought to involve a stochastic process whereby an actual future is selected from a range of possibilities. Everett's seminal idea is that all possible definite futures on the pointer basis exist as components of a macroscopic linear superposition. I demonstrate that these two conceptions of what is involved in quantum processes are linked via two alternative interpretations of the mind-body relation. This leads to a fission, rather than divergence, interpretation of Everettian theory and to a novel explanation of why a principle of indifference does not apply to self-location uncertainty for a post-measurement, pre-observation subject, just as Sebens and Carroll claim. Their Epistemic Separability Principle is shown to arise out of this explanation, and the derivation of the Born rule for Everettian theory is thereby put on a firmer footing.

15.
Research over the past two decades has established the relevance of single-cell biology in basic research and translational medicine. Successful detection and isolation of specific cell subsets is the key to understanding their functional heterogeneity. Antibodies are conventionally used for this purpose, but their usefulness in certain contexts is limited. In this review, we discuss some of these contexts, which pose bottlenecks for different fields of biology, including biomedical research. With advances in chemistry, several methods have been introduced to overcome these problems. Although microfluidics and microraft arrays are newer techniques exploited for single-cell biology, fluorescence-activated cell sorting (FACS) remains the gold-standard technique for isolating cells for many biomedical applications, such as stem cell therapy. Here, we present a comprehensive and comparative account of some of the probes that are useful in FACS. Further, we illustrate how these techniques could be applied in biomedical research. It is postulated that intracellular molecular markers such as nucleostemin (GNL3), alkaline phosphatase (ALPL) and HIRA can be used to improve the outcome of cardiac as well as bone regeneration. Another field that could utilize intracellular markers is diagnostics, and we propose the use of specific peptide nucleic acid probes (PNPs) against certain miRNAs for predicting cancer surgical margins. Newer techniques for single-cell biology based on intracellular molecules will immensely enhance the repertoire of markers available for isolating cell types useful in biomedical research.

16.
Experimental modeling is the construction of theoretical models hand in hand with experimental activity. As explained in Section 1, experimental modeling starts with claims about phenomena that use abstract concepts, concepts whose conditions of realization are not yet specified; and it ends with a concrete model of the phenomenon, a model that can be tested against data. This paper argues that this process from abstract concepts to concrete models involves judgments of relevance, which are irreducibly normative. In Section 2, we show, on the basis of several case studies, how these judgments contribute to the determination of the conditions of realization of the abstract concepts and, at the same time, of the quantities that characterize the phenomenon under study. Then, in Section 3, we compare this view on modeling with other approaches that also have acknowledged the role of relevance judgments in science. To conclude, in Section 4, we discuss the possibility of a plurality of relevance judgments and introduce a distinction between locally and generally relevant factors.

17.
This paper considers Newton’s position on gravity’s cause, both conceptually and historically. With respect to the historical question, I argue that while Newton entertained various hypotheses about gravity’s cause, he never endorsed any of them, and in particular, his lack of confidence in the hypothesis of robust and unmediated distant action by matter is explained by an inclination toward certain metaphysical principles. The conceptual problem about gravity’s cause, which I identified earlier along with a deeper problem about individuating substances, is that a decisive conclusion is impossible unless certain speculative aspects of his empiricism are abandoned. In this paper, I situate those conceptual problems in Newton’s natural philosophy. They arise from ideas that push empiricism to potentially self-defeating limits, revealing the danger of allowing immaterial spirits any place in natural philosophy, especially spatially extended spirits supposed capable of co-occupying place with material bodies. Yet because their source ideas are speculative, Newton’s method ensures that these problems pose no threat to his rational mechanics or the profitable core of his empiricism. They are easily avoided by avoiding their source ideas, and when science emerges from natural philosophy, it does so with an ontology unencumbered by immaterial spirits.

18.
We make a first attempt to axiomatically formulate the Montevideo interpretation of quantum mechanics. In this interpretation, environmental decoherence is supplemented with a loss of coherence due to the use of realistic clocks to measure time, in order to solve the measurement problem. The resulting formulation is framed entirely in terms of quantum objects. Unlike in ordinary quantum mechanics, classical time plays only the role of an unobservable parameter. The formulation eliminates any privileged role of the measurement process, giving an objective definition of when an event occurs in a system.

19.
The history of modern economics abounds with pleas for more pluralism as well as pleas for more unification. These seem to be contradictory goals, suggesting that pluralism and unification are mutually exclusive, or at least that they involve trade-offs with more of one necessarily being traded off against less of the other. This paper will use the example of Paul Samuelson's Foundations of Economic Analysis (1947) to argue that the relationship between pluralism and unification is often more complex than this simple dichotomy suggests. In particular, Samuelson's Foundations is invariably presented as a key text in the unification of modern economics during the middle of the twentieth century; and in many ways that is entirely correct. But Samuelson's unification was not at the theoretical (causal and explanatory) level, but rather at the purely mathematical derivational level. Although this fact is recognized in the literature on Samuelson, what seems to be less recognized is that for Samuelson, much of the motivation for this unification was pluralist in spirit: not to narrow scientific economics into one single theory, but rather to allow for more than one theory to co-exist under a single unified derivational technique. This hidden pluralism will be discussed in detail. The paper concludes with a discussion of the implications for more recent developments in economics.

20.
Van Fraassen, like Popper before him, assumes that confirmation and disconfirmation relations are logical relations and thus hold only among abstract items. This raises a problem about how experience, for Popper, and observables, for van Fraassen, enter into epistemic evaluations. Each philosopher offers a drastic proposal: Popper holds that basic statements are accepted by convention; van Fraassen introduces his “pragmatic tautology.” Another alternative is to reject the claim that all evaluative relations are logical relations. Ayer proposed this option in responding to Popper, as did Sosa in a different context. I argue that this option should be pursued and propose a line of research that the option suggests.
