Similar Documents
20 similar documents found
1.
The emergence of dimensional analysis in the early nineteenth century involved a redefinition of the pre-existing concepts of homogeneity and dimensions, which entailed a shift from a qualitative to a quantitative conception of these notions. Prior to the nineteenth century, these concepts had been used as criteria to assess the soundness of operations and relations between geometrical quantities. Notably, the terms in such relations were required to be homogeneous, which meant that they needed to have the same geometrical dimensions. The latter reflected the nature of the quantities in question, such as volume vs area. As natural philosophy came to encompass non-geometrical quantities, the need arose to generalize the concept of homogeneity. In 1822, Jean Baptiste Fourier consequently redefined it to be the condition an equation must satisfy in order to remain valid under a change of units, and the ‘dimension’ correspondingly became the power of a conversion factor. When these innovations eventually found an echo in France and Great Britain, in the second half of the century, tensions arose between the former, qualitative understanding of dimensions as reflecting the nature of physical quantities, and the new, quantitative conception based on unit conversion and measurement. The emergence of dimensional analysis thus provides a case study of how existing rules and concepts can find themselves redefined in the context of wider conceptual changes; in the present case this redefinition involved a generalization, but also a shift in meaning which led to conceptual tensions.
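A worked illustration of Fourier's quantitative criterion may help; it is a minimal sketch in modern notation, not an example taken from the paper itself.

```latex
% Rescale the unit of length by a factor l and the unit of time by a factor t.
% Each quantity then converts by a product of powers of l and t, and those powers
% are its 'dimensions' in Fourier's sense: [s] = L, [g] = L T^{-2}, [t_f] = T.
% The equation s = (1/2) g t_f^2 is homogeneous because both sides convert by
% the same overall factor, so the relation survives any change of units:
\[
  s' = l\,s, \qquad g' = \frac{l}{t^{2}}\,g, \qquad t_f' = t\,t_f
  \;\;\Longrightarrow\;\;
  \tfrac{1}{2}\,g'\,t_f'^{\,2} \;=\; l \cdot \tfrac{1}{2}\,g\,t_f^{2} \;=\; s'.
\]
```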

2.
I propose a new perspective with which to understand scientific revolutions. This is a conversion from an object-only perspective to one that properly treats object and process concepts as distinct kinds. I begin with a re-examination of the Copernican revolution. Recent findings from the history of astronomy suggest that the Copernican revolution was a move from a conceptual framework built around an object concept to one built around a process concept. Drawing from studies in the cognitive sciences, I then show that process concepts are independent of object concepts, grounded in specific regions of the brain and involving unique representational mechanisms. There are cognitive obstacles to the transformation from object to process concepts, and an object bias—a tendency to treat processes as objects—makes this kind of conceptual change difficult. Consequently, transformation from object to process concepts is disruptive and revolutionary. Finally, I explore the implications of this new perspective on scientific revolutions for both the history and philosophy of science.

3.
The development of evolutionary game theory (EGT) is closely linked with two interdisciplinary exchanges: the import of game theory into biology, and the import of biologists’ version of game theory into economics. This paper traces the history of these two import episodes. In each case the investigation covers what exactly was imported, what the motives for the import were, how the imported elements were put to use, and how they related to existing practices in the respective disciplines. Two conclusions emerged from this study. First, concepts derived from the unity of science discussion or the unification accounts of explanation are too strong and too narrow to be useful for analysing these interdisciplinary exchanges. Second, biology and economics—at least in relation to EGT—show significant differences in modelling practices: biologists seek to link EGT models to concrete empirical situations, whereas economists pursue conceptual exploration and possible explanation.
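For readers unfamiliar with what an "EGT model" looks like, here is a minimal sketch of a standard textbook example, the Hawk-Dove game under replicator dynamics; the payoff values and step size are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a textbook evolutionary game: Hawk-Dove under replicator dynamics.
# Illustrative only; parameter values are assumptions, not drawn from the paper.

V, C = 2.0, 3.0  # value of the contested resource, cost of an escalated fight

# Payoff matrix: rows = focal strategy (Hawk, Dove), columns = opponent (Hawk, Dove).
payoff = [
    [(V - C) / 2, V],      # Hawk vs Hawk, Hawk vs Dove
    [0.0,         V / 2],  # Dove vs Hawk, Dove vs Dove
]

def replicator_step(x_hawk, dt=0.01):
    """One Euler step of the replicator dynamic for the frequency of Hawks."""
    x_dove = 1.0 - x_hawk
    f_hawk = payoff[0][0] * x_hawk + payoff[0][1] * x_dove  # expected payoff to Hawk
    f_dove = payoff[1][0] * x_hawk + payoff[1][1] * x_dove  # expected payoff to Dove
    f_mean = x_hawk * f_hawk + x_dove * f_dove              # population mean payoff
    return x_hawk + dt * x_hawk * (f_hawk - f_mean)

x = 0.1  # initial frequency of Hawks
for _ in range(20000):
    x = replicator_step(x)

# When C > V the evolutionarily stable mix of Hawks is V/C.
print(f"Hawk frequency after iteration: {x:.3f}  (ESS prediction: {V / C:.3f})")
```

The same formal object can be read in the two ways the paper contrasts: as a population frequency tied to field observations, or as a stylized model used for conceptual exploration.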

4.
Epigenetic concepts are fundamentally shaped by a legacy of negative definition, often understood by what they are not. Yet the function and implication of negative definition for scientific discourse has thus far received scant attention. Using the term epimutation as exemplar, we analyze the paradoxical like-but-unlike structure of a term that must simultaneously connect with but depart from genetic concepts. We assess the historical forces structuring the use of epimutation and like terms such as paramutation. This analysis highlights the positive characteristics defining epimutation: the regularity, oxymoronic temporality, and materiality of stable processes. Integrating historical work, ethnographic observation, and insights from philosophical practice-oriented conceptual analysis, we detail the distinctive epistemic goals the epimutation concept fulfils in medicine, plant biology and toxicology. Epimutation and allied epigenetic terms have succeeded by being mutation-like and recognizable, yet have failed to consolidate for exactly the same reason: they are tied simultaneously by likeness and opposition to nouns that describe things that are assumed to persist unchanged over space and time. Moreover, negative definition casts the genetic-epigenetic relationship as an either/or binary, overshadowing continuities and connections. This analysis is intended to assist practitioners and observers of genetics and epigenetics in recognizing and moving beyond the conceptual legacies of negative definition.

5.
Cassirer's philosophical agenda revolved around what appears to be a paradoxical goal, that is, to reconcile the Kantian explanation of the possibility of knowledge with the conceptual changes of nineteenth and early twentieth-century science. This paper offers a new discussion of one way in which this paradox manifests itself in Cassirer's philosophy of mathematics. Cassirer articulated a unitary perspective on mathematics as an investigation of structures independently of the nature of individual objects making up those structures. However, this posed the problem of how to account for the applicability of abstract mathematical concepts to empirical reality. My suggestion is that Cassirer was able to address this problem by giving a transcendental account of mathematical reasoning, according to which the very formation of mathematical concepts provides an explanation of the extensibility of mathematical knowledge. In order to spell out what this argument entails, the first part of the paper considers how Cassirer positioned himself within the Marburg neo-Kantian debate over intellectual and sensible conditions of knowledge in 1902–1910. The second part compares what Cassirer says about mathematics in 1910 with some relevant examples of how structural procedures developed in nineteenth-century mathematics.

6.
This paper motivates and outlines a new account of scientific explanation, which I term ‘collaborative explanation.’ My approach is pluralist: I do not claim that all scientific explanations are collaborative, but only that some important scientific explanations are—notably those of complex organic processes like development. Collaborative explanation is closely related to what philosophers of biology term ‘mechanistic explanation’ (e.g., Machamer et al., Craver, 2007). I begin with minimal conditions for mechanisms: complexity, causality, and multilevel structure. Different accounts of mechanistic explanation interpret and prioritize these conditions in different ways. This framework reveals two distinct varieties of mechanistic explanation: causal and constitutive. The two have heretofore been conflated, with philosophical discussion focusing on the former. This paper addresses the imbalance, using a case study of modeling practices in Systems Biology to reveal key features of constitutive mechanistic explanation. I then propose an analysis of this variety of mechanistic explanation, in terms of collaborative concepts, and sketch the outlines of a general theory of collaborative explanation. I conclude with some reflections on the connection between this variety of explanation and social aspects of scientific practice.

7.
This paper examines a historical case of conceptual change in mathematics that was fundamental to its progress. I argue that in this particular case, the change was conditioned primarily by social processes, and these are reflected in the intellectual development of the discipline. Reorganization of mathematicians and the formation of a new mathematical community were the causes of changes in intellectual content, rather than being mere effects. The paper focuses on the French Revolution, which gave rise to revolutionary developments in mathematics. I examine how changes in the political constellation affected mathematicians both individually and collectively, and how a new professional community—with different views on the objects, problems, aims, and values of the discipline—arose. On the basis of this account, I will discuss such Kuhnian themes as the role of the professional community and normal versus revolutionary development.

8.
This paper investigates the historical origins of the notion of incommensurability in contemporary philosophy of science. The aim is not to establish claims of priority, but to enhance our understanding of the notion by illuminating the various issues that contributed to its development. Kuhn developed his notion of incommensurability primarily under the influence of Fleck, Polanyi, and Köhler. Feyerabend, who had developed his notion more than a decade earlier, drew directly from Duhem, who had developed a notion of incommensurability in 1906. The idea is that in the course of scientific advance, when fundamental theories change, meanings change, which can result in a new conception of the nature of reality. Feyerabend repeatedly used this notion of incommensurability to attack various forms of conceptual conservativism. These include the logical positivists’ foundational use of protocol statements, Heisenberg’s methodological principle that established results must be presupposed by all further research, attempts to separate philosophical accounts of ontology from physics, Bohr’s principle of complementarity, and logical empiricist accounts of reduction and explanation. Focusing on the function of the notion of incommensurability common to Feyerabend’s various critiques explicates Feyerabend’s early philosophy as a series of challenges to forms of conceptual conservativism.

9.
This paper investigates the important role of narrative in social science case-based research. The focus is on the use of narrative in creating a productive ordering of the materials within such cases, and on how such ordering functions in relation to ‘narrative explanation’. It argues that narrative ordering based on juxtaposition - using an analogy to certain genres of visual representation - is associated with creating and resolving puzzles in the research field. Analysis of several examples shows how the use of conceptual or theoretical resources within the narrative ordering of ingredients enables the narrative explanation of the case to be resituated at other sites, demonstrating how such explanations can attain scope without implying full generality.

10.
Mathematical invariances, usually referred to as “symmetries”, are today often regarded as providing a privileged heuristic guideline for understanding natural phenomena, especially those of micro-physics. The rise of symmetries in particle physics has often been portrayed by physicists and philosophers as the “application” of mathematical invariances to the ordering of particle phenomena, but no historical studies exist on whether and how mathematical invariances actually played a heuristic role in shaping microphysics. Moreover, speaking of an “application” of invariances conflates the formation of concepts of new intrinsic degrees of freedom of elementary particles with the formulation of models containing invariances with respect to those degrees of freedom. I shall present here a case study from early particle physics (ca. 1930–1954) focussed on the formation of one of the earliest concepts of a new degree of freedom, baryon number, and on the emergence of the invariance today associated to it. The results of the analysis show how concept formation and “application” of mathematical invariances were distinct components of a complex historical constellation in which, beside symmetries, two further elements were essential: the idea of physically conserved quantities and that of selection rules. I shall refer to the collection of different heuristic strategies involving selection rules, invariances and conserved quantities as the “SIC-triangle” and show how different authors made use of them to interpret the wealth of new experimental data. It was only a posteriori that the successes of this hybrid “symmetry heuristics” came to be attributed exclusively to mathematical invariances and group theory, forgetting the role of selection rules and of the notion of physically conserved quantity in the emergence of new degrees of freedom and new invariances. The results of the present investigation clearly indicate that opinions on the role of symmetries in fundamental physics need to be critically reviewed in the spirit of integrated history and philosophy of science.
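A hedged illustration, in modern textbook terms rather than as a reconstruction of the historical sources, of how the three corners of the "SIC-triangle" fit together for baryon number:

```latex
% Conserved quantity: assign B = +1 to nucleons (p, n), B = -1 to antinucleons,
% and B = 0 to leptons and photons; the total B of an isolated system is constant.
% Selection rule: a process is allowed only if B is unchanged, e.g.
\[
  p + p \;\to\; p + p + p + \bar{p} \quad (B: 2 \to 2,\ \text{allowed}),
  \qquad
  p \;\to\; e^{+} + \gamma \quad (B: 1 \to 0,\ \text{forbidden}).
\]
% Invariance: in the later field-theoretic reading, the same bookkeeping appears as a
% global U(1) phase symmetry of the fields, whose Noether charge is B:
\[
  \psi \;\longrightarrow\; e^{\,i B \theta}\,\psi , \qquad \theta \in \mathbb{R}.
\]
```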

11.
Recent years have seen the development of an approach both to general philosophy and philosophy of science often referred to as ‘experimental philosophy’ or just ‘X-Phi’. Philosophers often make or presuppose empirical claims about how people would react to hypothetical cases, but their evidence for claims about what ‘we’ would say is usually very limited indeed. Philosophers of science have largely relied on their more or less intimate knowledge of their field of study to draw hypothetical conclusions about the state of scientific concepts and the nature of conceptual change in science. What they are lacking is some more objective quantitative data supporting their hypotheses. A growing number of philosophers (of science), along with a few psychologists and anthropologists, have tried to remedy this situation by designing experiments aimed at systematically exploring people’s reactions to philosophically important thought experiments or scientists’ use of their scientific concepts. Many of the results have been surprising and some of the conclusions drawn from them have been more than a bit provocative. This symposium attempts to provide a window into this new field of philosophical inquiry and to show how experimental philosophy provides crucial tools for the philosopher and encourages two-way interactions between scientists and philosophers.

12.
In recent years, a “change in attitude” in particle physics has led to our understanding of current quantum field theories as effective field theories (EFTs). The present paper is concerned with the significance of this EFT approach, especially from the viewpoint of the debate on reductionism in science. In particular, I shall show how EFTs provide a new and interesting case study in current philosophical discussion on reduction, emergence, and inter-level relationships in general.
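As a hedged reminder of the formalism at issue (the standard textbook presentation, not anything specific to the paper): an EFT organizes the low-energy Lagrangian as an expansion in local operators suppressed by powers of a heavy scale.

```latex
% Schematic EFT expansion: the O_i^{(n)} are local operators of mass dimension n
% built from the light fields, the c_i^{(n)} are dimensionless coefficients, and
% Lambda is the heavy scale at which the description is expected to break down.
\[
  \mathcal{L}_{\mathrm{eff}}
  \;=\; \mathcal{L}_{d \le 4}
  \;+\; \sum_{n>4} \sum_{i} \frac{c_i^{(n)}}{\Lambda^{\,n-4}}\, \mathcal{O}_i^{(n)} .
\]
% At energies E << Lambda the higher-dimension terms are suppressed by powers of
% E/Lambda, so the low-energy theory is predictive without being fundamental --
% the feature that feeds the debate over reduction and emergence.
```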

13.
This essay proposes a new notion - the landing zone - in order to identify conceptual features that allow modelers to transfer mathematical tools across disciplinary borders. This discussion refers to the transferable models as ‘templates’. Templates are functions, equations, or computational methods that are capable of being generalized from a particular subject matter. There are formal and conceptual prerequisites for the transfer of a template to a new domain. A landing zone is an ontology that contributes to the satisfaction of these conditions for successful transfer. This paper presents a case study on a model in chemistry - the Quantum Theory of Atoms in Molecules (QTAIM) - that makes use of transferred templates from physics - the virial theorem and the wave function. The landing zone in this case is a new ontological notion, that of the topological atom, which prepares ground for the use of the virial theorem and the wave function in chemistry. The virial theorem requires that the system it represents be stable in principle, and the wave function requires a justified transformation of its representation. The ontology of QTAIM - the landing zone for these templates - grounds the scientific use of these templates in the context of chemistry.
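For orientation, here are the two transferred templates in their standard quantum-mechanical form; this is a minimal textbook sketch, not a reconstruction of QTAIM's own derivation for topological atoms.

```latex
% Stationary wave function: the template presupposes a bound state psi of the
% molecular Hamiltonian,
\[
  \hat{H}\,\psi \;=\; E\,\psi .
\]
% Virial theorem for such a state of a Coulomb system (potential ~ 1/r): kinetic
% and potential energy expectation values are locked together,
\[
  2\,\langle \hat{T} \rangle \;=\; -\,\langle \hat{V} \rangle ,
  \qquad\text{so}\qquad
  E \;=\; \langle \hat{T} \rangle + \langle \hat{V} \rangle \;=\; -\,\langle \hat{T} \rangle ,
\]
% which is the 'in-principle stability' the template demands of whatever system --
% here, a topological atom within a molecule -- it is transferred to.
```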

14.
After arguing that Laudan's account of the role of conceptual considerations in theory appraisal is inadequate and unsatisfying in a number of respects, I suggest some of the ways in which we might move to develop an alternative account. This alternative presupposes a problem-solving methodology and, unlike the Laudanian approach, awards a crucial role to empirical research in the resolution of the conceptual problems troubling a theory. Three ways in which a theory may enhance the conceptual resources which it supplies for empirical problem-solving are considered: the fine-tuning of theoretical concepts; the appropriation of the conceptual resources of theories in other domains; and the achievement of greater consilience. Reference is made to several historical cases in which such enhancement actually occurred.

15.
The history of the physics of pendular motion rightly begins with Galileo's discovery of the isochronous character of that motion. There is, however, a ‘pre-history’ of the pendulum, centering on its initial recognition as a significant special case requiring explanation. This occurred in the writings of Jean Buridan and Nicole Oresme in the middle of the fourteenth century. Earlier works that might have been construed as discussing pendular motion are considered, as are the explanations for the scholastic ‘discovery’ of pendular motion put forth by Thomas Kuhn and Piero Ariotti. In contrast to these writers, this paper seeks to account for the pendulum's emergence with reference to an imaginary experiment concerning a body moving past the earth's center, medieval theories of impetus, and the proximate physical model of pendular motion, the late medieval heavy suspended church bell.
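As a brief aside, the "isochronous character" at issue is the (approximate) amplitude-independence of the period; the standard modern statement below is offered for orientation and is not part of the paper's historical argument.

```latex
% For small oscillations of a simple pendulum of length L under gravity g,
\[
  T \;\approx\; 2\pi \sqrt{\frac{L}{g}} ,
\]
% i.e. the period depends on the length but not on the amplitude of the swing --
% the isochrony Galileo claimed, which holds strictly only in the small-angle limit.
```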

16.
The birth of classical genetics in the 1910s was the result of the junction of two modes of analysis, corresponding to two disciplines: Mendelism and cytology. The goal of this paper is to shed some light on the change undergone by the science of heredity at the time, and to emphasize the subtlety of the conceptual articulation of Mendelian and cytological hypotheses within classical genetics. As a way to contribute to understanding how the junction of the two disciplines at play gave birth to a new way of studying heredity, my focus will be on the forms of representation used in genetics research at the time. More particularly, I will study the design and development, by Thomas H. Morgan's group, of the technique of linkage mapping, which embodies the integration of the Mendelian and cytological forms of representation. I will show that the design of this technique resulted in a genuine conceptual change, which should be described as a representational change, rather than merely as the introduction of new hypotheses into genetics.
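A hedged arithmetic sketch of what linkage mapping does in its simplest textbook form; the figures below are invented for illustration and are not Morgan's data.

```latex
% Suppose a test cross of 1000 offspring yields 920 parental and 80 recombinant
% types for the loci A and B. The recombination frequency is read as a map distance:
\[
  \mathrm{RF}_{AB} \;=\; \frac{80}{1000} \;=\; 0.08
  \quad\Longrightarrow\quad
  d_{AB} \;\approx\; 8 \ \text{map units (centimorgans)} .
\]
% Pairwise frequencies for a third locus, say RF_{AC} = 0.03 and RF_{CB} = 0.05,
% place C between A and B (3 + 5 = 8): the additive, linear ordering that fuses the
% Mendelian representation (ratios of offspring classes) with the cytological one
% (loci arranged along a chromosome).
```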

17.
I offer a reply to criticisms of the Strong Programme presented by Stephen Kemp who develops some new lines of argument that focus on the ‘monism’ of the programme. He says the programme should be rejected for three reasons. First, because it embodies ‘weak idealism’, that is, its supporters effectively sever the link between language and the world. Second, it challenges the reasons that scientists offer in explanation of their own beliefs. Third, it destroys the distinction between successful and unsuccessful instrumental action. Kemp is careful to produce quotations from the supporters of the programme as evidence to support his case. All three points deserve and are given a detailed response and the interpretation of the quoted material plays a significant role in the discussion. My hope is that careful exegesis will offset the numerous misinterpretations that are current in the philosophical literature. Particular attention is paid to what is said about the normative standards involved in the application of empirical concepts. The operation of these standards in the face of the negotiability of all concepts is explored and misapprehensions on the topic are corrected. The work of Wittgenstein, Popper, Kuhn and Hesse is used to illustrate these themes.

18.
In 1981, David Hubel and Torsten Wiesel received the Nobel Prize for their research on cortical columns—vertical bands of neurons with similar functional properties. This success led to the view that “cortical column” refers to the basic building block of the mammalian neocortex. Since the 1990s, however, critics questioned this building block picture of “cortical column” and debated whether this concept is useless and should be replaced with successor concepts. This paper inquires which experimental results after 1981 challenged the building block picture and whether these challenges warrant the elimination of “cortical column” from neuroscientific discourse. I argue that the proliferation of experimental techniques led to a patchwork of locally adapted uses of the column concept. Each use refers to a different kind of cortical structure, rather than a neocortical building block. Once we acknowledge this diverse-kinds picture of “cortical column”, the elimination of the column concept becomes unnecessary. Rather, I suggest that “cortical column” has reached conceptual retirement: although it cannot be used to identify a neocortical building block, column research is still useful as a guide and cautionary tale for ongoing research. At the same time, neuroscientists should search for alternative concepts when studying the functional architecture of the neocortex. Keywords: Cortical column, conceptual development, history of neuroscience, patchwork, eliminativism, conceptual retirement.

19.
It has been suggested that knowledge domains which emerge within regulatory science represent a compromise between technical knowledge and policy priorities. This article investigates the claim through consideration of the emergence of animal tests to evaluate chemical safety in the UK between 1945 and 1960. During this period there was a proliferation of new chemical-based innovations in consumer products. The situation gave rise to concerns about the potential impact on public health. Solutions required the development of a knowledge domain that would fulfil policy requirements, outside the remit of academic science. Lack of consensus in the scientific field gave rise to debate over the best means to collect accurate data. This resulted in the emergence of the new specialty of safety testing, in response to political and industrial needs. The socio-political context of this case illustrates the impact that organisational setting can have on shaping knowledge claims.

20.
According to inference to the best explanation (IBE), scientists infer the loveliest of competing hypotheses, ‘loveliness’ being explanatory virtue. This generates two key objections: that loveliness is too subjective to guide inference, and that it is no guide to truth. I defend IBE using Thomas Kuhn’s notion of exemplars: the scientific theories, or applications thereof, that define Kuhnian normal science and facilitate puzzle-solving. I claim that scientists infer the explanatory puzzle-solution that best meets the standard set by the relevant exemplar of loveliness. Exemplars are the subject of consensus, eliminating subjectivity; divorced from Kuhnian relativism, they give loveliness the context-sensitivity required to be truth-tropic. The resulting account, ‘Kuhnian IBE’, is independently plausible and offers a partial rapprochement between IBE and Kuhn’s account of science.
