Similar Documents
Found 20 similar documents (search time: 21 ms)
1.
This article is about structural realism, historical continuity, laws of nature, and ceteris paribus clauses. Fresnel's Laws of optics support Structural Realism because they are a scientific structure that has survived theory change. However, the history of Fresnel's Laws as it has been depicted in debates over realism since the 1980s is badly distorted. Specifically, claims that J. C. Maxwell or his followers believed in an ontologically subsistent electromagnetic field, and gave up the aether, before Einstein's annus mirabilis in 1905 are indefensible. Related claims that Maxwell himself did not believe in a luminiferous aether are also indefensible. This paper corrects the record. In order to trace Fresnel's Laws across significant ontological changes, they must be followed past Einstein into modern physics and nonlinear optics. I develop the philosophical implications of a more accurate history, and analyze the historical trajectory of Fresnel's Laws in terms of dynamic ceteris paribus clauses. Structuralists have not embraced ceteris paribus laws, but they continue to point to Fresnel's Laws to resist anti-realist arguments from theory change. Fresnel's Laws fit the standard definition of a ceteris paribus law as a law applicable only in particular circumstances. Realists who appeal to the historical continuity of Fresnel's Laws to combat anti-realists must incorporate ceteris paribus laws into their metaphysics.

2.
3.
Attending carefully to the role that reference frames play in the definition of physical concepts in quantum theory can lead us to a clearer understanding of the uncertainty relations, and the connection that they have with the fact that physical quantities are defined in terms of, and measured relative to, reference frames.

4.
While philosophers have subjected Galileo's classic thought experiments to critical analysis, they have tended to largely ignore the historical and intellectual context in which they were deployed, and the specific role they played in Galileo's overall vision of science. In this paper I investigate Galileo's use of thought experiments by focusing on the epistemic and rhetorical strategies that he employed in attempting to answer the question of how one can know what would happen in an imaginary scenario. I argue that we can find three different answers to this question in Galileo's later dialogues, which reflect the changing meanings of ‘experience’ and ‘knowledge’ (scientia) in the early modern period. Once we recognise that Galileo's thought experiments sometimes drew on the power of memory and the explicit appeal to ‘common experience’, while at other times they took the form of demonstrative arguments intended to have the status of necessary truths, and on still other occasions they were extrapolations, or probable guesses, drawn from a carefully planned series of controlled experiments, it becomes evident that no single account of the epistemological relationship between thought experiment, experience and experiment can adequately capture the epistemic variety we find in Galileo's use of imaginary scenarios. To this extent, we cannot neatly classify Galileo's use of thought experiments as either ‘medieval’ or ‘early modern’; rather, we should see them as indicative of the complex epistemological transformations of the early seventeenth century.

5.
6.
This paper advocates the reduction of the inference of common cause to that of common origins. It distinguishes, and subjects to critical analysis, thirteen interpretations of “the inference of common cause” whose conclusions do not follow from their assumptions. Instead, I introduce six types of inferences of the common origins of information signals from their receivers to reduce, in the sense of supersede and replace, the thirteen inferences of common causes. I show how the paradigmatic examples of inferences of common cause, as well as a broader scope of inferences in the historical sciences, are better explained by inferences of origins. Inferences of origins from information-rich coherences between receivers of information signals both fit more closely and explain better the range of examples that have traditionally been associated with inferences of common causes, as well as a broader scope of examples from the historical sciences. Shannon's concept of information as reduction in uncertainty, rather than physicalist concepts of information that relate it to entropy or waves, simplifies the inferences, preempts objections, and avoids the underdetermination of conclusions that challenges models of inferences of common causes. In the second part of the paper I model inferences of common origins from information preserved in their receivers. I distinguish information-poor inferences that there were some common origins of receivers from the information-richer inferences of ranges of possible common origins and of the information transmission channels by which they transmitted signals to receivers. Lastly, and most information-rich, I distinguish the inference of the defining properties of common origins. The information transmission model from origins to receivers allows the reconceptualization of the concept of “independence” as the absence of intersections between information channels, and of “reliability” as the preservation of information from origins in receivers. Finally, I show how inferences of origins form the epistemic basis of the historical sciences.
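A minimal numerical sketch of Shannon's notion of information as reduction in uncertainty, as invoked in the abstract above; the distributions and values are hypothetical illustrations, not taken from the paper.

import math

def entropy(probs):
    # Shannon entropy H (in bits) of a discrete probability distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: a receiver is initially equally uncertain about four
# possible origins; observing a signal concentrates the distribution.
prior = [0.25, 0.25, 0.25, 0.25]      # uncertainty before the signal: H = 2 bits
posterior = [0.70, 0.10, 0.10, 0.10]  # uncertainty after the signal

gain = entropy(prior) - entropy(posterior)
print(f"information gained = {gain:.3f} bits")  # reduction in uncertainty, about 0.64 bits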

7.
This paper aims to identify the key characteristics of model organisms that make them a specific type of model within the contemporary life sciences: in particular, we argue that the term “model organism” does not apply to all organisms used for the purposes of experimental research. We explore the differences between experimental and model organisms in terms of their material and epistemic features, and argue that it is essential to distinguish between their representational scope and representational target. We also examine the characteristics of the communities who use these two types of models, including their research goals, disciplinary affiliations, and preferred practices, to show how these have contributed to the conceptualization of a model organism. We conclude that model organisms are a specific subgroup of organisms that has been standardized to fit an integrative and comparative mode of research, and that must be clearly distinguished from the broader class of experimental organisms. In addition, we argue that model organisms are the key components of a unique and distinctively biological way of doing research using models.

8.
This paper motivates and outlines a new account of scientific explanation, which I term ‘collaborative explanation.’ My approach is pluralist: I do not claim that all scientific explanations are collaborative, but only that some important scientific explanations are—notably those of complex organic processes like development. Collaborative explanation is closely related to what philosophers of biology term ‘mechanistic explanation’ (e.g., Machamer et al., Craver, 2007). I begin with minimal conditions for mechanisms: complexity, causality, and multilevel structure. Different accounts of mechanistic explanation interpret and prioritize these conditions in different ways. This framework reveals two distinct varieties of mechanistic explanation: causal and constitutive. The two have heretofore been conflated, with philosophical discussion focusing on the former. This paper addresses the imbalance, using a case study of modeling practices in Systems Biology to reveal key features of constitutive mechanistic explanation. I then propose an analysis of this variety of mechanistic explanation in terms of collaborative concepts, and sketch the outlines of a general theory of collaborative explanation. I conclude with some reflections on the connection between this variety of explanation and social aspects of scientific practice.

9.
10.
This is an English translation of Paul Feyerabend's earliest extant essay “Der Begriff der Verständlichkeit in der modernen Physik” (1948). In it, Feyerabend defends positivism as a progressive framework for scientific research in certain stages of scientific development. He argues that in physics visualizability (Anschaulichkeit) and intelligibility (Verständlichkeit) are time-conditioned concepts: what is deemed visualizable in the development of physical theories is relative to a specific historical context and changes over time. He concludes that from time to time the abandonment of visualizability is crucial for progress in physics, as it is conducive to major theory change, illustrating the point on the basis of advances in atomic theory.

11.
We call attention to the historical fact that the meaning of symmetry in antiquity—as it appears in Vitruvius’s De architectura—is entirely different from the modern concept. This leads us to the question: what is the evidence for the changes in the meaning of the term symmetry, and what were the different meanings attached to it? We show that the meaning of the term in an aesthetic sense gradually shifted in the context of architecture before the image of the balance was attached to the term in the middle of the 18th century and well before the first modern scientific usage by Legendre in 1794.

12.
An overlap between the general relativist and particle physicist views of Einstein gravity is uncovered. Noether's 1918 paper developed Hilbert's and Klein's reflections on the conservation laws. Energy-momentum is just a term proportional to the field equations and a ‘curl’ term with identically zero divergence. Noether proved a converse “Hilbertian assertion”: such “improper” conservation laws imply a generally covariant action. Later and independently, particle physicists derived the nonlinear Einstein equations assuming the absence of negative-energy degrees of freedom (“ghosts”) for stability, along with universal coupling: all energy-momentum including gravity's serves as a source for gravity. Those assumptions (all but) imply (for 0 graviton mass) that the energy-momentum is only a term proportional to the field equations and a symmetric “curl,” which implies the coalescence of the flat background geometry and the gravitational potential into an effective curved geometry. The flat metric, though useful in Rosenfeld's stress-energy definition, disappears from the field equations. Thus the particle physics derivation uses a reinvented Noetherian converse Hilbertian assertion in Rosenfeld-tinged form. The Rosenfeld stress-energy is identically the canonical stress-energy plus a Belinfante curl and terms proportional to the field equations, so the flat metric is only a convenient mathematical trick without ontological commitment. Neither generalized relativity of motion, nor the identity of gravity and inertia, nor substantive general covariance is assumed. The more compelling criterion of lacking ghosts yields substantive general covariance as an output. Hence the particle physics derivation, though logically impressive, is neither as novel nor as ontologically laden as it has seemed.

13.
Objections to the use of historical case studies for philosophical ends fall into two categories. Methodological objections claim that historical accounts and their uses by philosophers are subject to various biases. We argue that these challenges are not special; they also apply to other epistemic practices. Metaphysical objections, on the other hand, claim that historical case studies are intrinsically unsuited to serve as evidence for philosophical claims, even when carefully constructed and used, and so constitute a distinct class of challenge. We show that attention to what makes for a canonical case can address these problems. A case study is canonical with respect to a particular philosophical aim when the features relevant to that aim provide a reasonably complete causal account of the results of the historical process under investigation. We show how to establish canonicity by evaluating relevant contingencies using two prominent examples from the history of science: Eddington’s confirmation of Einstein’s theory of general relativity using his data from the 1919 eclipse and Watson and Crick’s determination of the structure of DNA.

14.
Biological science uses multiple species concepts. Order can be brought to this diversity if we recognize two key features. First, any given species concept is likely to have a patchwork structure, generated by repeated application of the concept to new domains. We illustrate this by showing how two species concepts (biological and ecological) have been modified from their initial eukaryotic applications to apply to prokaryotes. Second, both within and between patches, distinct species concepts may interact and hybridize. We thus defend a semantic picture of the species concept as a collection of interacting patchwork structures. Thus, although not all uses of the term pick out the same kind of unit in nature, the diversity of uses reflects something more than mere polysemy. We suggest that the emphasis on the use of species to pick out natural units is itself problematic, because that is not the term’s sole function. In particular, species concepts are used to manage inquiry into processes of speciation, even when these processes do not produce clearly delimited species.

15.
16.
In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term ‘psychometrics’. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must 1) serve as a representation of the measurement interaction, and 2) be used in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards, despite the latter's popularity with health outcomes researchers.
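For reference, the dichotomous Rasch model that Rasch Measurement Theory builds on can be sketched as follows; the ability and difficulty values are hypothetical and purely illustrative, not drawn from the paper or from any particular COA.

import math

def rasch_p_correct(theta, delta):
    # Dichotomous Rasch model: probability that a person with ability theta
    # gives a correct/endorsed response to an item with difficulty delta
    # (both parameters on the logit scale).
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

# Hypothetical persons and items, for illustration only.
for theta in (-1.0, 0.0, 1.5):
    for delta in (0.0, 1.0):
        print(f"theta={theta:+.1f}, delta={delta:+.1f} -> P={rasch_p_correct(theta, delta):.2f}")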

17.
Mathematical invariances, usually referred to as “symmetries”, are today often regarded as providing a privileged heuristic guideline for understanding natural phenomena, especially those of micro-physics. The rise of symmetries in particle physics has often been portrayed by physicists and philosophers as the “application” of mathematical invariances to the ordering of particle phenomena, but no historical studies exist on whether and how mathematical invariances actually played a heuristic role in shaping microphysics. Moreover, speaking of an “application” of invariances conflates the formation of concepts of new intrinsic degrees of freedom of elementary particles with the formulation of models containing invariances with respect to those degrees of freedom. I shall present here a case study from early particle physics (ca. 1930–1954) focussed on the formation of one of the earliest concepts of a new degree of freedom, baryon number, and on the emergence of the invariance today associated with it. The results of the analysis show how concept formation and “application” of mathematical invariances were distinct components of a complex historical constellation in which, besides symmetries, two further elements were essential: the idea of physically conserved quantities and that of selection rules. I shall refer to the collection of different heuristic strategies involving selection rules, invariances and conserved quantities as the “SIC-triangle” and show how different authors made use of them to interpret the wealth of new experimental data. It was only a posteriori that the successes of this hybrid “symmetry heuristics” came to be attributed exclusively to mathematical invariances and group theory, forgetting the role of selection rules and of the notion of physically conserved quantity in the emergence of new degrees of freedom and new invariances. The results of the present investigation clearly indicate that opinions on the role of symmetries in fundamental physics need to be critically reviewed in the spirit of integrated history and philosophy of science.

18.
Predictivism is the view that successful predictions of “novel” evidence carry more confirmational weight than accommodations of already known evidence. Novelty, in this context, has traditionally been conceived of as temporal novelty. However, temporal predictivism has been criticized for lacking a rationale: why should the time order of theory and evidence matter? Instead, it has been proposed, novelty should be construed in terms of use-novelty, according to which evidence is novel if it was not used in the construction of a theory. Only if evidence is use-novel can it fully support the theory entailing it. As I point out in this paper, the writings of the most influential proponent of use-novelty contain a weaker and a stronger version of use-novelty. However, both versions, I argue, are problematic. With regard to the appraisal of Mendeleev's periodic table, the most contentious historical case in the predictivism debate, I argue that temporal predictivism is indeed supported, although in ways not previously appreciated. On the basis of this case, I argue for a form of so-called symptomatic predictivism, according to which temporally novel predictions carry more confirmational weight only insofar as they reveal the theory's presumed coherence of facts as real.

19.
20.
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—that we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” (Weyl, 1931a, p. xxi) in terms of a correspondence between the ontological categories of identity and determinateness.
