Similar Documents
1.
Following demands to regulate biomedicine in the post-war period, Sweden saw several political debates about research ethics in the 1970s. Many of the debates centered on fetal research and animal experiments. At stake were questions of moral permissibility, public transparency, and scientific freedom. However, these debates did not only reveal ethical disagreement—they also contributed to constructing new boundaries between life-forms. Taking a post-Marxist approach to discursive policy analysis, we argue that the meaning of both the “human” and the “animal” in these debates was shaped by a need to manage a legitimacy crisis for medical science. By analyzing Swedish government bills, motions, parliamentary debates, and committee memorials from the 1970s, we map out how fetal and animal research were constituted as policy problems. We place particular emphasis on the problematization of fetal and animal vulnerability. By comparing the debates, we trace out how a particular vision of the ideal life defined the human-animal distinction.

2.
It is widely acknowledged that the patient's perspective should be considered when making decisions about how her care will be managed. Patient participation in the decision making process may play an important role in bringing to light and incorporating her perspective. The GRADE framework is touted as an evidence-based process for determining recommendations for clinical practice; i.e. determining how care ought to be managed. GRADE recommendations are categorized as “strong” or “weak” based on several factors, including the “values and preferences” of a “typical” patient. The strength of the recommendation also provides instruction to the clinician about when and how patients should participate in the clinical encounter, and thus whether an individual patient's values and preferences will be heard in her clinical encounter. That is, a “strong” recommendation encourages “paternalism” and a “weak” recommendation encourages shared decision making. We argue that adoption of the GRADE framework is problematic for patient participation and may result in care that is not respectful of the individual patient's values and preferences. We argue that the root of the problem is the conception of “values and preferences” in GRADE – the framework favours population thinking (e.g. “typical” patient “values and preferences”), despite the fact that “values and preferences” are individual in the sense that they are deeply personal. We also show that tying the strength of a recommendation to a model of decision making (paternalism or shared decision making) constrains patient participation and is not justified (theoretically and/or empirically) in the GRADE literature.

3.
In the published version of his doctoral dissertation, Hugh Everett III inserted what has become a famous footnote, the “note added in proof”. This footnote is often the strongest evidence given for any of the various interpretations of Everett (the many worlds, many minds, many histories and many threads interpretations). In this paper I propose a new interpretation of the footnote, one supported by evidence found in letters written to and by Everett, and one suggested by a reading of Everett that takes seriously the central position of relative states in his pure wave mechanics: the relative facts interpretation. Of central interest in this paper is how to make sense of Everett's claim in the “note added in proof” that “all elements of a superposition (all “branches”) are “actual,” none any more “real” than the rest.”

4.
This paper investigates how and when pairs of terms such as “local–global” and “im Kleinen–im Grossen” began to be used by mathematicians as explicit reflexive categories. A first phase of automatic search led to the delineation of the relevant corpus, and to the identification of the period from 1898 to 1918 as that of emergence. The emergence appears to have been, from the very start, both transdisciplinary (function theory, calculus of variations, differential geometry) and international, although the AMS-Göttingen connection played a specific part. First used as an expository and didactic tool (e.g. by Osgood), the local–global distinction soon played a crucial part in the creation of new mathematical concepts (e.g. in Hahn’s work), in the shaping of research agendas (e.g. Blaschke’s global differential geometry), and in Weyl’s axiomatic foundation of the manifold concept. We finally turn to France, where in the 1910s, in the wake of Poincaré’s work, Hadamard began to promote a research agenda in terms of “passage du local au général.”

5.
Measurement results depend upon assumptions, and some of those assumptions are theoretical in character. This paper examines particle physics measurements in which a measurement result depends upon a type of assumption for which that very same result may be evidentially relevant, thus raising a worry about potential circularity in argumentation. We demonstrate how the practice of evaluating measurement uncertainty serves to render any such evidential circularity epistemically benign. Our analysis shows how the evaluation and deployment of measurement uncertainty constitutes an in-practice solution to a particular form of Duhemian underdetermination that improves upon Duhem's vague notion of “good sense,” avoids holism, and reconciles theory dependence of measurement with piecemeal hypothesis testing.

6.
This paper sets out to show how Eddington's early-1920s case for variational derivatives significantly bears witness to a steady and consistent shift in focus from a resolute striving for objectivity towards “selective subjectivism” and structuralism. While framing his so-called “Hamiltonian derivatives” along the lines of previously available variational methods for deriving gravitational field equations from an action principle, Eddington assigned them a theoretical function of his own devising in The Mathematical Theory of Relativity (1923). I make clear that two stages should be marked out in Eddington's train of thought if the meaning of such variational derivatives is to be adequately assessed. Insofar as they were originally intended to embody the mind's collusion with nature by linking atomicity of matter with atomicity of action, variational derivatives were at first assigned a dual role requiring them not only to express the mind's craving for permanence but also to tune the mind's privileged pattern to “Nature's own idea”. At a later stage, by contrast, as affine field theory came to provide a framework for world-building, such “Hamiltonian differentiation” fell out of tune through gauge-invariance and, by disregarding how mathematical theory might precisely come into contact with the actual world, was turned into a mere heuristic device for structural knowledge.

7.
Naturalized metaphysics remains the default presupposition of much contemporary philosophy of physics. As metaphysics is supposed to concern the general structure of reality, so scientific naturalism draws upon our best physical theories to attempt to answer the foundational question par excellence, viz., “how could the world possibly be the way this theory says it is?” A particular case study, Hilbert's attempt to analyze and explain a seeming “pre-established harmony” between mind and nature, is offered as a salutary reminder that naturalism's ready inference from physical theory to ontology may be too quick.

8.
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—that we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” (Weyl, 1931a, p. xxi) in terms of a correspondence between the ontological categories of identity and determinateness.

9.
Building on Norton's “material theory of induction,” this paper shows through careful historical analysis that analogy can act as a methodological principle or stratagem, providing experimentalists with a useful framework to assess data and devise novel experiments. Although this particular case study focuses on late eighteenth and early nineteenth-century experiments on the properties and composition of acids, the results of this investigation may be extended and applied to other research programs. Occupying a stage in between what Steinle calls “exploratory experimentation” and robust theory, analogy, I argue, encouraged research to substantiate why the likenesses should outweigh the differences (or vice versa) when evaluating results and designing experiments.

10.
In early 1925, Wolfgang Pauli (1900–1958) published the paper for which he is now most famous and for which he received the Nobel Prize in 1945. The paper detailed what we now know as his “exclusion principle.” This essay situates the work leading up to Pauli's principle within the traditions of the “Sommerfeld School,” led by Munich University's renowned theorist and teacher, Arnold Sommerfeld (1868–1951). In a substantial corrective to previous accounts of the birth of quantum mechanics, which have tended to sideline Sommerfeld's work, it is suggested here that both the method and the content of Pauli's paper drew substantially on the work of the Sommerfeld School in the early 1920s. Part One describes Sommerfeld's turn away from a faith in the power of model-based (modellmässig) methods in his early career towards the use of a more phenomenological emphasis on empirical regularities (Gesetzmässigkeiten) during precisely the period that both Pauli and Werner Heisenberg (1901–1976), among others, were his students. Part Two delineates the importance of Sommerfeld's phenomenology to Pauli's methods in the exclusion principle paper, a paper that also eschewed modellmässig approaches in favour of a stress on Gesetzmässigkeiten. In terms of content, a focus on Sommerfeld's work reveals the roots of Pauli's understanding of the fundamental Zweideutigkeit (ambiguity) involving the quantum number of electrons within the atom. The conclusion points to the significance of these results to an improved historical understanding of the origin of aspects of Heisenberg's 1925 paper on the “Quantum-theoretical Reformulation (Umdeutung) of Kinematical and Mechanical Relations.”

11.
In The Theory of Relativity and A Priori Knowledge (1920b), Reichenbach developed an original account of cognition as coordination of formal structures to empirical ones. One of the most salient features of this account is that it is explicitly not a top-down type of coordination, and in fact it is crucially “directed” by the empirical side. Reichenbach called this feature “the mutuality of coordination” but, in that work, did not elaborate sufficiently on how this is supposed to work. In a paper that he wrote less than two years afterwards (but that he published only in 1932), “The Principle of Causality and the Possibility of its Empirical Confirmation” (1923/1932), he described what seems to be a model for this idea, now within an analysis of causality that results in an account of scientific inference. Recent reassessments of his early proposal do not seem to capture the extent of Reichenbach's original worries. The present paper analyses Reichenbach's early account and suggests a new way to look at his early work. According to it, we perform measurements, individuate parameters, and collect and analyse data by using a “constructive” approach, such as the one with which we formulate and test hypotheses, which paradigmatically requires some simplicity assumptions. Reichenbach's attempt to account for all these aspects in 1923 was obviously limited and naive in many ways, but it shows that, in his view, there were multiple ways in which the idea of “constitution” is embodied in scientific practice.

12.
Cramer's Transactional Interpretation (TI) is applied to the “quantum liar experiment” (QLE). It is shown how some apparently paradoxical features can be explained naturally, albeit nonlocally (since TI is an explicitly nonlocal interpretation, at least from the vantage point of ordinary spacetime). At the same time, it is proposed that in order to preserve the elegance and economy of the interpretation, it may be necessary to consider offer and confirmation waves as propagating in a “higher space” of possibilities.

13.
Despite aspirations to substitute animal experimentation with alternative methods and recent progress in the area of non-animal approaches, such as organoids and organ(s)-on-a-chip technologies, there is no extensive replacement of animal-based research in biomedicine. In this paper, I will analyse this state of affairs with reference to key institutional and socio-epistemic barriers for the development and use of non-animal approaches in the context of biomedical research in Europe. I will argue that there exist several factors that inhibit change in this context. In particular, there is what I call “scientific inertia”, i.e. a certain degree of conservatism in scientific practice regarding the development and use of non-animal approaches to replace animal experimentation. This type of inertia is facilitated by socio-epistemic characteristics of animal-based research in the life sciences and is a key factor in understanding the status quo in biomedical research. The underlying reasons for scientific inertia have not received sufficient attention in the literature to date because the phenomenon transcends traditional disciplinary boundaries in the study of animal experimentation. This paper addresses this issue and seeks to contribute to a better understanding of scientific inertia by using a methodology that looks at the interplay of institutional, epistemic, and regulatory aspects of animal-based research.

14.
Quine's “naturalized epistemology” presents a challenge to Carnapian explication: why try to rationally reconstruct probabilistic concepts instead of just doing psychology? This paper tracks the historical development of Richard C. Jeffrey who, on the one hand, voiced worries similar to Quine's about Carnapian explication but, on the other hand, claims that his own work in formal epistemology—what he calls “radical probabilism”—is somehow continuous with both Carnap's method of explication and logical empiricism. By examining how Jeffrey's claim could possibly be accurate, the paper suggests that Jeffrey's radical probabilism can be seen as a sort of alternative explication project to Carnap's own inductive logic. In so doing, it deflates Quine's worries about Carnapian explication and, by extension, similar worries about formal epistemology.

15.
“Colligation”, a term first introduced in philosophy of science by William Whewell (1840), today sparks a renewed interest beyond Whewell scholarship. In this paper, we argue that adopting the notion of colligation in current debates in philosophy of science can contribute to our understanding of scientific models. Specifically, studying colligation allows us to have a better grasp of how integrating diverse model components (empirical data, theory, useful idealization, visual and other representational resources) in a creative way may produce novel generalizations about the phenomenon investigated. Our argument is built both on the theoretical appraisal of Whewell’s philosophy of science and the historical rehabilitation of his scientific work on tides. Adopting a philosophy of science in practice perspective, we show how colligation emerged from Whewell’s empirical work on tides. The production of idealized maps (“cotidal maps”) illustrates the unifying and creative power of the activity of colligating in scientific practice. We show the importance of colligation in modelling practices more generally by looking at its epistemic role in the construction of the San Francisco Bay Model.

16.
Arnold Arluke and Clinton Sanders (1996) have argued that human societies index both humans and animals as belonging to particular rungs of the social hierarchy. They term this multispecies ranking the “sociozoological scale”. This paper will investigate how claims at the 1875 Royal Commission on Vivisection about the sensitivity of particular species and breeds not only reflected assumptions about human social hierarchy but also blurred the boundaries between the human and the animal in the process. It will further be shown how these claims were informed by 18th- and 19th-century humanitarianism, classism, scientific racism and evolutionary theory, and how these influences combined in claims-making about the relative capacity of particular animals to sense pain and the ethical duties towards them that followed from this sensitivity. Particular attention will be given to the opposing efforts of commissioners Thomas Henry Huxley and Richard Holt Hutton to demarcate human and animal sensitivity and to exempt companion animals from vivisection, respectively. The paper concludes by considering the sociozoological orders constituted by the 1876 Cruelty to Animals Act, particularly through its focus on calculating pain, and the legacy and limitations of this constitution.

17.
Theories are composed of multiple interacting components. I argue that some theories have narratives as essential components, and that narratives function as integrative devices of the mathematical components of theories. Narratives represent complex processes unfolding in time as a sequence of stages, and hold the mathematical elements together as pieces in the investigation of a given process. I present two case studies from population genetics: R. A. Fisher's “mass selection” theory, and Sewall Wright's shifting balance theory. I apply my analysis to an early episode of the “R. A. Fisher – Sewall Wright controversy.”

18.
A number of scholars have recently drawn attention to the importance of iteration in scientific research. This paper builds on these previous discussions by drawing a distinction between epistemic and methodological forms of iteration and by clarifying the relationships between them. As defined here, epistemic iteration involves progressive alterations to scientific knowledge claims, whereas methodological iteration refers to an interplay between different modes of research practice. While distinct, these two forms of iteration are related in important ways. Contemporary research on the biological effects of nanomaterials illustrates that methodological iteration can help to “initiate,” “equip,” and “stimulate” epistemic iteration.

19.
The aim of this article is to shed light on an understudied aspect of Giordano Bruno's intellectual biography, namely, his career as a mathematical practitioner. Early interpreters, especially, have criticized Bruno's mathematics for being “outdated” or too “concrete”. However, thanks to developments in the study of early modern mathematics and the rediscovery of Bruno's first mathematical writings (four dialogues on Fabrizio Mordente's proportional compass), we are in a position to better understand Bruno's mathematics. In particular, this article aims to reopen the question of whether Bruno anticipated the concept of infinitesimal quantity. It does so by providing an analysis of the dialogues on Mordente's compass and of the historical circumstances under which those dialogues were written.

20.
The goal of this paper is to provide an interpretation of Feyerabend's metaphysics of science as found in late works like Conquest of Abundance and Tyranny of Science. Feyerabend's late metaphysics consists of an attempt to criticize and provide a systematic alternative to traditional scientific realism, a package of views he sometimes referred to as “scientific materialism.” Scientific materialism is objectionable not only on metaphysical grounds, or because it provides a poor ground for understanding science, but because it implies problematic claims about the epistemic and cultural authority of science, claims incompatible with situating science properly in democratic societies. I show how Feyerabend's metaphysical view, which I call “the abundant world” or “abundant realism,” constitutes a sophisticated and challenging form of ontological pluralism that makes interesting connections with contemporary philosophy of science and issues of the political and policy role of science in a democratic society.

