Similar Documents
20 similar documents found
1.
2.
Bruno Latour claims to have shown that a Kantian model of knowledge, which he describes as seeking to unite a disembodied transcendental subject with an inaccessible thing-in-itself, is dramatically falsified by empirical studies of science in action. Instead, Latour puts central emphasis on scientific practice, and replaces this Kantian model with a model of “circulating reference.” Unfortunately, Latour's alternative schematic leaves out the scientific subject. I repair this oversight through a simple mechanical procedure. By putting a slight spin on Latour's diagrammatic representation of his theory, I discover a new space for a post-Kantian scientific subject, a subject brilliantly described by Ludwik Fleck. The neglected subjectivities and ceaseless practices of science are thus re-united.

3.
It is widely recognized that scientific theories are often associated with strictly inconsistent models, but there is little agreement concerning the epistemic consequences. Some argue that model inconsistency supports a strong perspectivism, according to which claims serving as interpretations of models are inevitably and irreducibly perspectival. Others argue that in at least some cases, inconsistent models can be unified as approximations to a theory with which they are associated, thus undermining this kind of perspectivism. I examine the arguments for perspectivism, and contend that its strong form is defeasible in principle, not merely in special cases. The argument rests on the plausibility of scientific knowledge concerning non-perspectival, dispositional facts about modelled systems. This forms the basis of a novel suggestion regarding how to understand the knowledge these models afford, in terms of a contrastive theory of what-questions.

4.
5.
Claims that the standard procedure for testing scientific theories is inapplicable to Everettian quantum theory, and hence that the theory is untestable, are due to misconceptions about probability and about the logic of experimental testing. Refuting those claims by correcting those misconceptions leads to an improved theory of scientific methodology (based on Popper's) and testing, which allows various simplifications, notably the elimination of everything probabilistic from the methodology (‘Bayesian’ credences) and from fundamental physics (stochastic processes).

6.
In this paper I assess whether the recently proposed “No De-Coupling” (NDC) theory of constitutive relevance in mechanisms is a useful tool to reconstruct constitutive relevance investigations in scientific practice. The NDC theory has been advanced as a framework theoretically superior to the mutual manipulability (MM) account of constitutive relevance in mechanisms but, in contrast to the MM account, has not yet been applied to detailed case studies. I argue that the NDC account is also applicable to empirical practice and that it fares better than the MM account on both theoretical and empirical grounds. I elaborate these claims in terms of applications of the NDC theory to two case studies of cognitive science research on the role of eye movements in mechanisms for cognitive capacities.

7.
I began this study with Laudan's argument from the pessimistic induction and I promised to show that the caloric theory of heat cannot be used to support the premisses of the meta-induction on past scientific theories. I tried to show that the laws of experimental calorimetry, adiabatic change and Carnot's theory of the motive power of heat were (i) independent of the assumption that heat is a material substance, (ii) approximately true, and (iii) deducible and accounted for within thermodynamics. I stressed that results (i) and (ii) were known to most theorists of the caloric theory, and that result (iii) was put forward by the founders of the new thermodynamics. In other words, the truth-content of the caloric theory was located, carefully selected, and preserved by the founders of thermodynamics.
However, the reader might think that even if I have succeeded in showing that Laudan is wrong about the caloric theory, I have not shown how the strategy followed in this paper can be generalised against the pessimistic meta-induction. I think the general strategy against Laudan's argument suggested in this paper is this: the empirical success of a mature scientific theory suggests that there are respects and degrees in which this theory is true. The difficulty for — and the real challenge to — philosophers of science is to suggest ways in which this truth-content can be located and shown to be preserved — if at all — in subsequent theories. In particular, the empirical success of a theory does not automatically suggest that all theoretical terms of the theory refer. On the contrary, judgments of referential success depend on which theoretical claims are well supported by the evidence. This is a matter of specific investigation. Generally, one would expect that claims about theoretical entities which are not strongly supported by the evidence, or which turn out to be independent of the evidence at hand, are not compelling. For, simply, if the evidence does not make it likely that our beliefs about putative theoretical entities are approximately correct, belief in those entities would be ill-founded and unjustified. Theoretical extrapolations in science are indispensable, but they are not arbitrary. If the evidence does not warrant them, I do not see why someone should commit herself to them. In a sense, the problem with empiricist philosophers is not that they demand that theoretical beliefs be warranted by evidence; rather, it is that they claim that no evidence can warrant theoretical beliefs. A realist philosopher of science would not disagree with the first claim, but she has good grounds to deny the second.
I argued that claims about theoretical entities which are not strongly supported by the evidence must not be taken as belief-worthy. But can one sustain the more ambitious view that loosely supported parts of a theory tend to be just those that include non-referring terms? There is an obvious risk in such a generalisation. For there are well-known cases in which a theoretical claim was initially weakly supported by the evidence

8.
The goal of this paper is to provide an interpretation of Feyerabend's metaphysics of science as found in late works like Conquest of Abundance and Tyranny of Science. Feyerabend's late metaphysics consists of an attempt to criticize, and provide a systematic alternative to, traditional scientific realism, a package of views he sometimes referred to as “scientific materialism.” Scientific materialism is objectionable not merely on metaphysical grounds, nor only because it provides a poor basis for understanding science, but because it implies problematic claims about the epistemic and cultural authority of science, claims incompatible with situating science properly in democratic societies. I show how Feyerabend's metaphysical view, which I call “the abundant world” or “abundant realism,” constitutes a sophisticated and challenging form of ontological pluralism that makes interesting connections with contemporary philosophy of science and with issues of the political and policy role of science in a democratic society.

9.
Several recent authors identify structural realism about scientific theories with the claim that the content of a scientific theory is expressible using its Ramsey sentence. Many of these authors have also argued that so understood, the view collapses into empiricist anti-realism, since an argument originally proposed by Max Newman in a review of Bertrand Russell’s The analysis of matter demonstrates that Ramsey sentences are trivially satisfied, and cannot make any significant claims about unobservables. In this paper I argue against both of these claims. Structural realism and Ramsey sentence realism are, in their most defensible versions, importantly different doctrines, and neither is committed to the premises required to demonstrate that they collapse into anti-realism.

10.
Alison Gopnik and Andrew Meltzoff have argued for a view they call the ‘theory theory’: theory change in science and in children is similar. While their version of the theory theory has been criticized for depending on a number of disputed claims, we argue that there is a much more basic problem: the theory theory is multiply ambiguous. We show that it might be claiming that a similarity holds between theory change in children and (i) individual scientists, (ii) a rational reconstruction of a Superscientist, or (iii) the scientific community. We argue that (i) is false, (ii) is non-empirical (which is problematic since the theory theory is supposed to be a bold empirical hypothesis), and (iii) is either false or doesn't make enough sense to have a truth-value. We conclude that the theory theory is an interesting failure. Its failure points the way to a full, empirical picture of scientific development, one that marries a concern with the social dynamics of science to a psychological theory of scientific cognition.

11.
This article is about structural realism, historical continuity, laws of nature, and ceteris paribus clauses. Fresnel's Laws of optics support Structural Realism because they are a scientific structure that has survived theory change. However, the history of Fresnel's Laws as depicted in debates over realism since the 1980s is badly distorted. Specifically, claims that J. C. Maxwell or his followers believed in an ontologically-subsistent electromagnetic field, and gave up the aether, before Einstein's annus mirabilis in 1905 are indefensible. Related claims that Maxwell himself did not believe in a luminiferous aether are also indefensible. This paper corrects the record. In order to trace Fresnel's Laws across significant ontological changes, they must be followed past Einstein into modern physics and nonlinear optics. I develop the philosophical implications of a more accurate history, and analyze Fresnel's Laws' historical trajectory in terms of dynamic ceteris paribus clauses. Structuralists have not embraced ceteris paribus laws, but they continue to point to Fresnel's Laws to resist anti-realist arguments from theory change. Fresnel's Laws fit the standard definition of a ceteris paribus law as a law applicable only in particular circumstances. Realists who appeal to the historical continuity of Fresnel's Laws to combat anti-realists must incorporate ceteris paribus laws into their metaphysics.

12.
Under what circumstances, if any, are we warranted to assert that a theory is true or at least has some truth content? Scientific realists answer that such assertions are warranted only for those theories or theory-parts that enjoy explanatory and predictive success. A number of challenges to this answer have emerged, chief among them those arising from scientific theory change. For example, if, as scientific realists suggest, successive theories are to increasingly get closer to the truth, any theory changes must not undermine (i) the accumulation of explanatory and predictive success and (ii) the theoretical content responsible for that success. In this paper we employ frame theory to test to what extent certain theoretical claims made by the outdated caloric theory of heat and that, prima facie at least, were used to produce some of that theory’s success have survived into the theory that superseded it, i.e. the kinetic theory of heat. Our findings lend credence to structural realism, the view that scientific theories at best reveal only structural features of the unobservable world.

13.
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architectures. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market extends far beyond the community of quantum theory experts. These claims will be substantiated by an investigation of so-called density functional theory (DFT), arguably the pivotal theory in the turn to computational quantum chemistry around 1990.

14.
Robin Hendry has recently argued that although the term ‘element’ has traditionally been used in two different senses (basic substance and simple substance), there has nonetheless been a continuity of reference. The present article examines this author’s historical and philosophical claims and suggests that he has misdiagnosed the situation in several respects. In particular it is claimed that Hendry’s arguments for the nature of one particular element, oxygen, do not generalize to all elements as he implies. The second main objection is to Hendry’s view that the qua problem can be illuminated by appeal to the intention of scientists.

15.
The concept of phenomenotechnique has been regarded as Bachelard's most original contribution to the philosophy of science. Innovative as this neologism may seem, it benefited from a generation of debates on the nature and status of scientific facts, among conventionalist thinkers and their opponents. Granting that Bachelard stood among the opponents to conventionalism, this article nonetheless reveals deep similarities between his work and that of two conventionalist thinkers who insisted on what we call today the theory-ladenness of scientific experiment: Pierre Duhem and Édouard Le Roy. This article, therefore, compares Bachelard's notion of phenomenotechnique with Duhem's developments on the double character of scientific instruments, and with Le Roy's claim that scientific facts are fabricated to meet the requirements of theory. It shows how Bachelard retained Duhem and Le Roy's views on the interplay between theory and experiment but rejected their sceptical conclusions on the limitations of experimental control. It claims that this critical inheritance of conventionalism was made possible by a reflection on technology, which led Bachelard to re-evaluate the artificiality of scientific facts: instead of regarding this artificiality as a limitation of science, as Le Roy did, he presented it as a condition for objective knowledge.

16.
I distinguish between two ways in which Kuhn employs the concept of incommensurability, based on whom it presents a problem for. First, I argue that Kuhn’s early work focuses on the comparison and underdetermination problems scientists encounter during revolutionary periods (actors’ incommensurability), whilst his later work focuses on the translation and interpretation problems analysts face when they engage in the representation of science from earlier periods (analysts’ incommensurability). Secondly, I offer a new interpretation of actors’ incommensurability. I challenge Kuhn’s account of incommensurability, which is based on the compartmentalisation of the problems of both underdetermination and non-additivity to revolutionary periods. Through employing a finitist perspective, I demonstrate that in principle these are also problems scientists face during normal science. I argue that the reason why in certain circumstances scientists have little difficulty in concurring over their judgements of scientific findings and claims, while in others they disagree, needs to be explained sociologically rather than by reference to underdetermination or non-additivity. Thirdly, I claim that disagreements between scientists should not be couched in terms of translation or linguistic problems (aspects of analysts’ incommensurability), but should be understood as arising out of scientists’ differing judgments about how to take scientific inquiry further.

17.
I claim that one way thought experiments contribute to scientific progress is by increasing scientific understanding. Understanding does not have a currently accepted characterization in the philosophical literature, but I argue that we already have ways to test for it. For instance, current pedagogical practice often requires that students demonstrate being in either or both of the following two states: 1) having grasped the meaning of some relevant theory, concept, law or model; 2) being able to apply that theory, concept, law or model fruitfully to new instances. I present three thought experiments that have been historically important in helping us pass these tests, and two others that cause us to fail them. Then I use this operationalization of understanding to clarify the relationships between scientific thought experiments, the understanding they produce, and the progress they enable. I conclude that while no specific instance of understanding (thus conceived) is necessary for scientific progress, understanding in general is.

18.
Philosophers now commonly reject the value-free ideal for science by arguing that non-epistemic values, including personal or social values, are permissible within the core of scientific research. However, little attention has been paid to the normative political consequences of this position. This paper explores these consequences and shows how political theory is fruitful for proceeding in a world without value-neutral science. I draw attention to an oft-overlooked argument employed by proponents of the value-free ideal, which I dub the “political legitimacy argument.” This argument claims that the value-free ideal follows directly from the foundational principles of liberal democracy. If so, then the use of value-laden scientific information within democratic decision making would be illegitimate on purely political grounds. Despite highlighting this unaddressed and important argument, I show how it can be rejected. By appealing to deliberative democratic theory, I demonstrate that scientific information can be value-laden and politically legitimate. The deliberative democratic account I develop is well suited to capturing the intuitions of many opponents of the value-free ideal and points to a new set of questions for those interested in values in science.

19.
The paper argues that Helen Longino’s pluralism implies circularity: it demands a preferably high number of qualified contributions to any scientific discussion that aims at objectivity, but it does not address the question of who or what sets and applies the standards that determine who is qualified to contribute and who is not. Objectivity is thus presupposed by the very process that is supposed to generate it. Philip Kitcher’s ideal of a democratization of science seems only to bypass the problem by introducing ideal deliberators tutored by appropriate experts, since to implement this ideal the deliberators and experts, again, would have to be appointed by someone. However, Kitcher’s approach is based on a Rawlsian egalitarianism and in this sense calls for political intrusion, which could be based on case-by-case decisions. This offers a solution. I illuminate the problem with some examples from climatology and demonstrate how Kitcher’s approach can help to tackle it in a final case study of pluralism in the Intergovernmental Panel on Climate Change.

20.
Contrary to Sankey’s central assumption, incommensurability does not imply incomparability of content, nor threaten scientific realism by challenging the rationality of theory comparison. Moreover, Sankey equivocates between reference to specific entities by statements used to test theories and reference to kinds by theories themselves. This distinction helps identify and characterize the genuine threat that incommensurability poses to realism, which is ontological discontinuity as evidenced in the historical record: Successive theories reclassify objects into mutually exclusive sets of kinds to which they refer. That is why claiming that scientific progress is an increasingly better approximation to truth is difficult to justify. Similarly, Sankey’s attack on neo-Kantian antirealist positions is based on his misunderstanding of some of the central terms of those positions, making most of his attack on them ineffectual, including his diagnosis of their incoherence. We conclude by reiterating our conviction that in this debate meta-incommensurability plays an important role.
