Similar documents
20 similar documents found
1.
How should we understand scientific progress? Kuhn famously discussed science as its own internally driven venture, structured by paradigms. He also famously had a problem describing progress in science, as problem-solving ability failed to provide a clear rubric across paradigm change—paradigm changes tossed out problems as well as solving them. I argue here that much of Kuhn’s inability to articulate a clear view of scientific progress stems from his focus on pure science and a neglect of applied science. I trace the history of the distinction between pure and applied science, showing how the distinction came about, the rhetorical uses to which the distinction has been put, and how pure science came to be more valued by both scientists and philosophers. I argue that the distinction between pure and applied science does not stand up to philosophical scrutiny, and that once we relinquish it, we can provide Kuhn with a clear sense of scientific progress. It is not one, though, that will ultimately prove acceptable. For that, societal evaluations of scientific work are needed.

2.
This paper reconsiders the challenge presented to scientific realism by the semantic incommensurability thesis. A twofold distinction is drawn between methodological and semantic incommensurability, and between semantic incommensurability due to variation of sense and due to discontinuity of reference. Only the latter presents a challenge to scientific realism. The realist may dispose of this challenge on the basis of a modified causal theory of reference, as argued in the author’s 1994 book, The Incommensurability Thesis. This referential response has been the subject of a charge of meta-incommensurability by Hoyningen-Huene et al. (1996), who argue that the realist’s referential response begs the question against anti-realist advocates of incommensurability. In reply, it is noted that a tu quoque rejoinder is available to the realist. It is also argued that the dialectical situation favours the scientific realist, since the anti-realist defence of incommensurability depends on an incoherent distinction between the phenomenal world and the world-in-itself. In light of such incoherence, and a strong commonsense presumption in favour of realism, the referential response to semantic incommensurability may justifiably be based on realism.

3.
In this paper, I investigate the nature of empirical findings that provide evidence for the characterization of a scientific phenomenon, and the defeasible nature of this evidence. To do so, I explore an exemplary instance of the rejection of a characterization of a scientific phenomenon: memory transfer. I examine the reason why the characterization of memory transfer was rejected, and analyze how this rejection was tied to researchers’ failures to resolve experimental issues relating to replication and confounds. I criticize the presentation of the case by Harry Collins and Trevor Pinch, who claim that no sufficient reason was provided to abandon research on memory transfer. I argue that skeptics about memory transfer adopted what I call a defeater strategy, in which researchers exploit the defeasibility of the evidence for a characterization of a phenomenon.

4.
At first glance there seem to be many similarities between Thomas S. Kuhn’s and Ludwik Fleck’s accounts of the development of scientific knowledge. Notably, both pay attention to the role played by the scientific community in the development of scientific knowledge. But putting first impressions aside, one can criticise some philosophers for being too hasty in their attempt to find supposed similarities in the works of the two men. Having acknowledged that Fleck anticipated some of Kuhn’s later theses, there seems to be a temptation in more recent research to equate both theories in important respects. Because of this approach, one has to deal with the problem of comparing the most notable technical terms of both philosophers, namely “thought style” and “paradigm”. This paper aims at a more thorough comparison between Ludwik Fleck’s concept of thought style and Thomas Kuhn’s concept of paradigm. Although some philosophers suggest that these two concepts are essentially equal in content, a closer examination reveals that this is not the case. This thesis of inequality will be defended in detail, also taking into account some of the alleged similarities which may be responsible for losing sight of the differences between these theories.

5.
At first glance there seem to be many similarities between Thomas S. Kuhn’s and Ludwik Fleck’s accounts of the development of scientific knowledge. Notably, both pay attention to the role played by the scientific community in the development of scientific knowledge. But putting first impressions aside, one can criticise some philosophers for being too hasty in their attempt to find supposed similarities in the works of the two men. Having acknowledged that Fleck anticipated some of Kuhn’s later theses, there seems to be a temptation in more recent research to equate both theories in important respects. Because of this approach, one has to deal with the problem of comparing the most notable technical terms of both philosophers, namely “thought style” and “paradigm”. This paper aims at a more thorough comparison between Ludwik Fleck’s concept of thought style and Thomas Kuhn’s concept of paradigm. Although some philosophers suggest that these two concepts are essentially equal in content, a closer examination reveals that this is not the case. This thesis of inequality will be defended in detail, also taking into account some of the alleged similarities which may be responsible for losing sight of the differences between these theories.

6.
Advocates of the self-corrective thesis argue that scientific method will refute false theories and find closer approximations to the truth in the long run. I discuss a contemporary interpretation of this thesis in terms of frequentist statistics in the context of the behavioral sciences. First, I identify experimental replications and the systematic aggregation of evidence (meta-analysis) as the self-corrective mechanism. Then, I present a computer simulation study of scientific communities that implement this mechanism, arguing that frequentist statistics may or may not converge upon a correct estimate depending on the social structure of the community that uses it. Based on this study, I argue that methodological explanations of the “replicability crisis” in psychology are limited, and I propose an alternative explanation in terms of biases. Finally, I conclude by suggesting that scientific self-correction should be understood as an interaction effect between inference methods and social structures.
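The paper's own simulation is not reproduced here, but a minimal sketch of the kind of mechanism at issue can make the claim concrete. In the toy model below (all names and parameters are hypothetical, not taken from the paper), every lab estimates the same true effect with a standard frequentist test, and the community pools published results by inverse-variance weighting; whether the pooled estimate converges on the truth depends on whether publication practices are selective.

    import random
    import statistics

    def run_community(true_effect=0.2, n_labs=200, n_per_study=30,
                      publish_only_significant=False, seed=1):
        """Each lab estimates the same true effect from noisy data; the community
        pools the published estimates in a crude fixed-effect meta-analysis."""
        rng = random.Random(seed)
        published = []
        for _ in range(n_labs):
            sample = [rng.gauss(true_effect, 1.0) for _ in range(n_per_study)]
            est = statistics.mean(sample)
            se = statistics.stdev(sample) / n_per_study ** 0.5
            significant = abs(est / se) > 1.96  # two-sided test at alpha = 0.05
            if significant or not publish_only_significant:
                published.append((est, se))
        weights = [1.0 / se ** 2 for _, se in published]  # inverse-variance weights
        pooled = sum(w * est for (est, _), w in zip(published, weights)) / sum(weights)
        return pooled

    print(run_community(publish_only_significant=False))  # close to the true effect 0.2
    print(run_community(publish_only_significant=True))   # biased upward by selective publication

The only difference between the two runs is a feature of the community's publication practice, not of the statistical method itself, which is the kind of interaction effect the abstract points to.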

7.
Recent literature in the scientific realism debate has been concerned with a particular species of statistical fallacy concerning base rates, and with the worry that no matter how predictively successful our contemporary scientific theories may be, this will tell us absolutely nothing about the likelihood of their truth if our overall sample space contains enough empirically adequate theories that are nevertheless false. In response, both realists and anti-realists have switched their focus from general arguments concerning the reliability and historical track record of our scientific methodology to a series of specific arguments and case studies concerning our reasons to believe individual scientific theories to be true. Such a development, however, sits in tension with the usual understanding of the scientific realism debate as offering a second-order assessment of our first-order scientific practices, and threatens to undermine the possibility of a distinctive philosophical debate over the approximate truth of our scientific theories. I illustrate this concern with three recent attempts to offer a more localised understanding of the scientific realism debate—due to Stathis Psillos, Juha Saatsi, and Kyle Stanford—and argue that none of these alternatives offers a satisfactory response to the problem.
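Put schematically (the numbers below are purely illustrative, not drawn from any participant in the debate), the base-rate worry is Bayes' theorem applied to truth $T$ and predictive success $S$:
\[
P(T \mid S) \;=\; \frac{P(S \mid T)\,P(T)}{P(S \mid T)\,P(T) + P(S \mid \neg T)\,P(\neg T)},
\]
so that even if every true theory is predictively successful, $P(S \mid T) = 1$, a low base rate of true theories such as $P(T) = 0.01$ together with $P(S \mid \neg T) = 0.1$ yields $P(T \mid S) \approx 0.09$: success alone licenses little confidence in truth when empirically adequate but false rivals are sufficiently common.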

8.
For many years, scientific heritage has received attention from multiple actors from different spheres of activity—archives, museums, scientific institutions. Beyond the heterogeneity revealed when examining the place of scientific heritage in different settings, an authentic patrimonial configuration emerges, taking the form of a nebula of claims and accomplishments that result, in some cases, in institutional and political recognition at the national level, in various countries around the world. At the international level, the creation of the international committee dedicated to University Museums and Collections (UMAC) within the International Council of Museums (ICOM) certainly testifies to this rising interest in academic heritage and to the existence of a specific community concerned with it. This article presents numerous initiatives for the preservation of scientific heritage in France, with the goal of analysing the relationship scientists have with their heritage. We argue that scientific communities have a special relationship with heritage, one characterized by a number of ambiguities. We show that such ambivalences allow analysis of the identity, disciplinary, professional, and social issues operative in defining heritage and being redefined by it. To explore these dimensions, we present three case studies. The first traces the institutional uses of heritage by a scientific institution, the Commissariat à l’énergie atomique (CEA), through the transformation of the first French atomic reactor (ZOE) into a museum. The second describes the initiatives of French astronomers from the mid-1990s to construct a national programme for the protection of astronomy heritage. Lastly, we recount the case of universities, with the example of the Université de Strasbourg.

9.
According to the foundationalist picture, shared by many rationalists and positivist empiricists, science makes cognitive progress by accumulating justified truths. Fallibilists, who point out that complete certainty cannot be achieved in empirical science, can still argue that even successions of false theories may progress toward the truth. This proposal was supported by Karl Popper with his notion of truthlikeness, or verisimilitude. Popper’s own technical definition failed, but the idea that scientific progress means increasing truthlikeness can be expressed by defining degrees of truthlikeness in terms of similarities between states of affairs. This paper defends the verisimilitude approach against Alexander Bird, who argues that the “semantic” definition (in terms of truth or truthlikeness alone) is not sufficient to define progress, and that the “epistemic” definition, which refers to justification and knowledge, is more adequate. Bird here ignores the crucial distinction between real progress and estimated progress, explicated by the difference between absolute (and usually unknown) degrees of truthlikeness and their evidence-relative expected values. Further, it is argued that Bird’s idea of returning to the cumulative model of growth requires an implausible trick of transforming past false theories into true ones.
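In the similarity framework the abstract alludes to, the distinction Bird is said to ignore can be stated schematically as follows (the notation is illustrative, not the paper's own). Real progress from theory $A$ to theory $B$ requires
\[
Tr(B, C^{*}) > Tr(A, C^{*}),
\]
where $C^{*}$ is the (usually unknown) complete truth and $Tr$ a similarity-based truthlikeness measure, whereas estimated progress relative to evidence $e$ requires only
\[
\mathrm{ver}(B \mid e) \;=\; \sum_{i} P(C_i \mid e)\, Tr(B, C_i) \;>\; \mathrm{ver}(A \mid e),
\]
with the $C_i$ the mutually exclusive candidate states of affairs. The first comparison is absolute and typically inaccessible; the second is computable from the evidence and can turn out to be mistaken.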

10.
This paper motivates and outlines a new account of scientific explanation, which I term ‘collaborative explanation.’ My approach is pluralist: I do not claim that all scientific explanations are collaborative, but only that some important scientific explanations are—notably those of complex organic processes like development. Collaborative explanation is closely related to what philosophers of biology term ‘mechanistic explanation’ (e.g., Machamer et al., 2000; Craver, 2007). I begin with minimal conditions for mechanisms: complexity, causality, and multilevel structure. Different accounts of mechanistic explanation interpret and prioritize these conditions in different ways. This framework reveals two distinct varieties of mechanistic explanation: causal and constitutive. The two have heretofore been conflated, with philosophical discussion focusing on the former. This paper addresses the imbalance, using a case study of modeling practices in Systems Biology to reveal key features of constitutive mechanistic explanation. I then propose an analysis of this variety of mechanistic explanation in terms of collaborative concepts, and sketch the outlines of a general theory of collaborative explanation. I conclude with some reflections on the connection between this variety of explanation and social aspects of scientific practice.

11.
This paper argues for the theory-ladenness of evidence. It does so by employing and analysing an episode from the history of eighteenth-century chemistry. It delineates attempts by Joseph Priestley and Antoine Lavoisier to construct entirely different kinds of evidence for and against a particular hypothesis from a set of agreed-upon observations or (raw) data. Based on an augmented version of a distinction, drawn by J. Bogen and J. Woodward, between data and phenomena, it is shown that theoretical auxiliary assumptions play a very important role in constructing evidence for (or against) a theory from observations or (raw) data. In revolutionary situations, rival groups hold radically different theories and theoretical auxiliary assumptions. These are employed to construct very different evidence from the agreed-upon set of observations or (raw) data. Hence, theory resolution becomes difficult. It is argued that evidence construction is a multi-layered exercise and can be disputed at any level. What counts as unproblematic observation or (raw) data at one level may become problematic at another level. The contingency of these constructions and the (un)problematic nature of evidence are shown to be partially dependent upon the scientific knowledge that the scientific community possesses.

12.
In this paper I challenge, and adjudicate between, the two positions that have come to prominence in the scientific realism debate: deployment realism and structural realism. I discuss a set of cases from the history of celestial mechanics, including some of the most important successes in the history of science. To the surprise of the deployment realist, these are novel predictive successes toward which theoretical constituents that are now seen to be patently false were genuinely deployed. Exploring the implications for structural realism, I show that the need to accommodate these cases forces our notion of “structure” toward a dramatic depletion of logical content, threatening to render it explanatorily vacuous: the better structuralism fares against these historical examples, in terms of retention, the worse it fares in content and explanatory strength. I conclude by considering recent restrictions that serve to make “structure” more specific. I show, however, that these refinements will not suffice: the better structuralism fares in specificity and explanatory strength, the worse it fares against history. In light of these case studies, both deployment realism and structural realism are significantly threatened by the very historical challenge they were introduced to answer.

13.
This introductory essay to the special issue on ‘understanding without explanation’ provides a review of the debate in philosophy of science concerning the relation between scientific explanation and understanding, and an overview of the themes addressed in the papers included in this issue. In recent years, the traditional consensus that understanding is a philosophically irrelevant by-product of scientific explanations has given way to a lively debate about the relation between understanding and explanation. The papers in this issue defend or challenge the idea that understanding is a cognitive achievement in its own right, rather than simply a derivative or side-effect of scientific explanations.

14.
The translation of a mathematical model into a numerical one employs various modifications in order to make the model amenable to computation. Such modifications include discretizations, approximations, heuristic assumptions, and other methods. The paper investigates the divergent styles of mathematical and numerical models in the case of a specific piece of code in a current atmospheric model. Taking these modifications seriously means that the question of the role and function of scientific models has to be reworked. Numerical models are neither pure intermediaries between theory and data nor autonomous tools of inquiry. Instead, theory and data are transformed into a new symbolic form of research, because computation has become an essential requirement for every scientific practice. Hence the question arises: what do numerical (climate) models really represent?
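As a concrete illustration of the kind of modification the abstract mentions (a generic textbook discretization, not the piece of code studied in the paper), consider the one-dimensional linear advection equation du/dt + c du/dx = 0, a standard building block of atmospheric dynamical cores, made computable by a first-order upwind finite-difference scheme:

    def advect(u, c, dx, dt, steps):
        """Advance the discrete field u by `steps` time steps of the upwind
        scheme u[i] <- u[i] - c*dt/dx * (u[i] - u[i-1]); assumes c > 0 and a
        periodic domain (index -1 wraps around)."""
        u = list(u)
        n = len(u)
        for _ in range(steps):
            u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(n)]
        return u

    # A square pulse transported to the right; the smearing of its edges is
    # numerical diffusion, an artefact of the discretization rather than a
    # feature of the underlying mathematical model.
    u0 = [1.0 if 10 <= i < 20 else 0.0 for i in range(100)]
    u1 = advect(u0, c=1.0, dx=1.0, dt=0.5, steps=40)

Even this minimal example already involves choices (grid spacing, time step, boundary treatment) that belong to the numerical rather than the mathematical model.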

15.
In this paper, I argue for a distinction between two scales of coordination in scientific inquiry, through which I reassess Georg Simon Ohm's work on conductivity and resistance. Firstly, I propose to distinguish between measurement coordination, which refers to the specific problem of how to justify the attribution of values to a quantity by using a certain measurement procedure, and general coordination, which refers to the broader issue of justifying the representation of an empirical regularity by means of abstract mathematical tools. Secondly, I argue that the development of Ohm's measurement practice between the first and the second experimental phase of his work involved a change in the measurement coordination on which he relied to express his empirical results. By showing how Ohm relied on different calibration assumptions and practices across the two phases, I demonstrate that the concurrent change of both Ohm's experimental apparatus and the variable that Ohm measured should be understood in terms of the different forms of measurement coordination at work in each phase. Finally, I argue that Ohm's assumption that tension is equally distributed in the circuit is best understood as part of the general coordination between Ohm's law and the empirical regularity it expresses, rather than as part of measurement coordination.
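For orientation (the notation is a common modern reconstruction, not the paper's own), the empirical regularity Ohm arrived at in his wire experiments is usually rendered as
\[
X = \frac{a}{b + x},
\]
where $X$ is the measured magnetic action of the current on the galvanometer needle, $x$ the length of the interchangeable test conductor, and $a$ and $b$ parameters standing for the exciting force and the resistance of the rest of the circuit; in modern notation this corresponds to $I = V/R$. Which quantity plays the role of $X$, and how instrument readings are calibrated to it, is precisely what the paper's notion of measurement coordination concerns.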

16.
In this paper, I consider Kitcher’s (1993) account of reference for the expressions of past science. Kitcher’s case study is of Joseph Priestley and his expression ‘dephlogisticated air’. There is a strong intuitive case that ‘dephlogisticated air’ referred to oxygen, but the term was underpinned by a deeply mistaken phlogiston theory, so concluding either that it referred straightforwardly or that it failed to refer has unpalatable consequences. Kitcher argues that the reference of such terms is best considered relative to each token—some tokens refer, and others do not. His account thus relies crucially on how this distinction between tokens can be made good—a puzzle I call the discrimination problem. I argue that the discrimination problem cannot be solved. On any reading of Kitcher’s defence of the distinction, the grounds provided are either insufficient or illegitimate. On the first reading, Kitcher violates the principle of humanity by making Priestley’s referential success a matter of the mental contents of modern speakers. The second reading sidesteps the problem of beliefs by appealing to mind-independent facts, but I argue that these are insufficient to secure reference because of the indeterminacy introduced by the qua problem. On the third and final reading, Priestley’s success is given by what he would say in counterfactual circumstances. I argue that even if there are facts about what Priestley would say (and there is reason for doubt), there is no motivation to think that such facts determine how Priestley referred in the actual world.

17.
Contrary to Sankey’s central assumption, incommensurability does not imply incomparability of content, nor does it threaten scientific realism by challenging the rationality of theory comparison. Moreover, Sankey equivocates between reference to specific entities by statements used to test theories and reference to kinds by theories themselves. This distinction helps identify and characterize the genuine threat that incommensurability poses to realism, namely the ontological discontinuity evidenced in the historical record: successive theories reclassify objects into mutually exclusive sets of kinds to which they refer. That is why the claim that scientific progress is an ever closer approximation to the truth is difficult to justify. Similarly, Sankey’s attack on neo-Kantian antirealist positions rests on a misunderstanding of some of those positions’ central terms, which renders most of his attack on them, including his diagnosis of their incoherence, ineffectual. We conclude by reiterating our conviction that meta-incommensurability plays an important role in this debate.

18.
We examine the interrelationships between analog computational modelling and analogue (physical) modelling. To this end, we attempt a regimentation of the informal distinction between analog and digital, which turns on considering computing in a broader context. We argue that in doing so, one comes to see that (scientific) computation is better conceptualised as an epistemic process relative to agents, wherein representations play a key role. We distinguish two conceptually distinct kinds of representation that, we argue, are both involved in each case of computing. Based on the semantic and syntactic properties of each of these representations, we put forward a new account of the distinction between analog and digital computing. We discuss how the developed account is able to explain various properties of different models of computation, and we conceptually compare analog computational modelling to analogue (scale) modelling. It is concluded that, contrary to the standard view, the two practices are orthogonal, differing both in their foundations and in the epistemic functions they fulfil.

19.
A case study is presented of a recent proposal by the major metrology institutes to redefine four of the physical base units, namely the kilogram, the ampere, the mole, and the kelvin. The episode shows a number of features that are unusual for progress in an objective science: for example, the progress is not triggered by experimental discoveries or theoretical innovations, and the new definitions are eventually implemented by means of a voting process. In the philosophical analysis, I first argue that the episode provides considerable evidence for confirmation holism, i.e. the claim that central statements in fundamental science cannot be tested in isolation; second, that the episode satisfies many of the criteria which Kuhn requires for scientific revolutions, even though one would naturally classify it as normal science. These two observations are interrelated, since holism can provide, within normal science, a possible source of future revolutionary periods.
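For context (these figures are the exact values later adopted by the CGPM, cited here for illustration rather than taken from the paper), the redefinition works by fixing the numerical values of four constants and deriving the units from them:
\[
h = 6.626\,070\,15 \times 10^{-34}\ \mathrm{J\,s}\ \text{(kilogram)}, \qquad
e = 1.602\,176\,634 \times 10^{-19}\ \mathrm{C}\ \text{(ampere)},
\]
\[
k = 1.380\,649 \times 10^{-23}\ \mathrm{J\,K^{-1}}\ \text{(kelvin)}, \qquad
N_{\mathrm{A}} = 6.022\,140\,76 \times 10^{23}\ \mathrm{mol^{-1}}\ \text{(mole)}.
\]
That such stipulations are settled by agreement and a vote, rather than by a new measurement or theory, is part of what the paper takes to be philosophically interesting about the episode.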
