Similar Literature
 20 similar documents found; search time: 609 ms
1.
Predictivism is the view that successful predictions of “novel” evidence carry more confirmational weight than accommodations of already known evidence. Novelty, in this context, has traditionally been conceived of as temporal novelty. However, temporal predictivism has been criticized for lacking a rationale: why should the time order of theory and evidence matter? Instead, it has been proposed, novelty should be construed in terms of use-novelty, according to which evidence is novel if it was not used in the construction of a theory. Only if evidence is use-novel can it fully support the theory entailing it. As I point out in this paper, the writings of the most influential proponent of use-novelty contain a weaker and a stronger version of use-novelty. However, both versions, I argue, are problematic. With regard to the appraisal of Mendeleev’s periodic table, the most contentious historical case in the predictivism debate, I argue that temporal predictivism is indeed supported, although in ways not previously appreciated. On the basis of this case, I argue for a form of so-called symptomatic predictivism, according to which temporally novel predictions carry more confirmational weight only insofar as they reveal the theory’s presumed coherence of facts as real.

2.
This is a comment on the paper by Barnes (2005) and the responses from Scerri (2005) and Worrall (2005), debating the thesis (‘predictivism’) that a fact successfully predicted by a theory is stronger evidence than a similar fact known before the prediction was made. Since Barnes and Scerri both use evidence presented in my paper on Mendeleev’s periodic law (Brush, 1996) to support their views, I reiterate my own position on predictivism. I do not argue for or against predictivism in the normative sense that philosophers of science employ; rather, I describe how scientists themselves use facts and predictions to support their theories. I find wide variations, and no support for the assumption that scientists use a single ‘Scientific Method’ in deciding whether to accept a proposed new theory.

3.
According to the comparative Bayesian concept of confirmation, rationalized versions of creationism come out as empirically confirmed. From a scientific viewpoint, however, they are pseudo-explanations because with their help all kinds of experiences are explainable in an ex-post fashion, by way of ad-hoc fitting of an empirically empty theoretical framework to the given evidence. An alternative concept of confirmation that attempts to capture this intuition is the use novelty (UN) criterion of confirmation. Serious objections have been raised against this criterion. In this paper I suggest solutions to these objections. Based on them, I develop an account of genuine confirmation that unifies the UN-criterion with a refined probabilistic confirmation concept that is explicated in terms of the confirmation of evidence-transcending content parts of the hypothesis.

4.
There is considerable disagreement about the epistemic value of novel predictive success, i.e. when a scientist predicts an unexpected phenomenon, experiments are conducted, and the prediction proves to be accurate. We survey the field on this question, noting both fully articulated views, such as weak and strong predictivism, and more nascent views, such as pluralist reasons for the instrumental value of prediction. By examining the various reasons offered for the value of prediction across a range of inferential contexts (including inferences from data to phenomena, from phenomena to theory, and from theory to framework), we can see that neither weak nor strong predictivism captures all of the reasons for valuing prediction available. A third path is presented: Pluralist Instrumental Predictivism (PIP for short).

5.
6.
7.
8.
In my From Instrumentalism to Constructive Realism (2000) I have shown how an instrumentalist account of empirical progress can be related to nomic truth approximation. However, it was assumed that a strong notion of nomic theories was needed for that analysis. In this paper it is shown, in terms of truth and falsity content, that the analysis already applies when, in line with scientific common sense, nomic theories are merely assumed to exclude certain conceptual possibilities as nomic possibilities.

9.
I address questions about values in model-making in engineering, specifically: Might the role of values be attributable solely to interests involved in specifying and using the model? Selected examples illustrate the surprisingly wide variety of things one must take into account in the model-making itself. The notions of system (as used in engineering thermodynamics) and physically similar systems (as used in the physical sciences) are important and powerful in determining what is relevant to an engineering model. Another example (windfarms) illustrates how an idea to completely re-characterize, or reframe, an engineering problem arose during model-making. I employ a qualitative analogue of the notion of physically similar systems. Historical cases can thus be drawn upon; I illustrate with a comparison between a geoengineering proposal to inject, or spray, sulfate aerosols, and two different historical cases involving the spraying of DDT (fire ant eradication; malaria eradication). The current geoengineering proposal is seen to be like the disastrous and counterproductive case, and unlike the successful case, of the spraying of DDT. I conclude by explaining my view that model-making in science is analogous to moral perception in action, drawing on a view in moral theory that has come to be called moral particularism.

10.
This paper revisits the debate between Harry Collins and Allan Franklin concerning the experimenters' regress. Focusing my attention on a case study from recent psychology (regarding experimental evidence for the existence of a Mozart Effect), I argue that Franklin is right to highlight the role of epistemological strategies in scientific practice, but that his account does not sufficiently appreciate Collins's point about the importance of tacit knowledge in experimental practice. In turn, Collins rightly highlights the epistemic uncertainty (and skepticism) surrounding much experimental research. However, I will argue that his analysis of tacit knowledge fails to elucidate the reasons why scientists often are (and should be) skeptical of other researchers' experimental results. I will present an analysis of tacit knowledge in experimental research that not only meets this desideratum, but also shows how such skepticism can in fact be a vital enabling factor for the dynamic processes of experimental knowledge generation.

11.
In this paper, I introduce a new historical case study into the scientific realism debate. During the late-eighteenth century, the Scottish natural philosopher James Hutton made two important successful novel predictions. The first concerned granitic veins intruding from granite masses into strata. The second concerned what geologists now term “angular unconformities”: older sections of strata overlain by younger sections, the two resting at different angles, the former typically more inclined than the latter. These predictions, I argue, are potentially problematic for selective scientific realism in that constituents of Hutton's theory that would not be considered even approximately true today played various roles in generating them. The aim here is not to provide a full philosophical analysis but to introduce the case into the debate by detailing the history and showing why, at least prima facie, it presents a problem for selective realism. First, I explicate Hutton's theory. I then give an account of Hutton's predictions and their confirmations. Next, I explain why these predictions are relevant to the realism debate. Finally, I consider which constituents of Hutton's theory are, according to current beliefs, true (or approximately true), which are not (even approximately) true, and which were responsible for these successes.

12.
The aim of this paper is to put in place some cornerstones in the foundations for an objective theory of confirmation by considering lessons from the failures of predictivism. Discussion begins with a widely accepted challenge, to find out what is needed in addition to the right kind of inferential–semantical relations between hypothesis and evidence to have a complete account of confirmation, one that gives a definitive answer to the question whether hypotheses branded as “post hoc monsters” can be confirmed. The predictivist view is then presented as a way to meet this challenge. Particular attention is paid to Worrall’s version of predictivism, as it appears to be the most sophisticated of the lot. It is argued that, despite its faults, his view turns our heads in the right direction by attempting to remove contingent considerations from confirmational matters. The demand to remove such considerations becomes the first of four cornerstones. Each cornerstone is put in place with the aim to steer clear of the sort of failures that plague various kinds of predictivism. In the process, it becomes obvious that the original challenge is wrongheaded and in need of revision. The paper ends with just such a revision.

13.
14.
In this paper I deal with a neglected topic with respect to unification in Newton’s Principia. I will clarify Newton’s notion (as can be found in Newton’s utterances on unification) and practice of unification (its actual occurrence in his scientific work). In order to do so, I will use the recent theories on unification as tools of analysis (Kitcher, Salmon and Schurz). I will argue, after showing that neither Kitcher’s nor Schurz’s account aptly captures Newton’s notion and practice of unification, that Salmon’s later work is a good starting point for analysing this notion and its practice in the Principia. Finally, I will supplement Salmon’s account in order to answer the question at stake.

15.
Bell's theorem in its standard version demonstrates that the joint assumptions of the hidden-variable hypothesis and the principle of local causation lead to a conflict with quantum-mechanical predictions. In his latest counterfactual strengthening of Bell's theorem, Stapp attempts to prove that the locality assumption itself contradicts the quantum-mechanical predictions in the Hardy case. His method relies on constructing a complex, non-truth functional formula which consists of statements about measurements and outcomes in some region R, and whose truth value depends on the selection of a measurement setting in a space-like separated location L. Stapp argues that this fact shows that the information about the measurement selection made in L has to be present in R. I give detailed reasons why this conclusion can and should be resisted. Next I correct and formalize an informal argument by Shimony and Stein showing that the locality condition coupled with Einstein's criterion of reality is inconsistent with quantum-mechanical predictions. I discuss the possibility of avoiding the inconsistency by rejecting Einstein's criterion rather than the locality assumption.

16.
First, I argue that scientific progress is possible in the absence of increasing verisimilitude in science's theories. Second, I argue that increasing theoretical verisimilitude is not the central, or primary, dimension of scientific progress. Third, I defend my previous argument that unjustified changes in scientific belief may be progressive. Fourth, I illustrate how false beliefs can promote scientific progress in ways that cannot be explicated by appeal to verisimilitude.

17.
I propose a distinct type of robustness, which I suggest can support a confirmatory role in scientific reasoning, contrary to the usual philosophical claims. In model robustness, repeated production of the empirically successful model prediction or retrodiction against a background of independently-supported and varying model constructions, within a group of models containing a shared causal factor, may suggest how confident we can be in the causal factor and predictions/retrodictions, especially once supported by a variety-of-evidence framework. I present climate models of greenhouse gas global warming of the 20th Century as an example, and emphasize climate scientists' discussions of robust models and causal aspects. The account is intended as applicable to a broad array of sciences that use complex modeling techniques.

18.
In this paper, I offer an alternative account of the relationship of Hobbesian geometry to natural philosophy by arguing that mixed mathematics provided Hobbes with a model for thinking about it. In mixed mathematics, one may borrow causal principles from one science and use them in another science without there being a deductive relationship between those two sciences. Natural philosophy for Hobbes is mixed because an explanation may combine observations from experience (the ‘that’) with causal principles from geometry (the ‘why’). My argument shows that Hobbesian natural philosophy relies upon suppositions that bodies plausibly behave according to these borrowed causal principles from geometry, acknowledging that bodies in the world may not actually behave this way. First, I consider Hobbes's relation to Aristotelian mixed mathematics and to Isaac Barrow's broadening of mixed mathematics in Mathematical Lectures (1683). I show that for Hobbes maker's knowledge from geometry provides the ‘why’ in mixed-mathematical explanations. Next, I examine two explanations from De corpore Part IV: (1) the explanation of sense in De corpore 25.1-2; and (2) the explanation of the swelling of parts of the body when they become warm in De corpore 27.3. In both explanations, I show Hobbes borrowing and citing geometrical principles and mixing these principles with appeals to experience.

19.
Epigenesis has become a far more exciting issue in Kant studies recently, especially with the publication of Jennifer Mensch's Kant's Organicism. In my commentary, I propose to clarify my own position on epigenesis relative to that of Mensch and others by once again considering the discourse of epigenesis in the wider eighteenth century. Historically, I maintain that Kant was never fully an epigenesist because he feared its materialist implications. This makes it highly unlikely that he drew heavily, as other interpreters like Dupont and Huneman have suggested, on Caspar Friedrich Wolff for his ultimate theory of “generic preformation.” In order to situate more precisely what Kant made of epigenesis, I distinguish his metaphysical use, as elaborated by Mensch, from his view of it as a theory for life science. In that light, I raise questions about the scope and authority of philosophy vis-à-vis natural science.

20.
Many philosophers contend that Turing’s work provides a conceptual analysis of numerical computability. In (Rescorla, 2007), I dissented. I argued that the problem of deviant notations stymies existing attempts at conceptual analysis. Copeland and Proudfoot respond to my critique. I argue that their putative solution does not succeed. We are still awaiting a genuine conceptual analysis.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号