Similar Documents
20 similar documents found (search time: 31 ms)
1.
John Norton's Material Theory of Induction (Norton, 2003, 2005, 2008, forthcoming) has a two-fold, negative and positive, goal. The negative goal is to establish that formal logics of induction fail if they are understood as universally applicable schemas of induction. The positive goal is to establish that it is material facts that enable and justify inductive inferences. I argue in this paper that Norton is more successful with his negative than with his positive ambition. While I do not deny that facts constitute an important type of enabler and justifier of inductions, they are by no means the only type. This paper suggests that there are no fewer than six other types of background information scientists need and use to fuel and warrant inductions. The discussion of additional enablers and justifiers of inductions will further show that there are practically important and intellectually challenging methodological issues Norton's theory prevents us from seeing because it leaves out this or that type of enabler and justifier.

2.
In this paper, we present an explanatory objection to Norton's material theory of induction, as applied to predictive inferences. According to the objection we present, there is an explanatory disconnect between our beliefs about the future and the relevant future facts. We argue that if we recognize such a disconnect, we are no longer rationally entitled to our future beliefs.

3.
C. D. Broad famously labelled the problem of providing our inductive practices with a proper justification "the scandal of philosophy" (Broad, 1952). Recently, John Norton has provided a dissolution of this problem (2014). According to Norton, inductive inference is grounded in particular facts obtaining within particular domains (J. Norton, 2003b, 2010, 2014). Because the material theory does not involve a universal schema of induction, Norton claims it dissolves the problem of induction (which implies that such universal schemas cannot be justified). In this paper, I critically evaluate Norton's dissolution. In particular, I argue that the problem of induction is an epistemological problem, that Norton's material theory entails an externalist epistemology, and that it is a common feature of such epistemologies that they dissolve the problem of induction. The upshot is that the material theory is not unique in its ability to reap the specifically epistemic benefits of dissolving the problem of induction, and thus that the epistemic advantages of the material theory over extant alternatives in this regard are fewer than it may appear at first sight.

4.
John Norton's The Material Theory of Induction bristles with fresh insights and provocative ideas that provide a much needed stimulus to a stodgy if not moribund field. I use quantum mechanics (QM) as a medium for exploring some of these ideas. First, I note that QM offers more predictability than Newtonian mechanics for the Norton dome and other cases where classical determinism falters. But this ability of QM to partially cure the ills of classical determinism depends on facts about the quantum Hamiltonian operator that vary from case to case, providing an illustration of Norton's theme of the importance of contingent facts for inductive reasoning. Second, I agree with Norton that Bayesianism as developed for classical probability theory does not constitute a universal inference machine, and I use QM to explain the sense in which this is so. But at the same time I defend a brand of quantum Bayesianism as providing an illuminating account of physicists' reasoning about quantum events. Third, I argue that if the probabilities induced by quantum states are regarded as objective chances then there are strong reasons to think that fair infinite lotteries are impossible in a quantum world.

5.
John D. Norton is responsible for a number of influential views in contemporary philosophy of science. This paper will discuss two of them. The material theory of induction claims that inductive arguments are ultimately justified by their material features, not their formal features. Thus, while a deductive argument can be valid irrespective of the content of the propositions that make up the argument, an inductive argument about, say, apples, will be justified (or not) depending on facts about apples. The argument view of thought experiments claims that thought experiments are arguments, and that they function epistemically however arguments do. These two views have generated a great deal of discussion, although there hasn't been much written about their combination. I argue that despite some interesting harmonies, there is a serious tension between them. I consider several options for easing this tension, before suggesting a set of changes to the argument view that I take to be consistent with Norton's fundamental philosophical commitments, and which retain what seems intuitively correct about the argument view. These changes require that we move away from a unitary epistemology of thought experiments and towards a more pluralist position.

6.
In this paper, I raise some worries with John D. Norton's application of his material theory of induction to the study of analogical inferences. Skeptical that these worries can be properly addressed, I propose a principle to guide the philosophical research on analogical inferences and argue for its usefulness.

7.
Building on Norton's "material theory of induction," this paper shows through careful historical analysis that analogy can act as a methodological principle or stratagem, providing experimentalists with a useful framework to assess data and devise novel experiments. Although this particular case study focuses on late eighteenth and early nineteenth-century experiments on the properties and composition of acids, the results of this investigation may be extended and applied to other research programs. Occupying a stage in between what Steinle calls "exploratory experimentation" and robust theory, analogy, I argue, encouraged research to substantiate why the likenesses should outweigh the differences (or vice versa) when evaluating results and designing experiments.

8.
In his book, The Material Theory of Induction, Norton argues that the quest for a universal formal theory or 'schema' for analogical inference should be abandoned. In its place, he offers the "material theory of analogy": each analogical inference is "powered" by a local fact of analogy rather than by any formal schema. His minimalist model promises a straightforward, fact-based approach to the evaluation and justification of analogical inferences. This paper argues that although the rejection of universal schemas is justified, Norton's positive theory is limited in scope: it works well only for a restricted class of analogical inferences. Both facts and quasi-formal criteria have roles to play in a theory of analogical reasoning.

9.
This paper follows up a debate as to the consistency of Newtonian cosmology. Whereas Malament [(1995). Is Newtonian cosmology really inconsistent? Philosophy of Science 62, 489–510] has shown that Newtonian cosmology is not inconsistent, to date there has been no analysis of Norton's claim [(1995). The force of Newtonian cosmology: Acceleration is relative. Philosophy of Science 62, 511–522] that Newtonian cosmology was inconsistent prior to certain advances in the 1930s, and in particular prior to Seeliger's seminal paper [(1895). Über das Newton'sche Gravitationsgesetz. Astronomische Nachrichten 137 (3273), 129–136]. In this paper I agree that there are assumptions, Newtonian and cosmological in character, and relevant to the real history of science, which are inconsistent. But there are some important corrections to make to Norton's account. Here I display for the first time the inconsistencies—four in total—in all their detail. Although this extra detail shows there to be several different inconsistencies, it also goes some way towards explaining why they went unnoticed for 200 years.

10.
11.
12.
In the area of social science, in particular, although we have developed methods for reliably discovering the existence of causal relationships, we are not very good at using these to design effective social policy. Cartwright argues that in order to improve our ability to use causal relationships, it is essential to develop a theory of causation that makes explicit the connections between the nature of causation, our best methods for discovering causal relationships, and the uses to which these are put. I argue that Woodward's interventionist theory of causation is uniquely suited to meet Cartwright's challenge. More specifically, interventionist mechanisms can provide the bridge from 'hunting causes' to 'using them', if interventionists (i) tell us more about the nature of these mechanisms, and (ii) endorse the claim that it is these mechanisms—or whatever constitutes them—that make causal claims true. I illustrate how having an understanding of interventionist mechanisms can allow us to put causal knowledge to use via a detailed example from organic chemistry.

13.
In this paper, I consider Kitcher's (1993) account of reference for the expressions of past science. Kitcher's case study is of Joseph Priestley and his expression 'dephlogisticated air'. There is a strong intuitive case that 'dephlogisticated air' referred to oxygen, but it was underpinned by very mistaken phlogiston theory, so concluding either that dephlogisticated air referred straightforwardly or that it failed to refer has unpalatable consequences either way. Kitcher argues that the reference of such terms is best considered relative to each token—some tokens refer, and others do not. His account thus relies crucially on how this distinction between tokens can be made good—a puzzle I call the discrimination problem. I argue that the discrimination problem cannot be solved. On any reading of Kitcher's defence of the distinction, the grounds provided are either insufficient or illegitimate. On the first reading, Kitcher violates the principle of humanity by making Priestley's referential success a matter of the mental contents of modern speakers. The second reading sidesteps the problem of beliefs by appealing to mind-independent facts, but I argue that these are insufficient to achieve reference because of the indeterminacy introduced by the qua problem. On the third and final reading, Priestley's success is given by what he would say in counterfactual circumstances. I argue that even if there are facts about what Priestley would say, and there is reason for doubt, there is no motivation to think that such facts determine how Priestley referred in the actual world.

14.
Predictivism is the view that successful predictions of "novel" evidence carry more confirmational weight than accommodations of already known evidence. Novelty, in this context, has traditionally been conceived of as temporal novelty. However temporal predictivism has been criticized for lacking a rationale: why should the time order of theory and evidence matter? Instead, it has been proposed, novelty should be construed in terms of use-novelty, according to which evidence is novel if it was not used in the construction of a theory. Only if evidence is use-novel can it fully support the theory entailing it. As I point out in this paper, the writings of the most influential proponent of use-novelty contain a weaker and a stronger version of use-novelty. However both versions, I argue, are problematic. With regard to the appraisal of Mendeleev's periodic table, the most contentious historical case in the predictivism debate, I argue that temporal predictivism is indeed supported, although in ways not previously appreciated. On the basis of this case, I argue for a form of so-called symptomatic predictivism according to which temporally novel predictions carry more confirmational weight only insofar as they reveal the theory's presumed coherence of facts as real.

15.
Philosophers and historians of science have for some time now debated whether the results of current science are 'contingent' or 'inevitable'. Scholars have noted that inevitabilism often enjoys the status of a presumptive default position. Consequently, contingentists are, from the outset, lumbered with the burden of proof. This is evident in the case of the inevitabilist demand that the contingentist "put up or shut up" (PUSU). This paper adds to the existing case which says that inevitabilism's default-status is unjustified. However, whilst some have suggested that contingentism should replace inevitabilism as the default position, I argue that the contingency/inevitability (C/I) conversation should proceed sans default. This move is motivated largely by my claim that the C/I issue is best conceived as a 'local', rather than a global or universal one. The main problem with taking inevitabilism or contingentism as the default is the globalist nature of such a tack. Whilst localism is arguably an emergent reality of the growing C/I literature, its implications have not been fully realised. I suggest that fully and explicitly embracing localism, including the closely related move of doing away with defaults, represents the most promising way forward for the C/I conversation. In addition, I will show how these moves entail that we stop worrying about the inevitabilist PUSU demand, or more bluntly, that we shut up about putting-up.

16.
This article is about the role of abstraction in mechanistic explanations. Abstraction is widely recognised as a necessary concession to the practicalities of scientific work, but some mechanist philosophers argue that it is also a positive explanatory feature in its own right. I claim that in as much as these arguments are based on the idea that mechanistic explanation exhibits a trade-off between fine-grained detail and generality, they are unsuccessful. Detail and generality both appear to be important sources of explanatory power, but investigators do not need to make a choice between these desiderata, at least when an explanation incorporates further detail through the decomposition of the mechanism's parts.

17.
Most scientific realists today in one way or another confine the object of their commitment to certain components of a successful theory and thereby seek to make realism compatible with the history of theory change. Kyle Stanford calls this move by realists the strategy of selective confirmation and raises a challenge against its contemporary, reliable applicability. In this paper, I critically examine Stanford's inductive argument that is based on past scientists' failures to identify the confirmed components of their contemporary theories. I argue that our ability to make such identification should be evaluated based on the performance of the scientific community as a whole rather than that of individual scientists and that Stanford's challenge fails to raise a serious concern because it focuses solely on individual scientists' judgments, which are either made before the scientific community has reached a consensus or about the value of the posit as a locus for further research rather than its confirmed status.

18.
In the Bayesian approach to quantum mechanics, probabilities—and thus quantum states—represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. Particularly, we show how the probability-1 predictions derived from pure quantum states highlight a fundamental difference between our Bayesian approach, on the one hand, and Copenhagen and similar interpretations on the other. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then argue that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that the probability-1 predictions derived from that state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply agent-independent certainty for a measurement outcome, as in many Copenhagen-like interpretations, that outcome would effectively correspond to a preexisting system property. The idea that measurement outcomes occurring with certainty correspond to preexisting system properties is, however, in conflict with locality. We emphasize this by giving a version of an argument of Stairs [(1983). Quantum logic, realism, and value-definiteness. Philosophy of Science, 50, 578], which applies the Kochen–Specker theorem to an entangled bipartite system.

19.
If we cannot directly empirically test the claims of a particular scientific theory, then it would be nice to have some other criteria with which to assess its viability. In his 2013 book, String Theory and the Scientific Method, Richard Dawid aims to develop such criteria, with an eye to vindicating research programmes in disciplines where direct empirical data is scant or non-existent. In an accompanying paper, Dawid, Hartmann and Sprenger formalise Dawid's so-called 'No Alternatives Argument' (NAA) using a generalised Bayesian framework, as a first step towards formalising Dawid's entire research programme (which itself relies on two further arguments). In this paper, I argue that the formalisation of the NAA cannot play the central role in Dawid's programme as intended. This is based on the observation that not all confirmation is non-negligible confirmation. For Dawid's programme to be useful, it must demonstrate the viability not just of non-empirical theory confirmation, but of non-negligible non-empirical theory confirmation. I argue that Dawid et al.'s appeal to Bayesian confirmation theory to formalise his NAA cannot guarantee non-negligible confirmation. As a result, I conclude that if Dawid's overall project is to succeed, it must do so without the NAA formalised in this way.

20.
I began this study with Laudan's argument from the pessimistic induction and I promised to show that the caloric theory of heat cannot be used to support the premisses of the meta-induction on past scientific theories. I tried to show that the laws of experimental calorimetry, adiabatic change and Carnot's theory of the motive power of heat were (i) independent of the assumption that heat is a material substance, (ii) approximately true, (iii) deducible and accounted for within thermodynamics. I stressed that results (i) and (ii) were known to most theorists of the caloric theory and that result (iii) was put forward by the founders of the new thermodynamics. In other words, the truth-content of the caloric theory was located, selected carefully, and preserved by the founders of thermodynamics. However, the reader might think that even if I have succeeded in showing that Laudan is wrong about the caloric theory, I have not shown how the strategy followed in this paper can be generalised against the pessimistic meta-induction. I think that the general strategy against Laudan's argument suggested in this paper is this: the empirical success of a mature scientific theory suggests that there are respects and degrees in which this theory is true. The difficulty for, and real challenge to, philosophers of science is to suggest ways in which this truth-content can be located and shown to be preserved, if at all, in subsequent theories. In particular, the empirical success of a theory does not, automatically, suggest that all theoretical terms of the theory refer. On the contrary, judgments of referential success depend on which theoretical claims are well-supported by the evidence. This is a matter of specific investigation. Generally, one would expect that claims about theoretical entities which are not strongly supported by the evidence, or turn out to be independent of the evidence at hand, are not compelling.
For, simply, if the evidence does not make it likely that our beliefs about putative theoretical entities are approximately correct, a belief in those entities would be ill-founded and unjustified. Theoretical extrapolations in science are indispensable, but they are not arbitrary. If the evidence does not warrant them, I do not see why someone should commit herself to them. In a sense, the problem with empiricist philosophers is not that they demand that theoretical beliefs must be warranted by evidence. Rather, it is that they claim that no evidence can warrant theoretical beliefs. A realist philosopher of science would not disagree on the first, but she has good grounds to deny the second. I argued that claims about theoretical entities which are not strongly supported by the evidence must not be taken as belief-worthy. But can one sustain the more ambitious view that loosely supported parts of a theory tend to be just those that include non-referring terms? There is an obvious excess risk in such a generalisation. For there are well-known cases in which a theoretical claim was initially weakly supported by the evidence


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号