Similar Articles
20 similar articles found.
1.
The basic notion of an objective probability is that of a probability determined by the physical structure of the world. On this understanding, there are subjective credences that do not correspond to objective probabilities, such as credences concerning rival physical theories. The main question for objective probabilities is how they are determined by the physical structure. In this paper, I survey three ways of understanding objective probability: stochastic dynamics, Humean chances, and deterministic chances (typicality). The first is the obvious way to understand the probabilities of quantum mechanics via a collapse theory such as GRW; the last is the way to understand the probabilities in the context of a deterministic theory such as Bohmian mechanics. Humean chances provide a more abstract and general account of chance locutions that is independent of dynamical considerations.

2.
In this paper, I consider Kitcher’s (1993) account of reference for the expressions of past science. Kitcher’s case study is of Joseph Priestley and his expression ‘dephlogisticated air’. There is a strong intuitive case that ‘dephlogisticated air’ referred to oxygen, but it was underpinned by a deeply mistaken phlogiston theory, so both the conclusion that ‘dephlogisticated air’ referred straightforwardly and the conclusion that it failed to refer have unpalatable consequences. Kitcher argues that the reference of such terms is best considered relative to each token—some tokens refer, and others do not. His account thus relies crucially on how this distinction between tokens can be made good—a puzzle I call the discrimination problem. I argue that the discrimination problem cannot be solved. On any reading of Kitcher’s defence of the distinction, the grounds provided are either insufficient or illegitimate. On the first reading, Kitcher violates the principle of humanity by making Priestley’s referential success a matter of the mental contents of modern speakers. The second reading sidesteps the problem of beliefs by appealing to mind-independent facts, but I argue that these are insufficient to achieve reference because of the indeterminacy introduced by the qua problem. On the third and final reading, Priestley’s success is given by what he would say in counterfactual circumstances. I argue that even if there are facts about what Priestley would say, and there is reason for doubt, there is no motivation to think that such facts determine how Priestley referred in the actual world.

3.
How are we to understand the use of probability in corroboration functions? Popper says logically, but does not show how we could have access to, or even calculate, probability values in a logical sense. This makes the logical interpretation untenable, as Ramsey and van Fraassen have argued. If corroboration functions only make sense when the probabilities employed therein are subjective, however, then what counts as impressive evidence for a theory might be a matter of convention, or even whim. So isn’t so-called ‘corroboration’ just a matter of psychology? In this paper, I argue that we can go some way towards addressing this objection by adopting an intersubjective interpretation of corroboration, of the form advocated by Gillies. I show why intersubjective probabilities are preferable to subjective ones when it comes to decision making in science: why group decisions are liable to be superior to individual ones, given a number of plausible conditions. I then argue that intersubjective corroboration is preferable to intersubjective confirmation of a Bayesian variety, because there is greater opportunity for principled agreement concerning the factors involved in the former.

4.
This is a comment on the paper by Barnes (2005) and the responses from Scerri (2005) and Worrall (2005), debating the thesis (‘predictivism’) that a fact successfully predicted by a theory is stronger evidence for it than a similar fact known before the prediction was made. Since Barnes and Scerri both use evidence presented in my paper on Mendeleev’s periodic law (Brush, 1996) to support their views, I reiterate my own position on predictivism. I do not argue for or against predictivism in the normative sense that philosophers of science employ; rather, I describe how scientists themselves use facts and predictions to support their theories. I find wide variations, and no support for the assumption that scientists use a single ‘Scientific Method’ in deciding whether to accept a proposed new theory.

5.
In a preceding paper, I studied the significance of Jarrett's and Shimony's analyses of ‘factorisability’ into ‘parameter independence’ and ‘outcome independence’ for clarifying the nature of non-locality in quantum phenomena. I focused on four types of non-locality: superluminal signalling, action-at-a-distance, non-separability, and holism. In this paper, I consider a fifth type of non-locality: superluminal causation according to ‘logically weak’ concepts of causation, where causal dependence requires neither action nor signalling. I conclude by considering the compatibility of non-factorisable theories with relativity theory. In this connection, I pay special attention to the difficulties that superluminal causation raises in relativistic spacetime. My main findings in this paper are: first, parameter-dependent and outcome-dependent theories both involve superluminal causal connections between outcomes and between settings and outcomes. Second, while relativistic deterministic parameter-dependent theories seem impossible on pain of causal paradoxes, relativistic indeterministic parameter-dependent theories are not subject to the same challenge. Third, current relativistic non-factorisable theories seem to have some rather unattractive characteristics.

6.
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.

7.
Philip Kitcher's The Advancement of Science sets out, programmatically, a new naturalistic view of science as a process of building consensus practices. Detailed historical case studies—centrally, the Darwinian revolution—are intended to support this view. I argue that Kitcher's expositions in fact support a more conservative view, which I dub ‘Legend Naturalism’. Using four historical examples which increasingly challenge Kitcher's discussions, I show that neither Legend Naturalism, nor the less conservative programmatic view, gives an adequate account of scientific progress. I argue for a naturalism that is more informed by psychology and a normative account that is both more social and less realist than the views articulated in The Advancement of Science.

8.
9.
Can stable regularities be explained without appealing to governing laws or any other modal notion? In this paper, I consider what I will call a ‘Humean system’—a generic dynamical system without guiding laws—and assess whether it could display stable regularities. First, I present what can be interpreted as an account of the rise of stable regularities, following from Strevens (2003), which has been applied to explain the patterns of complex systems (such as those from meteorology and statistical mechanics). Second, since this account presupposes that the underlying dynamics displays deterministic chaos, I assess whether it can be adapted to cases where the underlying dynamics is not chaotic but truly random—that is, cases where there is no dynamics guiding the time evolution of the system. If this is so, the resulting stable, apparently non-accidental regularities are the fruit of what can be called statistical necessity rather than of a primitive physical necessity.

10.
John Norton's The Material Theory of Induction bristles with fresh insights and provocative ideas that provide a much needed stimulus to a stodgy if not moribund field. I use quantum mechanics (QM) as a medium for exploring some of these ideas. First, I note that QM offers more predictability than Newtonian mechanics for the Norton dome and other cases where classical determinism falters. But this ability of QM to partially cure the ills of classical determinism depends on facts about the quantum Hamiltonian operator that vary from case to case, providing an illustration of Norton's theme of the importance of contingent facts for inductive reasoning. Second, I agree with Norton that Bayesianism as developed for classical probability theory does not constitute a universal inference machine, and I use QM to explain the sense in which this is so. But at the same time I defend a brand of quantum Bayesianism as providing an illuminating account of how physicists reason about quantum events. Third, I argue that if the probabilities induced by quantum states are regarded as objective chances then there are strong reasons to think that fair infinite lotteries are impossible in a quantum world.

11.
Work throughout the history and philosophy of biology frequently employs ‘chance’, ‘unpredictability’, ‘probability’, and many similar terms. One common way of understanding how these concepts were introduced in evolution focuses on two central issues: the first use of statistical methods in evolution (Galton), and the first use of the concept of “objective chance” in evolution (Wright). I argue that while this approach has merit, it fails to fully capture interesting philosophical reflections on the role of chance expounded by two of Galton's students, Karl Pearson and W.F.R. Weldon. Considering a question more familiar from contemporary philosophy of biology—the relationship between our statistical theories of evolution and the processes in the world those theories describe—is, I claim, a more fruitful way to approach both these two historical actors and the broader development of chance in evolution.

12.
In The Advancement of Science (1993) Philip Kitcher develops what he calls the ‘Compromise Model’ of the closure of scientific debates. The model is designed to acknowledge significant elements from ‘Rationalist’ and ‘Antirationalist’ accounts of science, without succumbing to the one-sidedness of either. As part of an ambitious naturalistic account of scientific progress, Kitcher's model succeeds to the extent that transitions in the history of science satisfy its several conditions. I critically evaluate the Compromise Model by identifying its crucial assumptions and by attempting to apply the model to a major transition in the history of biology: the rejection of ‘naive group selectionism’ in the 1960s. I argue that the weaknesses and limitations of Kitcher's model exemplify general problems facing philosophical models of scientific change, and that recognition of these problems supports a more modest vision of the relationship between historical and philosophical accounts of science.

13.
This article critically appraises David Bloor’s recent attempts to refute criticisms levelled at the Strong Programme’s social constructionist approach to scientific knowledge. Bloor has tried to argue, contrary to some critics, that the Strong Programme is not idealist in character, and that it does not involve a challenge to the credibility of scientific knowledge. I argue that Bloor’s attempt to deflect the charge of idealism, which calls on the self-referential theory of social institutions, is partially successful. However, I suggest that although the Strong Programme should not be accused of ‘strong idealism’, it is still vulnerable to the criticism that it entails a form of ‘weak idealism’. The article moves on to argue that, contrary to Bloor, constructionist approaches do challenge the credibility of the scientific knowledge that they analyse. I conclude the article by arguing that sociological analyses of scientific knowledge can be conducted without the weak idealism and the credibility-challenging assumptions of the Strong Programme approach.

14.
In the 1930s, Carnap set out to incorporate psychology into the unity of science by showing that all cognitively meaningful sentences of psychology can be translated into the language of physics. First, I will argue that Carnap, relying on his notion of protocol languages, defends a physicalistic philosophy of psychology that shows due appreciation of ‘introspection’ as a strictly subjective, but reliable, way to verify sentences about one’s own mind. Second, I will point out that Carnap’s philosophy of psychology not only takes into account overt behaviour, but must comprise neurophysiological processes as well. Finally, I will show that Carnap aims to develop a philosophy of psychology that does justice to the ongoing changeability of scientific knowledge.

15.
Kuhn and Feyerabend have little to say about the thought of Michael Polanyi, and the secondary literature on Polanyi's relation to them is meagre. I argue that Polanyi's view, in Personal Knowledge and in other writings, of conceptual frameworks ‘segregated’ by a ‘logical gap’ as giving rise to controversies in science foreshadowed Kuhn and Feyerabend's theme of incommensurability. The similarity between the thinkers is, I suggest, no coincidence.

16.
In this paper, I explore Rosen’s (1994) ‘transcendental’ objection to constructive empiricism—the argument that in order to be a constructive empiricist, one must be ontologically committed to just the sort of abstract, mathematical objects constructive empiricism seems committed to denying. In particular, I assess Bueno’s (1999) ‘partial structures’ response to Rosen, and argue that such a strategy cannot succeed, on the grounds that it cannot provide an adequate metalogic for our scientific discourse. I conclude by arguing that this result has some interesting consequences in general for anti-realist programmes in the philosophy of science.

17.
I present an account of classical genetics to challenge theory-biased approaches in the philosophy of science. Philosophers typically assume that scientific knowledge is ultimately structured by explanatory reasoning and that research programs in well-established sciences are organized around efforts to fill out a central theory and extend its explanatory range. In the case of classical genetics, philosophers assume that the knowledge was structured by T. H. Morgan’s theory of transmission and that research throughout the later 1920s, 30s, and 40s was organized around efforts to further validate, develop, and extend this theory. I show that classical genetics was structured by an integration of explanatory reasoning (associated with the transmission theory) and investigative strategies (such as the ‘genetic approach’). The investigative strategies, which have been overlooked in historical and philosophical accounts, were as important as the so-called laws of Mendelian genetics. By the later 1920s, geneticists of the Morgan school were no longer organizing research around the goal of explaining inheritance patterns; rather, they were using genetics to investigate a range of biological phenomena that extended well beyond the explanatory domain of transmission theories. Theory-biased approaches in history and philosophy of science fail to reveal the overall structure of scientific knowledge and obscure the way it functions.

18.
Among the alternatives to non-relativistic quantum mechanics (NRQM) there are those that give different predictions from quantum mechanics in yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one's system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism responsible for the ‘apparent’ collapse in open quantum systems. But while recent advances in condensed-matter physics may lead in the near future to experimental setups that will allow one to test the two hypotheses, namely genuine collapse vs. decoherence, and hence make progress toward a solution to the quantum measurement problem, those philosophers and physicists who advocate an information-theoretic approach to the foundations of quantum mechanics are still unwilling to acknowledge the empirical character of the issue at stake. Here I argue that in doing so they are displaying an unwarranted double standard.

19.
This is a discussion of how we can understand the world-view given to us by the Everett interpretation of quantum mechanics, and in particular the rôle played by the concept of ‘world’. The view presented is that we are entitled to use ‘many-worlds’ terminology even if the theory does not specify the worlds in the formalism; this is defended by means of an extensive analogy with the concept of an ‘instant’ or moment of time in relativity, with the lack of a preferred foliation of spacetime being compared with the lack of a preferred basis in quantum theory. Implications for identity of worlds over time, and for relativistic quantum mechanics, are discussed.

20.