Similar Literature
20 similar documents found (search time: 31 ms)
1.
In this paper, three theories of progress and the aim of science are discussed: (i) the theory of progress as increasing explanatory power, advocated by Popper in The logic of scientific discovery (1935/1959); (ii) the theory of progress as approximation to the truth, introduced by Popper in Conjectures and refutations (1963); (iii) the theory of progress as a steady increase of competing alternatives, which Feyerabend put forward in the essay “Reply to criticism. Comments on Smart, Sellars and Putnam” (1965) and defended as late as the last edition of Against method (1993). It is argued that, contrary to what Feyerabend scholars have predominantly assumed, Feyerabend's changing attitude towards falsificationism—which he often advocated at the beginning of his career, and vociferously attacked in the 1970s and 1980s—must be explained by taking into account not only Feyerabend's very peculiar view of the aim of science, but also Popper's changing account of progress.

2.
Inferences from scientific success to the approximate truth of successful theories remain central to the most influential arguments for scientific realism. Challenges to such inferences, however, based on radical discontinuities within the history of science, have motivated a distinctive style of revision to the original argument. Conceding the historical claim, selective realists argue that accompanying even the most revolutionary change is the retention of significant parts of replaced theories, and that a realist attitude towards the systematically retained constituents of our scientific theories can still be defended. Selective realists thereby hope to secure the argument from success against apparent historical counterexamples. Independently of that objective, historical considerations have inspired a further argument for selective realism, where evidence for the retention of parts of theories is itself offered as justification for adopting a realist attitude towards them. Given the nature of these arguments from success and from retention, a reasonable expectation is that they would complement and reinforce one another, but although several theses purport to provide such a synthesis the results are often unconvincing. In this paper I reconsider the realist’s favoured type of scientific success, novel success, offer a revised interpretation of the concept, and argue that a significant consequence of reconfiguring the realist’s argument from success accordingly is a greater potential for its unification with the argument from retention.

3.
According to inference to the best explanation (IBE), scientists infer the loveliest of competing hypotheses, ‘loveliness’ being explanatory virtue. This generates two key objections: that loveliness is too subjective to guide inference, and that it is no guide to truth. I defend IBE using Thomas Kuhn’s notion of exemplars: the scientific theories, or applications thereof, that define Kuhnian normal science and facilitate puzzle-solving. I claim that scientists infer the explanatory puzzle-solution that best meets the standard of loveliness set by the relevant exemplar. Exemplars are the subject of consensus, eliminating subjectivity; divorced from Kuhnian relativism, they give loveliness the context-sensitivity required to be truth-tropic. The resulting account, ‘Kuhnian IBE’, is independently plausible and offers a partial rapprochement between IBE and Kuhn’s account of science.

4.
Under what circumstances, if any, are we warranted to assert that a theory is true or at least has some truth content? Scientific realists answer that such assertions are warranted only for those theories or theory-parts that enjoy explanatory and predictive success. A number of challenges to this answer have emerged, chief among them those arising from scientific theory change. For example, if, as scientific realists suggest, successive theories are to increasingly get closer to the truth, any theory changes must not undermine (i) the accumulation of explanatory and predictive success and (ii) the theoretical content responsible for that success. In this paper we employ frame theory to test to what extent certain theoretical claims made by the outdated caloric theory of heat, and that, prima facie at least, were used to produce some of that theory’s success, have survived into the theory that superseded it, i.e. the kinetic theory of heat. Our findings lend credence to structural realism, the view that scientific theories at best reveal only structural features of the unobservable world.

5.
Recent literature in the scientific realism debate has been concerned with a particular species of statistical fallacy concerning base-rates, and the worry that no matter how predictively successful our contemporary scientific theories may be, this will tell us absolutely nothing about the likelihood of their truth if our overall sample space contains enough empirically adequate theories that are nevertheless false. In response, both realists and anti-realists have switched their focus from general arguments concerning the reliability and historical track-records of our scientific methodology, to a series of specific arguments and case-studies concerning our reasons to believe individual scientific theories to be true. Such a development however sits in tension with the usual understanding of the scientific realism debate as offering a second-order assessment of our first-order scientific practices, and threatens to undermine the possibility of a distinctive philosophical debate over the approximate truth of our scientific theories. I illustrate this concern with three recent attempts to offer a more localised understanding of the scientific realism debate—due to Stathis Psillos, Juha Saatsi, and Kyle Stanford—and argue that none of these alternatives offer a satisfactory response to the problem.

6.
This paper argues that spacetime visualisability is not a necessary condition for the intelligibility of theories in physics. Visualisation can be an important tool for rendering a theory intelligible, but it is by no means a sine qua non. The paper examines the historical transition from classical to quantum physics, and analyses the role of visualisability (Anschaulichkeit) and its relation to intelligibility. On the basis of this historical analysis, an alternative conception of the intelligibility of scientific theories is proposed, based on Heisenberg's reinterpretation of the notion of Anschaulichkeit.

7.
Henri Poincaré acquired a reputation in his lifetime for being difficult to read. It was said that he missed out important steps in his arguments, assumed the truth of claims that would be difficult if not impossible to prove, and in short that he lacked rigour. In the years after his death this view coalesced into an exaggerated claim that his work was simply too vague, and has become a cliché. This paper argues that Poincaré was far from indifferent to rigour, and that what characterises his work is an attempt to convey a particular sense of what it is to understand a topic. Throughout his working life Poincaré was concerned to promote the understanding of many domains of mathematics and physics. This is as apparent in his views about geometry, his conventionalism, and his theory of knowledge, as it is in his work on electricity and optics, on number theory, and function theory. This was one of the ways Poincaré discharged his responsibilities as a scientist; it not only accounts for a surprising degree of unity in his work but also gives that work its distinctive character—at once profound and elusive.

8.
According to the foundationalist picture, shared by many rationalists and positivist empiricists, science makes cognitive progress by accumulating justified truths. Fallibilists, who point out that complete certainty cannot be achieved in empirical science, can still argue that even successions of false theories may progress toward the truth. This proposal was supported by Karl Popper with his notion of truthlikeness or verisimilitude. Popper’s own technical definition failed, but the idea that scientific progress means increasing truthlikeness can be expressed by defining degrees of truthlikeness in terms of similarities between states of affairs. This paper defends the verisimilitude approach against Alexander Bird who argues that the “semantic” definition (in terms of truth or truthlikeness alone) is not sufficient to define progress, but the “epistemic” definition referring to justification and knowledge is more adequate. Here Bird ignores the crucial distinction between real progress and estimated progress, explicated by the difference between absolute (and usually unknown) degrees of truthlikeness and their evidence-relative expected values. Further, it is argued that Bird’s idea of returning to the cumulative model of growth requires an implausible trick of transforming past false theories into true ones.
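The idea of defining degrees of truthlikeness via similarities between states of affairs can be illustrated with a toy propositional model. The sketch below uses three weather atoms and an average-similarity measure (in the spirit of the Tichý–Oddie proposal); the atoms, the two rival theories, and the measure itself are illustrative assumptions, not the abstract's own technical definition.

```python
from itertools import product

ATOMS = ["hot", "rainy", "windy"]
TRUTH = (True, True, True)  # the (assumed) actual state of affairs

def similarity(world, truth=TRUTH):
    """Fraction of atomic claims on which `world` agrees with the truth."""
    return sum(a == b for a, b in zip(world, truth)) / len(truth)

def truthlikeness(proposition):
    """Average similarity to the truth of the worlds where `proposition` holds
    (an average-similarity measure; other weightings are possible)."""
    worlds = [w for w in product([True, False], repeat=len(ATOMS))
              if proposition(w)]
    return sum(similarity(w) for w in worlds) / len(worlds)

# Both theories below are false, yet one is closer to the truth:
theory_a = lambda w: w == (True, True, False)    # "hot, rainy, not windy"
theory_b = lambda w: w == (False, False, False)  # "cold, dry, still"

print(truthlikeness(theory_a))  # agrees with the truth on 2 of 3 atoms
print(truthlikeness(theory_b))  # agrees on none
```

On this measure theory_a scores 2/3 while theory_b scores 0, which captures the fallibilist point: even among false theories, some are closer to the truth than others, so a succession of false theories can still constitute progress.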

9.
It is frequently said that belief aims at truth, in an explicitly normative sense—that is, that one ought to believe the proposition that p if, and only if, p is true. This truth norm is frequently invoked to explain why we should seek evidential justification in our beliefs, or why we should try to be rational in our belief formation—it is because we ought to believe the truth that we ought to follow the evidence in belief revision. In this paper, I argue that this view is untenable. The truth norm clashes with plausible evidential norms in a wide range of cases, such as when we have excellent but misleading evidence for a falsehood or no evidence for a truth. I will consider various ways to resolve this conflict and argue that none of them work. However, I will ultimately attempt to vindicate the love of truth, by arguing that knowledge is the proper epistemic goal. The upshot is that we should not aim merely to believe the truth; we should aim to know it.

10.
11.
One way to reconstruct the miracle argument for scientific realism is to regard it as a statistical inference: since it is exceedingly unlikely that a false theory makes successful predictions, while it is rather likely that an approximately true theory is predictively successful, it is reasonable to infer that a predictively successful theory is at least approximately true. This reconstruction has led to the objection that the argument embodies a base rate fallacy: by focusing on successful theories one ignores the vast number of false theories, some of which will be successful by mere chance. In this paper, I shall argue that the cogency of this objection depends on the explanandum of the miracle argument. It is cogent if what is to be explained is the success of a particular theory. If, however, the explanandum of the argument is the distribution of successful predictions among competing theories, the situation is different. Since the distribution of accidentally successful predictions is independent of the base rate, it is possible to assess the base rate by comparing this distribution to the empirically found distribution of successful predictions among competing theories.
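The statistical reconstruction, and the base-rate objection to it, can be made concrete with Bayes' theorem. The likelihoods and base rates below are hypothetical numbers chosen only to exhibit the structure of the objection: even a large likelihood ratio in favour of truth yields a low posterior when true theories are rare in the sample space.

```python
def posterior_truth_given_success(base_rate,
                                  p_success_if_true=0.95,
                                  p_success_if_false=0.05):
    """Bayes' theorem: P(approximately true | predictively successful),
    for hypothetical likelihoods and a given prior base rate of true theories."""
    p_success = (base_rate * p_success_if_true
                 + (1 - base_rate) * p_success_if_false)
    return base_rate * p_success_if_true / p_success

# With a generous base rate the miracle-argument inference looks compelling:
print(posterior_truth_given_success(0.5))    # 0.95
# But if only 1 in 1000 candidate theories is true, success tells us little:
print(posterior_truth_given_success(0.001))  # ~0.019
```

This is exactly the objection the abstract describes: the inference from success to truth is only as strong as the (usually unknown) base rate, which is why the author's proposal to estimate the base rate from the distribution of successful predictions among competing theories matters.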

12.
13.
In the second half of the eighteenth century a lively debate was going on in Germany about the nature of light. One important contribution to this discussion, namely a paper by Nicolas Béguelin, is studied in this article. In his essay, Béguelin compared the Newtonian emission theory of light and the wave theory of Leonhard Euler. Whereas others opted for one of the two theories by invoking arguments or authorities, Béguelin made a systematic search for experiments which he hoped would settle the dispute. Two of these experiments were most original. The first, which Béguelin himself performed, concerned light rays grazing a glass surface. For several reasons it did not have the impact it deserved. The second one was a thought experiment which was meant to illustrate a major tenet of the wave theory, that is, the analogy between light and sound. An analysis is given of these two experiments, and it is shown that neither of them brought the debate to an end.

14.
In 1895 sociologist and philosopher Georg Simmel published a paper, ‘On a connection of selection theory to epistemology’. It was focussed on the question of how behavioural success and the evolution of the cognitive capacities that underlie it are to be related to knowing and truth. Subsequently, Simmel’s ideas were largely lost, but recently (2002) an English translation was published by Coleman in this journal. While Coleman’s contextual remarks are solely concerned with a preceding evolutionary epistemology, it will be argued here that Simmel pursues a more unorthodox, more radically biologically based and pragmatist, approach to epistemology in which the presumption of a wholly interests-independent truth is abandoned, concepts are accepted as species-specific and truth tied intimately to practical success. Moreover, Simmel’s position, shorn of one too-radical commitment, shares its key commitments with the recently developed interactivist–constructivist framework for understanding biological cognition and naturalistic epistemology. There Simmel’s position can be given a natural, integrated, three-fold elaboration in interactivist re-analysis, unified evolutionary epistemology and learnable normativity.

15.
The application of analytic continuation in quantum field theory (QFT) is juxtaposed to T-duality and mirror symmetry in string theory. Analytic continuation—a mathematical transformation that takes the time variable t to negative imaginary time—it—was initially used as a mathematical technique for solving perturbative Feynman diagrams, and was subsequently the basis for the Euclidean approaches within mainstream QFT (e.g., Wilsonian renormalization group methods, lattice gauge theories) and the Euclidean field theory program for rigorously constructing non-perturbative models of interacting QFTs. A crucial difference between theories related by duality transformations and those related by analytic continuation is that the former are judged to be physically equivalent while the latter are regarded as physically inequivalent. There are other similarities between the two cases that make comparing and contrasting them a useful exercise for clarifying the type of argument that is needed to support the conclusion that dual theories are physically equivalent. In particular, T-duality and analytic continuation in QFT share the criterion for predictive equivalence that two theories agree on the complete set of expectation values and the mass spectra, and the criterion for formal equivalence that there is a “translation manual” between the physically significant algebras of observables and sets of states in the two theories. The analytic continuation case study illustrates how predictive and formal equivalence are compatible with physical inequivalence, but not in the manner of standard underdetermination cases. Arguments for the physical equivalence of dual theories must cite considerations beyond predictive and formal equivalence. The analytic continuation case study is an instance of the strategy of developing a physical theory by extending the formal or mathematical equivalence with another physical theory as far as possible. That this strategy has resulted in developments in pure mathematics as well as theoretical physics is another feature that this case study has in common with dualities in string theory.

16.
Contrary to Sankey’s central assumption, incommensurability does not imply incomparability of content, nor threaten scientific realism by challenging the rationality of theory comparison. Moreover, Sankey equivocates between reference to specific entities by statements used to test theories and reference to kinds by theories themselves. This distinction helps identify and characterize the genuine threat that incommensurability poses to realism, which is ontological discontinuity as evidenced in the historical record: Successive theories reclassify objects into mutually exclusive sets of kinds to which they refer. That is why claiming that scientific progress is an increasingly better approximation to truth is difficult to justify. Similarly, Sankey’s attack on neo-Kantian antirealist positions is based on his misunderstanding of some of the central terms of those positions, making most of his attack on them ineffectual, including his diagnosis of their incoherence. We conclude by reiterating our conviction that in this debate meta-incommensurability plays an important role.

17.
This article seeks to take a step towards recognizing that science can deal with the concrete and individual as well as the universal. I shall concentrate on some of Aristotle’s texts, as there is a long tradition going back to Aristotle, according to which science deals only with the universal, although his work also contains texts of a very different tenor. He tries to improve the process of definition as an attempt to bring science closer to the concrete, but ends up realizing that there are some unreachable limits. There is, however, a second Aristotelian approach to the problem in Metaphysica M 10, a passage which takes scientific rapprochement to the individual further by introducing a distinction between science in potential and science in act. The former is universal, but the latter deals with individual substances and processes. Aristotle himself acknowledges here that in one sense science is universal and in another it is not, a position that raises important ontological and epistemological problems. Some suggestions are also offered concerning the kind of truth applicable to science in act, that is, practical truth.

18.
Thermodynamics has a clear arrow of time, characterized by the irreversible approach to equilibrium. This stands in contrast to the laws of microscopic theories, which are invariant under time-reversal. Foundational discussions of this “problem of irreversibility” often focus on historical considerations, and therefore do not take results of modern physical research on this topic into account. In this article, I will close this gap by studying the implications of dynamical density functional theory (DDFT), a central method of modern nonequilibrium statistical mechanics not previously considered in philosophy of physics, for this debate. For this purpose, the philosophical discussion of irreversibility is structured into five problems, concerned with the source of irreversibility in thermodynamics, the definition of equilibrium and entropy, the justification of coarse-graining, the approach to equilibrium, and the arrow of time. For each of these problems, it is shown that DDFT provides novel insights that are of importance for both physicists and philosophers of physics.

19.
A central topic in the logic of science concerns the proper semantic analysis of theoretical sentences, that is, sentences containing theoretical terms. In this paper, we present a novel choice-semantical account of theoretical truth based on the epsilon-term definition of theoretical terms. Specifically, we develop two ways of specifying the truth conditions of theoretical statements in a choice functional semantics, each giving rise to a corresponding logic of such statements. In order to investigate the inferential strength of these logical systems, we provide a translation of each truth definition into a modal definition of theoretical truth. Based on this, we show that the stronger notion of choice-semantical truth captures more adequately our informal semantic understanding of scientific statements.

20.
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, nor can it serve usefully in the development and justification of the new theory.


Copyright © Beijing Qinyun Science and Technology Development Co., Ltd.  京ICP备09084417号