Similar Literature
 20 similar documents found.
1.
This paper examines James Conant’s pragmatic theory of science—a theory that has been neglected by most commentators on the history of 20th-century philosophy of science—and argues that this theory occupied an important place in Conant’s strategic thinking about the Cold War. Conant drew upon his wartime science-policy work, the history of science, and Quine’s epistemological holism to argue that there is no strict distinction between science and technology, that there is no such thing as “the scientific method,” and that theories are better interpreted as policies than as creeds. An important consequence he drew from these arguments is that science is a thoroughly value-laden and intrinsically social enterprise. These results led him to develop novel proposals for reorganizing scientific and technological research—proposals that he believed could help to win the Cold War. Interestingly, the Cold War had a different impact upon Conant’s thinking than it did upon many other theorists of science in postwar America. Instead of leading him to “the icy slopes of logic,” it led him to develop a socially and politically engaged theory that was explicitly in the service of the American Cold War effort.

2.
In this paper, three theories of progress and the aim of science are discussed: (i) the theory of progress as increasing explanatory power, advocated by Popper in The logic of scientific discovery (1935/1959); (ii) the theory of progress as approximation to the truth, introduced by Popper in Conjectures and refutations (1963); (iii) the theory of progress as a steady increase of competing alternatives, which Feyerabend put forward in the essay “Reply to criticism. Comments on Smart, Sellars and Putnam” (1965) and defended as late as the last edition of Against method (1993). It is argued that, contrary to what Feyerabend scholars have predominantly assumed, Feyerabend's changing attitude towards falsificationism—which he often advocated at the beginning of his career, and vociferously attacked in the 1970s and 1980s—must be explained by taking into account not only Feyerabend's very peculiar view of the aim of science, but also Popper's changing account of progress.

3.
The application of analytic continuation in quantum field theory (QFT) is juxtaposed to T-duality and mirror symmetry in string theory. Analytic continuation—a mathematical transformation that takes the time variable t to negative imaginary time (−it)—was initially used as a mathematical technique for solving perturbative Feynman diagrams, and was subsequently the basis for the Euclidean approaches within mainstream QFT (e.g., Wilsonian renormalization group methods, lattice gauge theories) and the Euclidean field theory program for rigorously constructing non-perturbative models of interacting QFTs. A crucial difference between theories related by duality transformations and those related by analytic continuation is that the former are judged to be physically equivalent while the latter are regarded as physically inequivalent. There are other similarities between the two cases that make comparing and contrasting them a useful exercise for clarifying the type of argument that is needed to support the conclusion that dual theories are physically equivalent. In particular, T-duality and analytic continuation in QFT share the criterion for predictive equivalence that two theories agree on the complete set of expectation values and the mass spectra, and the criterion for formal equivalence that there is a “translation manual” between the physically significant algebras of observables and sets of states in the two theories. The analytic continuation case study illustrates how predictive and formal equivalence are compatible with physical inequivalence, but not in the manner of standard underdetermination cases. Arguments for the physical equivalence of dual theories must cite considerations beyond predictive and formal equivalence. The analytic continuation case study is an instance of the strategy of developing a physical theory by extending the formal or mathematical equivalence with another physical theory as far as possible. That this strategy has resulted in developments in pure mathematics as well as theoretical physics is another feature that this case study has in common with dualities in string theory.
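As a reminder of the transformation at issue (a standard textbook illustration, not drawn from the paper itself), the Wick rotation substitutes t = −iτ, turning the oscillatory Minkowski path-integral weight into a damped Euclidean one:

\[
  t \;\to\; -i\tau
  \qquad\Longrightarrow\qquad
  e^{\,iS_{M}[\phi]} \;\to\; e^{-S_{E}[\phi]},
\]

so that, for instance, the Feynman propagator \(1/(k^{2}-m^{2}+i\epsilon)\) becomes the Euclidean \(1/(k_{E}^{2}+m^{2})\), whose correlation functions are ordinary statistical averages.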

4.
According to inference to the best explanation (IBE), scientists infer the loveliest of competing hypotheses, ‘loveliness’ being explanatory virtue. This generates two key objections: that loveliness is too subjective to guide inference, and that it is no guide to truth. I defend IBE using Thomas Kuhn’s notion of exemplars: the scientific theories, or applications thereof, that define Kuhnian normal science and facilitate puzzle-solving. I claim that scientists infer the explanatory puzzle-solution that best meets the standard set by the relevant exemplar of loveliness. Exemplars are the subject of consensus, eliminating subjectivity; divorced from Kuhnian relativism, they give loveliness the context-sensitivity required to be truth-tropic. The resulting account, ‘Kuhnian IBE’, is independently plausible and offers a partial rapprochement between IBE and Kuhn’s account of science.

5.
In social science in particular, although we have developed methods for reliably discovering the existence of causal relationships, we are not very good at using these relationships to design effective social policy. Cartwright argues that in order to improve our ability to use causal relationships, it is essential to develop a theory of causation that makes explicit the connections between the nature of causation, our best methods for discovering causal relationships, and the uses to which these are put. I argue that Woodward's interventionist theory of causation is uniquely suited to meet Cartwright's challenge. More specifically, interventionist mechanisms can provide the bridge from ‘hunting causes’ to ‘using them’, if interventionists (i) tell us more about the nature of these mechanisms, and (ii) endorse the claim that it is these mechanisms—or whatever constitutes them—that make causal claims true. I illustrate how an understanding of interventionist mechanisms allows us to put causal knowledge to use via a detailed example from organic chemistry.

6.
This paper examines competing interpretations of Pierre Duhem’s theory of good sense recently defended by David Stump and Milena Ivanova, and defends a hybrid reading that accommodates the intuitions of both. At issue between Stump and Ivanova is whether Duhemian good sense is a virtue-theoretic concept. I approach the issue from the broader perspective of determining the epistemic value of good sense per se, and argue for a mitigated virtue-theoretic reading that identifies an essential role for good sense in theory choice. I also show that many important issues in both philosophy of science and ‘mainstream’ value-driven epistemology are illuminated by the debate over the epistemic value of good sense. In particular, philosophical work on the nature of cognitive character, rule-governed rationality and the prospects of epistemic value T-monism is illuminated by virtue-theoretic readings of Duhemian good sense.

7.
One way to reconstruct the miracle argument for scientific realism is to regard it as a statistical inference: since it is exceedingly unlikely that a false theory makes successful predictions, while it is rather likely that an approximately true theory is predictively successful, it is reasonable to infer that a predictively successful theory is at least approximately true. This reconstruction has led to the objection that the argument embodies a base rate fallacy: by focusing on successful theories one ignores the vast number of false theories, some of which will be successful by mere chance.

In this paper, I shall argue that the cogency of this objection depends on the explanandum of the miracle argument. It is cogent if what is to be explained is the success of a particular theory. If, however, the explanandum of the argument is the distribution of successful predictions among competing theories, the situation is different. Since the distribution of accidentally successful predictions is independent of the base rate, it is possible to assess the base rate by comparing this distribution to the empirically found distribution of successful predictions among competing theories.
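To see the worry in numbers (an illustrative Bayesian calculation of my own, not taken from the paper): even if true theories are very likely to succeed and false ones rarely succeed by chance, a low base rate of true theories keeps the posterior modest:

\[
P(\mathrm{true}\mid \mathrm{success})
 = \frac{P(\mathrm{success}\mid \mathrm{true})\,P(\mathrm{true})}
        {P(\mathrm{success}\mid \mathrm{true})\,P(\mathrm{true})
         + P(\mathrm{success}\mid \mathrm{false})\,P(\mathrm{false})}.
\]

With, say, \(P(\mathrm{success}\mid\mathrm{true})=0.9\), \(P(\mathrm{success}\mid\mathrm{false})=0.05\) and a base rate \(P(\mathrm{true})=0.01\), the posterior is only about 0.15.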

8.
Summary Anthranilic acid-(¹⁴COOH) was administered to rooted leaves of Adhatoda vasica Nees. The isolated alkaloid peganine (= vasicine) was degraded according to the method of Späth and Hikawitz (see the Figure). The anthranilic acid obtained showed the same specific radioactivity as the alkaloid. Therefore anthranilic acid must be regarded as a direct precursor of peganine in A. vasica Nees.

9.
Duhem’s concept of ‘good sense’ is central to his philosophy of science, given that it is what allows scientists to decide between competing theories. Scientists must use good sense and have intellectual and moral virtues in order to be neutral arbiters of scientific theories, especially when choosing between empirically adequate theories. I discuss the parallels in Duhem’s views to those of virtue epistemologists, who understand justified belief as that arrived at by a cognitive agent with intellectual and moral virtues, showing how consideration of Duhem as a virtue epistemologist offers insights into his views, as well as providing possible answers to some puzzles about virtue epistemology. The extent to which Duhem holds that the intellectual and moral virtues of the scientist determine scientific knowledge has not been generally noticed.

10.
We discuss some aspects of the relation between dualities and gauge symmetries. Both of these ideas are of course multi-faceted, and we confine ourselves to making two points. Both points are about dualities in string theory, and both have the ‘flavour’ that two dual theories are ‘closer in content’ than you might think. For both points, we adopt a simple conception of a duality as an ‘isomorphism’ between theories: more precisely, as appropriate bijections between the two theories’ sets of states and sets of quantities.

The first point (Section 3) is that this conception of duality meshes with two dual theories being ‘gauge related’ in the general philosophical sense of being physically equivalent. For a string duality, such as T-duality and gauge/gravity duality, this means taking such features as the radius of a compact dimension, and the dimensionality of spacetime, to be ‘gauge’.

The second point (Sections 4-6) is much more specific. We give a result about gauge/gravity duality that shows its relation to gauge symmetries (in the physical sense of symmetry transformations that are spacetime-dependent) to be subtler than you might expect. For gauge theories, you might expect that the duality bijections relate only gauge-invariant quantities and states, in the sense that gauge symmetries in one theory will be unrelated to any symmetries in the other theory. This may be so in general; and indeed, it is suggested by discussions of Polchinski and Horowitz. But we show that in gauge/gravity duality, each of a certain class of gauge symmetries in the gravity/bulk theory, viz. diffeomorphisms, is related by the duality to a position-dependent symmetry of the gauge/boundary theory.
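Schematically (in my notation, a sketch of the conception the authors describe rather than their own formalism), a duality in this sense is a pair of bijections between states and quantities that preserves the values of quantities on states:

\[
 d_{\mathcal{S}} : \mathcal{S}_1 \to \mathcal{S}_2, \qquad
 d_{\mathcal{Q}} : \mathcal{Q}_1 \to \mathcal{Q}_2, \qquad
 \langle Q, s\rangle_1 = \langle\, d_{\mathcal{Q}}(Q),\, d_{\mathcal{S}}(s)\,\rangle_2
 \quad \text{for all } Q \in \mathcal{Q}_1,\ s \in \mathcal{S}_1 .
\]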

11.
What if gravity satisfied the Klein–Gordon equation? Both particle physics from the 1920–30s and the 1890s Neumann–Seeliger modification of Newtonian gravity with exponential decay suggest considering a “graviton mass term” for gravity, which is algebraic in the potential. Unlike Nordström’s “massless” theory, massive scalar gravity is strictly special relativistic in the sense of being invariant under the Poincaré group but not the 15-parameter Bateman–Cunningham conformal group. It therefore exhibits the whole of Minkowski space–time structure, albeit only indirectly concerning volumes. Massive scalar gravity is plausible in terms of relativistic field theory, while violating most interesting versions of Einstein’s principles of general covariance, general relativity, equivalence, and Mach. Geometry is a poor guide to understanding massive scalar gravity(s): matter sees a conformally flat metric due to universal coupling, but gravity also sees the rest of the flat metric (barely or on long distances) in the mass term. What is the ‘true’ geometry, one might wonder, in line with Poincaré’s modal conventionality argument? Infinitely many theories exhibit this bimetric ‘geometry,’ all with the total stress–energy’s trace as source; thus geometry does not explain the field equations. The irrelevance of the Ehlers–Pirani–Schild construction to a critique of conventionalism becomes evident when multi-geometry theories are contemplated. Much as Seeliger envisaged, the smooth massless limit indicates underdetermination of theories by data between massless and massive scalar gravities—indeed an unconceived alternative. At least one version easily could have been developed before General Relativity; it then would have motivated thinking of Einstein’s equations along the lines of Einstein’s newly re-appreciated “physical strategy” and particle physics, and would have suggested a rivalry from massive spin-2 variants of General Relativity (massless spin-2, as Pauli and Fierz found in 1939). The Putnam–Grünbaum debate on conventionality is revisited with an emphasis on the broad modal scope of conventionalist views. Massive scalar gravity thus contributes to a historically plausible rational reconstruction of much of 20th–21st century space–time philosophy in the light of particle physics. An appendix reconsiders the Malament–Weatherall–Manchak conformal restriction of conventionality and constructs the ‘universal force’ influencing the causal structure.

Subsequent works will discuss how massive gravity could have provided a template for a more Kant-friendly space–time theory that would have blocked Moritz Schlick’s supposed refutation of synthetic a priori knowledge, and how Einstein’s false analogy between the Neumann–Seeliger–Einstein modification of Newtonian gravity and the cosmological constant Λ generated lasting confusion that obscured massive gravity as a conceptual possibility.
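For reference (standard field-theory background, not a claim about the paper's specific formulation): a graviton mass term turns the wave equation for the potential into a Klein–Gordon equation, whose static, spherically symmetric solution decays exponentially, as in the Neumann–Seeliger proposal:

\[
(\Box - m^{2})\,\phi = -\kappa\, T
\qquad\Longrightarrow\qquad
\phi(r) \propto \frac{e^{-mr}}{r}
\quad \text{(static point source)},
\]

with the smooth massless limit \(m \to 0\) recovering the Newtonian \(1/r\) potential.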

12.
13.
In this paper I argue that the case of Einstein’s special relativity vs. Hendrik Lorentz’s ether theory can be decided in terms of empirical evidence, in spite of the predictive equivalence between the theories. In the historical and philosophical literature this case has typically been addressed by focusing on non-empirical features (non-empirical virtues in special relativity and/or non-empirical flaws in the ether theory). I claim that non-empirical features are not enough to provide a fully objective and uniquely determined choice in instances of empirical equivalence. However, I argue that if we consider arguments proposed by Richard Boyd, and by Larry Laudan and Jarrett Leplin, a choice based on non-entailed empirical evidence favoring Einstein’s theory can be made.

14.
15.
Our paper challenges the conventional wisdom that the flat maximum inflicts the ‘curse of insensitivity’ on the modelling of judgement and decision processes. In particular, we argue that this widely demonstrated failure of conventional statistical methods to differentiate between competing models has a useful role to play in the development of accessible and economical applied systems, since it allows a low-cost choice between systems which vary in their cognitive demands on the user and in their ease of development and implementation. To illustrate our thesis, we take two recent applications of linear scoring models used for credit scoring and for the prediction of sudden infant death. The paper discusses the nature and determinants of the flat maximum as well as its role in applied cognition. Other sections mention certain unanswered questions about the development of linear scoring models and briefly describe competing formulations for prediction.
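A minimal numerical sketch of the flat maximum (my own illustration with synthetic data; the variable names and numbers are hypothetical, not the paper's): when predictors are positively intercorrelated, a crude equal-weights scoring model predicts almost as well as the least-squares optimum.

import numpy as np

# Sketch of the "flat maximum" with synthetic data (hypothetical example,
# not from the paper): with intercorrelated predictors, very different
# weight vectors in a linear scoring model predict almost equally well.
rng = np.random.default_rng(0)
n, k = 5000, 5

# Intercorrelated predictors: a shared factor plus independent noise.
factor = rng.normal(size=(n, 1))
X = 0.7 * factor + 0.5 * rng.normal(size=(n, k))

# Outcome loads unevenly on the predictors.
true_w = np.array([0.9, 0.7, 0.5, 0.3, 0.1])
y = X @ true_w + rng.normal(size=n)

def predictive_r(w):
    """Correlation between the linear score X @ w and the outcome y."""
    return np.corrcoef(X @ w, y)[0, 1]

optimal = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares weights
equal = np.ones(k)                              # crude unit weighting

print(f"least-squares weights: r = {predictive_r(optimal):.3f}")
print(f"equal unit weights:    r = {predictive_r(equal):.3f}")
# The two correlations typically agree to about the second decimal place:
# the performance surface over weight vectors is nearly flat at the top.

On this picture, a designer can trade a small loss in accuracy for a large gain in simplicity of development and use, which is the positive role the paper claims for the flat maximum.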

16.
In this paper, I compare Pierre-Simon Laplace's celebrated formulation of the principle of determinism in his 1814 Essai philosophique sur les probabilités with the formulation of the same principle offered by Roger Joseph Boscovich in his Theoria philosophiae naturalis, published 56 years earlier. This comparison discloses a striking general similarity between the two formulations of determinism as well as certain important differences. Regarding their similarities, both Boscovich's and Laplace's conceptions of determinism involve two mutually interdependent components—ontological and epistemic—and they are both intimately linked with the principles of causality and continuity. Regarding their differences, however, Boscovich's formulation of the principle of determinism turns out not only to be temporally prior to Laplace's but also—being founded on fewer metaphysical principles and more rooted in and elaborated by physical assumptions—to be more precise, complete and comprehensive than Laplace's somewhat parenthetical statement of the doctrine. A detailed analysis of these similarities and differences, so far missing in the literature on the history and philosophy of the concept of determinism, is the main goal of the present paper.

17.
Perhaps the strongest argument for scientific realism, the no-miracles argument, has been said to commit the so-called base rate fallacy. The apparent elusiveness of the base rate of true theories has even been said to undermine the rationality of the entire realism debate. On the basis of the Kuhnian picture of theory choice, I confront this challenge by arguing that a theory is likely to be true if it possesses multiple theoretical virtues and is embraced by numerous scientists, even when the base rate converges to zero.
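One way to see the shape of such a response (a schematic Bayesian rendering of my own, not the paper's argument): if a theory exhibits virtues \(v_1,\dots,v_n\) that are each more probable given truth than given falsehood, the posterior odds multiply up,

\[
\frac{P(T \mid v_1,\dots,v_n)}{P(\neg T \mid v_1,\dots,v_n)}
 \;=\; \frac{P(T)}{P(\neg T)}\,
 \prod_{i=1}^{n} \frac{P(v_i \mid T)}{P(v_i \mid \neg T)},
\]

assuming the virtues are conditionally independent given the truth value; enough likelihood ratios greater than 1 can then offset even a very small prior \(P(T)\).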

18.
Linus Pauling played a key role in creating valence-bond theory, one of two competing theories of the chemical bond that appeared in the first half of the 20th century. While the chemical community preferred his theory over molecular-orbital theory for a number of years, valence-bond theory began to fall into disuse during the 1950s. This shift in the chemical community's perception of Pauling's theory motivated Pauling to defend the theory, and he did so in a peculiar way. Rather than publishing a defence of the full theory in leading journals of the day, Pauling published a defence of a particular model of the double bond predicted by the theory in a revised edition of his famous textbook, The Nature of the Chemical Bond. This paper explores that peculiar choice by considering both the circumstances that brought about the defence and the mathematical apparatus Pauling employed, using new discoveries from the Ava Helen and Linus Pauling Papers archive.

19.
Most of our knowledge of Greek and Roman scientific practice and its place in ancient culture is derived from our study of ancient texts. In the last few decades, this written evidence—ancient technical or specialist literature—has begun to be studied using tools of literary analysis to help answer questions about, for instance, how these works were composed, their authors’ intentions and the expectations of their readers.

This introduction to Structures and strategies in ancient Greek and Roman technical writing provides an overview of recent scholarship in the area, and of the difficulty in pinning down what ‘technical/specialist literature’ might mean in an ancient context, since Greek and Roman authors communicated scientific knowledge using a wide variety of styles and forms of text (e.g. poetry, dialogues, letters).

An outline of the three sections is provided: Form as a mirror of method, in which Sabine Föllinger and Alexander Mueller explore ways in which the structures of texts by Aristotle and Plutarch may reflect methodological concerns; Authors and their implied readers, with contributions by Oliver Stoll, David Creese, Boris Dunsch and Paula Olmos, which examines what ancient texts can tell us about the place of technical knowledge in antiquity; and Science and the uses of poetry, with articles by Jochen Althoff, Michael Coxhead and Laurence Totelin, and a new English translation of the Aetna poem by Harry Hine, which explores the (to us) unexpected roles of poetry in ancient scientific culture.

20.
The purpose of this paper is to offer a sympathetic reconstruction of the political thought of Paul Feyerabend. Using a critical discussion of the idea of the ‘free society’, it is suggested that his political thought is best understood in terms of three thematic concerns—liberation, hegemony, and the authority of science—and that the political significance of those claims becomes clear when they are considered in the context of his educational views. It emerges that Feyerabend is best understood as calling for the grounding of cognitive and cultural authorities—like the sciences—in informed deliberation, rather than in the uncritical embrace of prevailing convictions. A free society is thus best understood as one of epistemically responsible citizenship rather than of epistemically anarchistic relativism of the ‘anything goes’ sort—a striking anticipation of current debates about the philosophy of science in society.
