Similar Articles
20 similar articles retrieved (search time: 31 ms).
1.
We outline a framework for analyzing episodes from the history of science in which the application of mathematics plays a constitutive role in the conceptual development of empirical sciences. Our starting point is the inferential conception of the application of mathematics, recently advanced by Bueno and Colyvan (2011). We identify and discuss some systematic problems of this approach. We propose refinements of the inferential conception based on theoretical considerations and on the basis of a historical case study. We demonstrate the usefulness of the refined, dynamical inferential conception using the well-researched example of the genesis of general relativity. Specifically, we look at the collaboration of the physicist Einstein and the mathematician Grossmann in the years 1912–1913, which resulted in the jointly published “Outline of a Generalized Theory of Relativity and a Theory of Gravitation,” a precursor of the final theory of general relativity. In this episode, independently developed mathematical theories, the theory of differential invariants and the absolute differential calculus, were applied in the process of finding a relativistic theory of gravitation. The dynamical inferential conception not only provides a natural framework to describe and analyze this episode, but it also generates new questions and insights. We comment on the mathematical tradition on which Grossmann drew, and on his own contributions to mathematical theorizing. The dynamical inferential conception allows us to identify the role of heuristics and of mathematical resources, as well as the systematic role of problems and mistakes, in the reconstruction of episodes of conceptual innovation and theory change.

2.
3.
In The Theory of Relativity and A Priori Knowledge (1920b), Reichenbach developed an original account of cognition as coordination of formal structures to empirical ones. One of the most salient features of this account is that it is explicitly not a top-down type of coordination, and in fact it is crucially “directed” by the empirical side. Reichenbach called this feature “the mutuality of coordination” but, in that work, did not elaborate sufficiently on how this is supposed to work. In a paper that he wrote less than two years afterwards (but that he published only in 1932), “The Principle of Causality and the Possibility of its Empirical Confirmation” (1923/1932), he described what seems to be a model for this idea, now within an analysis of causality that results in an account of scientific inference. Recent reassessments of his early proposal do not seem to capture the extent of Reichenbach's original worries. The present paper analyses Reichenbach's early account and suggests a new way to look at his early work. According to it, we perform measurements, individuate parameters, and collect and analyse data by using a “constructive” approach, such as the one with which we formulate and test hypotheses, which paradigmatically requires some simplicity assumptions. Reichenbach's attempt to account for all these aspects in 1923 was obviously limited and naive in many ways, but it shows that, in his view, there were multiple ways in which the idea of “constitution” is embodied in scientific practice.

4.
Causal set theory and the theory of linear structures (which has recently been developed by Tim Maudlin as an alternative to standard topology) share some of their main motivations. In view of that, I raise and answer the question how these two theories are related to each other and to standard topology. I show that causal set theory can be embedded into Maudlin's more general framework and I characterise what Maudlin's topological concepts boil down to when applied to discrete linear structures that correspond to causal sets. Moreover, I show that all topological aspects of causal sets that can be described in Maudlin's theory can also be described in the framework of standard topology. Finally, I discuss why these results are relevant for evaluating Maudlin's theory. The value of this theory depends crucially on whether it is true that (a) its conceptual framework is as expressive as that of standard topology when it comes to describing well-known continuous as well as discrete models of spacetime and (b) it is even more expressive or fruitful when it comes to analysing topological aspects of discrete structures that are intended as models of spacetime. On one hand, my theorems support (a). The theory is rich enough to incorporate causal set theory and its definitions of topological notions yield a plausible outcome in the case of causal sets. On the other hand, the results undermine (b). Standard topology, too, has the conceptual resources to capture those topological aspects of causal sets that are analysable within Maudlin's framework. This fact poses a challenge for the proponents of Maudlin's theory to prove it fruitful.

5.
In this paper I assess whether the recently proposed “No De-Coupling” (NDC) theory of constitutive relevance in mechanisms is a useful tool to reconstruct constitutive relevance investigations in scientific practice. The NDC theory has been advanced as a framework theoretically superior to the mutual manipulability (MM) account of constitutive relevance in mechanisms but, in contrast to the MM account, has not yet been applied to detailed case studies. I argue that the NDC account is also applicable to empirical practice and that it fares better than the MM account on both theoretical and empirical grounds. I elaborate these claims in terms of applications of the NDC theory to two case studies of cognitive science research on the role of eye movements in mechanisms for cognitive capacities.

6.
7.
In my From Instrumentalism to Constructive Realism (2000) I have shown how an instrumentalist account of empirical progress can be related to nomic truth approximation. However, it was assumed that a strong notion of nomic theories was needed for that analysis. In this paper it is shown, in terms of truth and falsity content, that the analysis already applies when, in line with scientific common sense, nomic theories are merely assumed to exclude certain conceptual possibilities as nomic possibilities.

8.
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful, however, in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.

9.
In their book The Cognitive Structure of Scientific Revolutions, Hanne Andersen, Peter Barker, and Xiang Chen reconstruct Kuhn’s account of conceptual structure and change, based on the dynamic frame model. I argue against their reconstruction of anomalies and of the no-overlap principle and propose a competing model, based on the similarity relation. First, I introduce the concept of psychological distance between objects, and then I show that the conceptual structure of a theory consists of a set of natural families, separated by a significant empty space. I argue that, in such a conceptual structure, the ES condition, according to which the distance between natural families should be greater than the distance between any two objects belonging to the same natural family, is satisfied. Anomalous objects lead to the violation of this condition. I argue that in a conceptual structure satisfying the ES condition, a similarity relation could be defined, so that natural families would be similarity classes, satisfying the no-overlap principle. In a structure not satisfying this principle, such similarity classes could not be delimited.
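The ES condition described in this abstract lends itself to a small worked example. The following is a minimal sketch, not taken from the paper: it assumes a finite set of objects with given pairwise distances, and the function name, toy objects, and distance values are all illustrative.

```python
# Minimal sketch of the ES condition: every between-family distance must
# exceed every within-family distance. Function name, toy objects, and
# distance values are illustrative assumptions, not taken from the paper.
from itertools import combinations

def es_condition_holds(distance, families):
    """distance: dict mapping frozenset({a, b}) -> float; families: list of sets."""
    within = [distance[frozenset(pair)]
              for fam in families for pair in combinations(sorted(fam), 2)]
    between = [distance[frozenset((a, b))]
               for f1, f2 in combinations(families, 2) for a in f1 for b in f2]
    return min(between) > max(within)

# Two well-separated "natural families" satisfy the condition ...
d = {frozenset(p): 1.0 for p in [("a1", "a2"), ("b1", "b2")]}
d.update({frozenset((a, b)): 5.0 for a in ("a1", "a2") for b in ("b1", "b2")})
print(es_condition_holds(d, [{"a1", "a2"}, {"b1", "b2"}]))            # True

# ... while an anomalous object "x" sitting in the gap between them violates it.
d.update({frozenset(("x", o)): 2.0 for o in ("a1", "a2", "b1", "b2")})
print(es_condition_holds(d, [{"a1", "a2", "x"}, {"b1", "b2"}]))       # False
```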

10.
I summarize certain aspects of Paul Feyerabend's account of the development of Western rationalism, show the ways in which that account is supposed to run up against an alternative, that of Karl Popper, and then try to give a preliminary comparison of the two. My interest is primarily in whether what Feyerabend called his ‘story’ constitutes a possible history of our epistemic concepts and their trajectory. I express some grave reservations about that story, and about Feyerabend's framework, finding Popper's views less problematic here. However, I also suggest that one important aspect of Feyerabend's material, his treatment of religious belief, can be given an interpretation which makes it tenable, and perhaps preferable to a Popperian approach.

11.
In this paper I draw the distinction between intuitive and theory-relative accounts of the time reversal symmetry and identify problems with each. I then propose an alternative to these two types of accounts that steers a middle course between them and minimizes each account's problems. This new account of time reversal requires that, when dealing with sets of physical theories that satisfy certain constraints, we determine all of the discrete symmetries of the physical laws we are interested in and look for involutions that leave spatial coordinates unaffected and that act consistently across our physical laws. This new account of time reversal has the interesting feature that it makes the nature of the time reversal symmetry an empirical feature of the world without requiring us to assume that any particular physical theory is time reversal invariant from the start. Finally, I provide an analysis of several toy cases that reveals differences between my new account of time reversal and its competitors.

12.
The geometry of the alternative reconstruction of Eudoxan planetary theory is studied. It is shown that in this framework the hippopede acquires an analytical role, consolidating the theory's geometrical underpinnings. This removes the main point of incompatibility between the alternative reconstruction and Simplicius's account of Eudoxan planetary astronomy. The analysis also suggests a compass and straight-edge procedure for drawing a point-by-point outline of the retrograde loop created by any given arrangement of the three inner spheres.
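The point-by-point construction mentioned in this abstract can be mimicked numerically. The sketch below is not the paper's compass-and-straight-edge procedure: it simply composes the rotations of three nested spheres (a slow zodiacal drift plus two counter-rotating spheres with inclined axes, which generate the hippopede) and traces the carried point. The tilt angle, angular speeds, and the planet's initial placement are illustrative assumptions.

```python
# Numerical trace of the path of a point carried by three nested rotating
# spheres. Axis tilts, speeds, and the planet's placement are assumptions
# chosen only to illustrate the point-by-point construction.
import numpy as np

def rotation(axis, angle):
    """Rotation matrix about unit vector `axis` by `angle` (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    k = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * k + (1.0 - np.cos(angle)) * (k @ k)

incl = np.radians(15.0)            # tilt between the two innermost axes (assumed)
omega_zod, omega_syn = 0.2, 1.0    # zodiacal and synodic angular speeds (assumed)
planet0 = np.array([1.0, 0.0, 0.0])   # planet on the equator of the innermost sphere

path = []
for t in np.linspace(0.0, 2.0 * np.pi, 400):
    # innermost sphere: axis tilted by `incl` from the ecliptic pole, speed -omega_syn
    r_inner = rotation([np.sin(incl), 0.0, np.cos(incl)], -omega_syn * t)
    # middle sphere: ecliptic-pole axis, equal and opposite speed +omega_syn
    r_middle = rotation([0.0, 0.0, 1.0], omega_syn * t)
    # outer of the three: slow drift along the ecliptic
    r_outer = rotation([0.0, 0.0, 1.0], omega_zod * t)
    path.append(r_outer @ r_middle @ r_inner @ planet0)

path = np.array(path)
lon = np.degrees(np.arctan2(path[:, 1], path[:, 0]))  # ecliptic longitude
lat = np.degrees(np.arcsin(path[:, 2]))               # ecliptic latitude
print(lon[:3], lat[:3])   # plotting lon against lat outlines the retrograde loop
```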

13.
Recent insights into the conceptual structure of localization in QFT (modular localization) led to clarifications of old unsolved problems. The oldest one is the Einstein–Jordan conundrum which led Jordan in 1925 to the discovery of quantum field theory. This comparison of fluctuations in subsystems of heat bath systems (Einstein) with those resulting from the restriction of the QFT vacuum state to an open subvolume (Jordan) leads to a perfect analogy; the globally pure vacuum state becomes upon local restriction a strongly impure KMS state. This phenomenon of localization-caused thermal behavior as well as the vacuum-polarization clouds at the causal boundary of the localization region places localization in QFT in sharp contrast with quantum mechanics and justifies the attribute “holistic”. In fact it places the E–J Gedankenexperiment in the same conceptual category as the cosmological constant problem and the Unruh Gedankenexperiment. The holistic structure of QFT resulting from “modular localization” also leads to a revision of the conceptual origin of the crucial crossing property, which entered particle theory at the time of the bootstrap S-matrix approach but suffered from incorrect use in the S-matrix settings of the dual model and string theory. The new holistic point of view, which strengthens the autonomous aspect of QFT, also comes with new messages for gauge theory by exposing the clash between Hilbert space structure and localization and presenting alternative solutions based on the use of string-local fields in Hilbert space. Among other things, this leads to a reformulation of the Englert–Higgs symmetry breaking mechanism.

14.
In this paper we offer a formal-logical analysis of the famous reversibility objection against the Second Law of thermodynamics. We reconstruct the objection as a deductive argument leading to a contradiction, employing resources of standard quantified modal logic and thereby highlighting explicit and implicit assumptions with respect to possibility, identity, and their interaction. We then describe an alternative framework, case-intensional first order logic, that has greater expressive resources than standard quantified modal logic. We show that in that framework we can account for the role of sortals in possibility judgments. This allows us to formalize the relevant truths involved in the reversibility objection in such a way that no contradiction ensues. We claim that this analysis helps to understand in which way the Second Law is, specifically, a law of thermodynamics, but not of systems of particles in general.

15.
Methodologists in political science have advocated for causal process tracing as a way of providing evidence for causal mechanisms. Recent analyses of the method have sought to provide more rigorous accounts of how it provides such evidence. These accounts have focused on the role of process tracing for causal inference and specifically on the way it can be used with case studies for testing hypotheses. While the analyses do provide an account of such testing, they pay little attention to the narrative elements of case studies. I argue that the role of narrative in case studies is not merely incidental. Narrative does cognitive work by both facilitating the consideration of alternative hypotheses and clarifying the relationship between evidence and explanation. I consider the use of process tracing in a particular case (the Fashoda Incident) in order to illustrate the role of narrative. I argue that process tracing contributes to knowledge production in ways that the current focus on inference tends to obscure.

16.
Cassirer's philosophical agenda revolved around what appears to be a paradoxical goal, that is, to reconcile the Kantian explanation of the possibility of knowledge with the conceptual changes of nineteenth- and early twentieth-century science. This paper offers a new discussion of one way in which this paradox manifests itself in Cassirer's philosophy of mathematics. Cassirer articulated a unitary perspective on mathematics as an investigation of structures independently of the nature of individual objects making up those structures. However, this posed the problem of how to account for the applicability of abstract mathematical concepts to empirical reality. My suggestion is that Cassirer was able to address this problem by giving a transcendental account of mathematical reasoning, according to which the very formation of mathematical concepts provides an explanation of the extensibility of mathematical knowledge. In order to spell out what this argument entails, the first part of the paper considers how Cassirer positioned himself within the Marburg neo-Kantian debate over intellectual and sensible conditions of knowledge in 1902–1910. The second part compares what Cassirer says about mathematics in 1910 with some relevant examples of how structural procedures developed in nineteenth-century mathematics.

17.
In Dynamics of Reason Michael Friedman proposes a kind of synthesis between the neo-Kantianism of Ernst Cassirer, the logical empiricism of Rudolf Carnap, and the historicism of Thomas Kuhn. Cassirer and Carnap are to take care of the Kantian legacy of modern philosophy of science, encapsulated in the concept of the relativized a priori and the globally rational or continuous evolution of scientific knowledge, while Kuhn’s role is to ensure that the historicist character of scientific knowledge is taken seriously. More precisely, Carnapian linguistic frameworks guarantee that the evolution of science proceeds in a rational manner locally, while Cassirer’s concept of an internally defined conceptual convergence of empirical theories provides the means to maintain the global continuity of scientific reason. In this paper it is argued that Friedman’s neo-Kantian account of scientific reason based on the concept of the relativized a priori underestimates the pragmatic aspects of the dynamics of scientific reason. To overcome this shortcoming, I propose to reconsider C.I. Lewis’s account of a pragmatic a priori, recently modernized and elaborated by Hasok Chang. This may be considered as a first step to a dynamics of an embodied reason, less theoretical and more concrete than Friedman’s neo-Kantian proposal.

18.
Dark matter (DM) is an essential ingredient of the present Standard Cosmological Model, according to which only 5% of the mass/energy content of our universe is made of ordinary matter. In recent times, it has been argued that certain cases of gravitational lensing represent a new type of evidence for the existence of DM. In a recent paper, Peter Kosso attempts to substantiate that claim. His argument is that, although in such cases DM is only detected by its gravitational effects, gravitational lensing is a direct consequence of Einstein's Equivalence Principle (EEP) and therefore the complete gravitational theory is not needed in order to derive such lensing effects. In this paper I critically examine Kosso's argument: I confront the notion of empirical evidence involved in the discussion and argue that EEP does not have enough power by itself to sustain the claim that gravitational lensing in the Bullet Cluster constitutes evidence for the DM Hypothesis. As a consequence of this, it is necessary to examine the details of alternative theories of gravity to decide whether certain empirical situations are indeed evidence for the existence of DM. It may well be correct that gravitational lensing does constitute evidence for the DM Hypothesis—at present it is controversial whether the proposed modifications of gravitation all need DM to account for the phenomenon of gravitational lensing and if so, of which kind—but this will not be a direct consequence of EEP.

19.
In the above pages I have sketched a history of the genesis and comparative evaluation of the repressor model of genetic regulation of enzyme induction. I have not attempted in this article to carry out an analysis of the more scientifically interesting, fully developed Jacob-Monod operon theory of genetic regulation, but such an analysis of the operon theory would not, I believe, involve any additional logical or epistemological features beyond those discussed above. I have argued that the above account of the development of a theory of enzyme induction involved inferential moves and well-characterized desiderata, of both empirical and non-empirical character, in the genesis and evaluation of new hypotheses and theories. I have also contended that the reasoning displayed in the genesis of a theory is in large measure identical to that utilized in evaluating a theory. Both of these conclusions are at variance with the views of philosophers such as H. Reichenbach, Sir Karl Popper, and C.G. Hempel, who have argued that the genesis of new hypotheses is primarily an irrational affair and that only the context of justification is susceptible of rational reconstruction. In the alternative view presented here, scientific discovery and scientific justification represent the application, in contexts which are primarily telically distinguishable, of a fundamentally unitary logic of scientific inquiry.

20.
We aim to assess the ability of two alternative forecasting procedures to predict quarterly national account (QNA) aggregates. The application of Box–Jenkins techniques to observed data constitutes the basis of traditional ARIMA and transfer function methods (BJ methods). The alternative procedure exploits the information of unobserved high- and low-frequency components of time series (UC methods). An informal examination of empirical evidence suggests that the relationships between QNA aggregates and coincident indicators are often clearly different for diverse frequencies. Under these circumstances, a Monte Carlo experiment shows that UC methods significantly improve the forecasting accuracy of BJ procedures if coincident indicators play an important role in such predictions. Otherwise (i.e., under univariate procedures), BJ methods tend to be more accurate than the UC alternative, although the differences are small. We illustrate these findings with several applications from the Spanish economy with regard to industrial production, private consumption, business investment and exports.
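To make the contrast between the two procedures concrete, here is a minimal sketch on synthetic data rather than the Spanish QNA series used in the paper, assuming statsmodels is available: a Box–Jenkins-style ARIMA model with a coincident indicator as exogenous regressor versus an unobserved-components model with an explicit low-frequency trend. The model orders, the data-generating process, and all variable names are illustrative assumptions, not the paper's specifications.

```python
# Hedged sketch: BJ-style ARIMA-with-indicator vs. an unobserved-components
# model on synthetic quarterly data. Orders and the DGP are illustrative.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(0)
n, h = 120, 8                                     # quarters of history, forecast horizon
trend = np.cumsum(0.3 + 0.05 * rng.standard_normal(n + h))      # slow (low-frequency) component
indicator = np.sin(np.arange(n + h) * np.pi / 2) + 0.3 * rng.standard_normal(n + h)
y = trend + 0.8 * indicator + 0.2 * rng.standard_normal(n + h)   # synthetic "QNA aggregate"

y_train, y_test = y[:n], y[n:]
x_train, x_test = indicator[:n, None], indicator[n:, None]

# Box-Jenkins-style procedure: ARIMA with the coincident indicator as a regressor.
bj = SARIMAX(y_train, exog=x_train, order=(1, 1, 1)).fit(disp=False)
bj_fc = bj.forecast(steps=h, exog=x_test)

# Unobserved-components procedure: explicit low-frequency trend plus the same regressor.
uc = UnobservedComponents(y_train, level='local linear trend', exog=x_train).fit(disp=False)
uc_fc = uc.forecast(steps=h, exog=x_test)

rmse = lambda fc: np.sqrt(np.mean((fc - y_test) ** 2))
print(f"BJ RMSE: {rmse(bj_fc):.3f}   UC RMSE: {rmse(uc_fc):.3f}")
```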

