Similar Literature
20 similar documents found.
1.
This paper traces the origin of renormalization group concepts back to two strands of 1950s high energy physics: the causal perturbation theory programme, which gave rise to the Stueckelberg-Petermann renormalization group, and the debate about the consistency of quantum electrodynamics, which gave rise to the Gell-Mann-Low renormalization group. Recognising the different motivations that shaped these early approaches sheds light on the formal and interpretive diversity we find in contemporary renormalization group methods.
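For orientation, the object on which both strands converge can be written, in modern textbook notation rather than the notation of either 1950s paper, as the renormalization group equation

\mu \frac{dg}{d\mu} = \beta(g),

where g is the renormalized coupling and \mu the sliding renormalization scale; the Gell-Mann-Low analysis concerned the large-\mu behaviour of the effective coupling of quantum electrodynamics.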

2.
In this paper, I introduce a new historical case study into the scientific realism debate. During the late eighteenth century, the Scottish natural philosopher James Hutton made two important successful novel predictions. The first concerned granitic veins intruding from granite masses into strata. The second concerned what geologists now term “angular unconformities”: older sections of strata overlain by younger sections, the two resting at different angles, the former typically more inclined than the latter. These predictions, I argue, are potentially problematic for selective scientific realism in that constituents of Hutton's theory that would not be considered even approximately true today played various roles in generating them. The aim here is not to provide a full philosophical analysis but to introduce the case into the debate by detailing the history and showing why, at least prima facie, it presents a problem for selective realism. First, I explicate Hutton's theory. I then give an account of Hutton's predictions and their confirmations. Next, I explain why these predictions are relevant to the realism debate. Finally, I consider which constituents of Hutton's theory are, according to current beliefs, true (or approximately true), which are not (even approximately) true, and which were responsible for these successes.

3.
Cassirer's philosophical agenda revolved around what appears to be a paradoxical goal, that is, to reconcile the Kantian explanation of the possibility of knowledge with the conceptual changes of nineteenth and early twentieth-century science. This paper offers a new discussion of one way in which this paradox manifests itself in Cassirer's philosophy of mathematics. Cassirer articulated a unitary perspective on mathematics as an investigation of structures independently of the nature of individual objects making up those structures. However, this posed the problem of how to account for the applicability of abstract mathematical concepts to empirical reality. My suggestion is that Cassirer was able to address this problem by giving a transcendental account of mathematical reasoning, according to which the very formation of mathematical concepts provides an explanation of the extensibility of mathematical knowledge. In order to spell out what this argument entails, the first part of the paper considers how Cassirer positioned himself within the Marburg neo-Kantian debate over intellectual and sensible conditions of knowledge in 1902–1910. The second part compares what Cassirer says about mathematics in 1910 with some relevant examples of how structural procedures developed in nineteenth-century mathematics.

4.
Taking a cue from remarks Thomas Kuhn makes in 1990 about the historical turn in philosophy of science, I examine the history of history and philosophy of science within parts of the British philosophical context in the 1950s and early 1960s. During this time, ordinary language philosophy's influence was at its peak. I argue that the ordinary language philosophers' methodological recommendation to analyze actual linguistic practice influences several prominent criticisms of the deductive-nomological model of scientific explanation and that these criticisms relate to the historical turn in philosophy of science. To show these connections, I primarily examine the work of Stephen Toulmin, who taught at Oxford from 1949 to 1954, and Michael Scriven, who completed a dissertation on explanation under Gilbert Ryle and R.B. Braithwaite in 1956. I also consider Mary Hesse's appeal to an ordinary language-influenced account of meaning in her account of the role of models and analogies in scientific reasoning, and W.H. Watson's Wittgensteinian philosophy of science, an early influence on Toulmin. I think there are two upshots to my historical sketch. First, it fills out details of the move away from logical positivism to more historical- and practice-focused philosophies of science. Second, questions about linguistic meaning and the proper targets and aims of philosophical analysis are part and parcel of the historical turn, as well as its reception. Looking at the philosophical background during which so-called linguistic philosophers also had a hand in bringing these questions to prominence helps us understand why.

5.
An early example is von Neumann's and Charney's Princeton Meteorological Project of 1946–53, which ended with daily numerical prediction taking less than 2 hours. After this stage, the questions of long-range forecasting and of the general circulation of the atmosphere became more important. The late 1950s saw the emergence of a debate: were atmospheric models to be used mainly for prediction or for understanding? This debate surfaced in particular at an important colloquium in Tokyo in 1960, which gathered together J. Charney, E. Lorenz, A. Eliassen, and B. Saltzman, among others, and witnessed discussions on statistical methods for prediction and/or the maximum simplification of the dynamic equations. This phase ended in 1963 with Lorenz's seminal paper “Deterministic Nonperiodic Flow.”
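The 1963 paper mentioned here reduced thermal convection to three coupled ordinary differential equations, written in now-standard notation as

\dot{x} = \sigma(y - x), \qquad \dot{y} = x(\rho - z) - y, \qquad \dot{z} = xy - \beta z,

whose sensitive dependence on initial conditions became the canonical argument for intrinsic limits on long-range deterministic forecasting.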

6.
Summary

A famous debate between John Ray, Joseph Pitton de Tournefort and Augustus Quirinus Rivinus at the end of the seventeenth century has often been referred to as signalling the beginning of a rift between classificatory methods relying on logical division and classificatory methods relying on empirical grouping. Interestingly, a couple of decades later, Linnaeus showed very little excitement in reviewing this debate, even though he was the first to introduce the terminological distinction between artificial and natural methods. In this paper, I will explain Linnaeus's indifference by the fact that the earlier debates revolved around problems of plant diagnosis rather than classification. From Linnaeus's perspective, they were therefore concerned with what he called artificial methods alone – diagnostic tools, that is, which were artificial no matter which characters were taken into account. The natural method Linnaeus proposed, on the other hand, was not about diagnosis, but about relations of equivalence which played a vital, although largely implicit, role in the practices of specimen exchange on which naturalists relied to acquire knowledge of the natural world.

7.
This is the second in a series of three papers that charts the history of the Lenz–Ising model (commonly called just the Ising model in the physics literature) in considerable detail, from its invention in the early 1920s to its recognition as an important tool in the study of phase transitions by the late 1960s. By focusing on the development in physicists’ perception of the model’s ability to yield physical insight—in contrast to the more technical perspective in previous historical accounts, for example, Brush (Rev Modern Phys 39: 883–893, 1967) and Hoddeson et al. (Out of the Crystal Maze. Chapters from the History of Solid-State Physics. Oxford University Press, New York, pp. 489–616, 1992)—the series aims to cover and explain in depth why this model went from relative obscurity to a prominent position in modern physics, and to examine the consequences of this change. In the present paper, which is self-contained, I deal with the development from the early 1950s to the 1960s and document that this period witnessed a major change in the perception of the model: In the 1950s it was not in the cards that the model was to become a pivotal tool of theoretical physics in the following decade. In fact, I show, based upon recollections and research papers, that many of the physicists in the 1950s interested in understanding phase transitions saw the model as irrelevant for this endeavor because it oversimplifies the nature of the microscopic constituents of the physical systems exhibiting phase transitions. However, one group, Cyril Domb’s in London, held a more positive view during this decade. To bring out the basis for their view, I analyze in detail their motivation and work. In the last part of the paper I document that the model was seen as much more physically relevant in the early 1960s and examine the development that led to this change in perception. I argue that the main factor behind the change was the realization of the surprising and striking agreement between aspects of the model, notably its critical behavior, and empirical features of the physical phenomena.
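For reference, the model at issue assigns to each configuration of spins s_i = ±1 on a lattice the energy

E = -J \sum_{\langle i,j \rangle} s_i s_j - h \sum_i s_i,

where the first sum runs over nearest-neighbour pairs, J is the exchange coupling, and h an external field; the critical behavior mentioned above concerns the singular behaviour of quantities such as the specific heat and the spontaneous magnetization near the transition temperature.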

8.
About a century ago, Ernst Mach argued that Archimedes’s deduction of the principle of the lever is invalid, since its premises contain the conclusion to be demonstrated. Subsequently, many scholars defended Archimedes, mostly on historical grounds, by raising objections to Mach’s reconstruction of Archimedes’s deduction. In the debate, the Italian philosopher and historian of science Giovanni Vailati stood out. Vailati responded to Mach with an analysis of Archimedes’s deduction which was later quoted and praised by Mach himself. In this paper, my objective is to show that the debate can be further advanced, as Mach indicated, by reframing it in terms of the empirical vs. the logical dimensions of mechanics. In this way, I will suggest, the debate about Archimedes’s deduction can be resolved in Mach’s favour.
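The deduction at issue concerns the law of the lever: weights W_1 and W_2 balance at distances d_1 and d_2 from the fulcrum exactly when

W_1 d_1 = W_2 d_2.

Mach's charge of circularity, in compact form, was that deriving this inverse proportionality from the equilibrium of equal weights at equal distances tacitly presupposes that only the product of weight and arm length is mechanically relevant.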

9.
In this discussion paper, I seek to challenge Hylarie Kochiras’ recent claims on Newton’s attitude towards action at a distance, which will be presented in Section 1. In doing so, I shall include the positions of Andrew Janiak and John Henry in my discussion and present my own take on the matter (Section 2). Additionally, I seek to strengthen Kochiras’ argument that Newton sought to explain the cause of gravity in terms of secondary causation (Section 3). I also offer some specification of what Kochiras calls ‘Newton’s substance counting problem’ (Section 4). In conclusion, I suggest a historical correction (Section 5).

10.
The revolution in geology, initiated with Alfred Wegener’s theory of continental drift, has been the subject of many philosophical discussions aiming at resolving the problem of rationality underlying this historical episode. Even though the debate included analyses in terms of scientific methodology, applications of concrete accounts of epistemic justification to this case study have been rare. In particular, the question as to whether Wegener’s theory was epistemically worthy of pursuit in the first half of the twentieth century, that is, in its early development, remained open or inadequately addressed. The aim of this paper is to offer an answer to this question. The evaluation of Drift will be done by means of an account of theory evaluation suitable for the context of pursuit, developed in Šešelja and Straßer (accepted for publication). We will argue that pursuing the theory of continental drift was rational, i.e., that it was irrational to reject its pursuit as unworthy.

11.
Sidney Dancoff’s paper “On Radiative Corrections for Electron Scattering” is generally viewed in the secondary literature as a failed attempt to develop renormalized quantum electrodynamics (QED) a decade early, an attempt that failed because of a mistake that Dancoff made. I will discuss Dancoff’s mistake and try to reconstruct why it occurred, by relating it to the usual practices of the quantum field theory of his time. I will also argue against the view that Dancoff was on the verge of developing renormalized QED and will highlight the conceptual divides that separate Dancoff’s work from the QED of the late 1940s. I will finally discuss how the established view of Dancoff’s paper came to be and how the reading of this specific anecdote relates to more general assessments of the conceptual advances of the late 1940s (covariant techniques, renormalization), in particular to their assessment as being conservative rather than revolutionary.

12.
Douglas Hartree and Hilleth Thomas were graduate students together at Cambridge University in the mid-1920s. Each developed an important approximation method to calculate the electronic structure of atoms. Each went on to make significant contributions to numerical analysis and to the development of scientific computing. Their early efforts were fused in the mid-1960s with the development of an approach to the many-particle problem in quantum mechanics called density functional theory. This paper discusses the experiences which led Hartree and Thomas to their approximations, outlines the similarities in their subsequent careers, and highlights the essential role their work played in the foundational papers of modern density functional theory.
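The two approximations can be stated compactly. Thomas's statistical approach (found independently by Fermi) treats the electrons as a local gas, giving, in atomic units, the kinetic-energy functional

T_{\mathrm{TF}}[n] = \frac{3}{10}(3\pi^2)^{2/3} \int n(\mathbf{r})^{5/3}\, d^3r,

while Hartree's self-consistent field method determines one-electron orbitals from an effective potential built out of the charge distribution of the remaining electrons. Density functional theory, founded in the Hohenberg–Kohn (1964) and Kohn–Sham (1965) papers, fuses the density-as-basic-variable idea of the former with orbital equations of Hartree type.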

13.
The London and Bauer monograph occupies a central place in the debate concerning the quantum measurement problem. Gavroglu has previously noted the influence of Husserlian phenomenology on London's scientific work. However, he has not explored the full extent of this influence in the monograph itself. I begin this paper by outlining the important role played by the monograph in the debate. In effect, it acted as a kind of ‘lens’ through which the standard, or Copenhagen, ‘solution’ to the measurement problem came to be perceived and, as such, it was robustly criticized, most notably by Putnam and Shimony. I then spell out the Husserlian understanding of consciousness in order to illuminate the traces of this understanding within the London and Bauer text. This, in turn, yields a new perspective on this ‘solution’ to the measurement problem, one that I believe has not been articulated before and, furthermore, which is immune to the criticisms of Putnam and Shimony.

14.
In his later writings Kuhn reconsidered his earlier account of incommensurability, clarifying some aspects, modifying others, and explicitly rejecting some of his earlier claims. In Kuhn’s new account incommensurability does not pose a problem for the rational evaluation of competing scientific theories, but does pose a problem for certain forms of realism. Kuhn maintains that, because of incommensurability, the notion that science might seek to learn the nature of things as they are in themselves is incoherent. I develop Kuhn’s new account of incommensurability, respond to his anti-realist argument, and sketch a form of realism in which the realist aim is a pursuable goal.

15.
Some conceptual issues in the foundations of classical electrodynamics concerning the interaction between particles and fields have recently received increased attention among philosophers of physics. After a brief review of the debate, I argue that there are essentially two incompatible solutions to these issues, corresponding to F.A. Muller's distinction between the extension and the renormalization program. Neither of these solutions comes free of cost: the extension program is plagued with all the problems related to extended elementary charges, while the renormalization program works with point charges but incurs the notorious divergences of the field energies. The aim of this paper is to bring back into the discussion a third alternative, the action-at-a-distance program, which avoids both the riddles of extended elementary charges and the divergences, although it admittedly has problems of its own. I will discuss why action-at-a-distance theories are actually not a far cry from particle–field theories, and I will argue that the main reasons for rejecting action-at-a-distance theories originate in certain metaphysical prejudices about locality and energy conservation. I will broadly suggest how these concepts could be adapted in order to allow for action at a distance.
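The divergence at stake is easy to exhibit: the electrostatic energy stored in the field of a charge q outside a radius a is

U = \frac{q^2}{8\pi\varepsilon_0 a},

which grows without bound as a \to 0. The extension program keeps a finite and inherits the dynamical problems of rigid extended charges; the point-charge program must somehow absorb this infinite self-energy.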

16.
This paper examines the interweaving of the history of quantum decoherence and the interpretation problem in quantum mechanics through the work of two physicists—H. Dieter Zeh and Wojciech Zurek. In the early 1970s Zeh anticipated many of the important concepts of decoherence, framing it within an Everett-type interpretation. Zeh has since remained committed to this view; however, Zurek, whose papers in the 1980s were crucial in the treatment of the preferred basis problem and the subsequent development of density matrix formalism, has argued that decoherence leads to what he terms the ‘existential interpretation’, compatible with certain aspects of both Everett's relative-state formulation and Bohr's ‘Copenhagen interpretation’. I argue that these different interpretations can be traced back to the different early approaches to the study of environment-induced decoherence in quantum systems, evident in the early work of Zeh and Zurek. I also show how Zurek's work has contributed to the tendency to see decoherence as contributing to a ‘new orthodoxy’ or a reconstruction of the original Copenhagen interpretation.
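Schematically, the mechanism both authors studied acts at the level of the reduced density matrix: for a system prepared in a superposition \sum_i c_i |s_i\rangle, entanglement with the environment drives

\rho_S = \mathrm{Tr}_E\,\rho \longrightarrow \sum_i |c_i|^2\, |s_i\rangle\langle s_i|,

so that the off-diagonal interference terms between the preferred (‘pointer’) states decay on an extremely short decoherence timescale.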

17.
18.
In The empirical stance, Bas van Fraassen argues for a reconceptualization of empiricism, and a rejection of its traditional rival, speculative metaphysics, as part of a larger and provocative study in epistemology. Central to his account is the notion of voluntarism in epistemology, and a concomitant understanding of the nature of rationality. In this paper I give a critical assessment of these ideas, with the ultimate goal of clarifying the nature of debate between metaphysicians and empiricists, and more specifically, between scientific realists and empiricist antirealists. Despite van Fraassen’s assertion to the contrary, voluntarism leads to a form of epistemic relativism. Rather than stifling debate, however, this ‘stance’ relativism places precise constraints on possibilities for constructive engagement between metaphysicians and empiricists, and thus distinguishes, in broad terms, paths along which this debate may usefully proceed from routes which offer no hope of progress.

19.
In this paper I offer an account of the normative dimension implicit in D. Bernoulli’s expected utility functions by means of an analysis of the juridical metaphors upon which the concept of mathematical expectation was moulded. Following a suggestion by the late E. Coumet, I show how this concept incorporated a certain standard of justice which was put in question by the St. Petersburg paradox. I contend that Bernoulli would have solved it by introducing an alternative normative criterion rather than a positive model of decision making processes.
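The paradox in question can be stated in a few lines: a fair coin is tossed until heads first appears, and a player whose first heads comes on the n-th toss receives 2^{n-1} ducats. The mathematical expectation of the game is then

E = \sum_{n=1}^{\infty} 2^{-n} \cdot 2^{n-1} = \sum_{n=1}^{\infty} \tfrac{1}{2} = \infty,

yet nobody would pay more than a modest sum to play. Bernoulli's remedy was to evaluate the game by a logarithmic utility instead, under which the expected ‘moral value’ \sum_n 2^{-n} \log 2^{n-1} is finite.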

20.
In 1965, John A. Pople presented a paper entitled 'Two-Dimensional Chart of Quantum Chemistry' to illustrate the inverse relationship between the sophistication of computational methods and the size of the molecules under study. This chart, later called the 'hyperbola of quantum chemistry', succinctly summarized the growing tension between the proponents of two different approaches to computation, the ab initio method and the semiempirical method, in the early years of electronic digital computers. Examining the development of quantum chemistry after World War II, I focus on the role of computers in shaping disciplinary identity. The availability of high-speed computers in the early 1950s attracted much attention from quantum chemists, and their community took shape through a series of conferences and personal networking. However, this emerging community soon encountered the problem of communication between groups that differed in the degree of reliance they placed on computers. I show the complexity of interactions between computing technology and a scientific discipline, in terms of both forming and splitting the community of quantum chemistry.
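Pople's chart can be paraphrased, in a schematic rendering that is not his own formula, as the constraint that for fixed computing resources

(\text{sophistication of method}) \times (\text{size of molecule}) \approx \text{constant},

so that rigorous ab initio calculations were confined to small molecules, while semiempirical methods bought molecular size at the price of approximation.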
