Similar Documents
20 similar documents found (search time: 31 ms)
1.
It is with good reason that the name Rutherford is closely linked with the early history of alpha particles. He discovered them, determined their nature, and from 1909 used them to probe the structure of the atom. From 1898 to 1902 Rutherford construed alpha radiation as a type of non-particulate Röntgen radiation. On his theory of the locomotion of radioactive particles Rutherford proposed that alpha radiation consisted of negatively charged particles. During 1902 he confirmed the particulate nature of alpha radiation but discovered that these alpha particles were positively charged. Although Rutherford suspected from 1903 that these alpha particles were related somehow to helium, the proof required six long years of investigation. By mid-1908 it seemed certain that the alpha particle possessed two units of the elementary charge. Since the e/m ratio had already been determined for alpha particles, this evidence enhanced the suspected connection with helium. However, this gain and loss of charge was still construed as an ionization effect. Since as late as 1908 gaseous ionization was assumed to involve the gain or loss of a single unit of charge, Rutherford's alleged case of doubly ionized alpha particles was presumably an exception. Yet helium was known to be an inert gas and thus hardly a likely candidate for such exceptional ionization behaviour. To establish the connection, therefore, Rutherford resorted to a spectroscopic test. He collected spent alpha particles shot into a thin glass tube and gradually observed the spectrum of helium. Rutherford had thus been correct in his assumption, but a proper explanation was possible only after the confirmation of the nuclear structure of the atom.

2.
In this paper we examine the reaction of the Leiden low-temperature laboratory of Heike Kamerlingh Onnes to new ideas in quantum theory. In particular, the contributions of Albert Einstein (1906) and Peter Debye (1912) to the theory of specific heat, and the concept of zero-point energy formulated by Max Planck in 1911, gave a boost to solid-state research aimed at testing these theories. In the case of specific heat measurements, Kamerlingh Onnes's laboratory faced stiff competition from Walter Nernst's Institute of Physical Chemistry in Berlin. In fact, Berlin got the better of it because Leiden lacked focus. After the liquefaction of helium in 1908, Kamerlingh Onnes transformed his laboratory into an international facility for low temperature research, and for this reason it was impossible to make headway with the specific heat measurements. In the case of zero-point energy, Leiden developed a magnetic research programme to test the concept. Initially the balance of evidence seemed to be tipping in favour of zero-point energy. After 1914, however, Leiden would desert the theory in favour of a concept from classical physics, a curious move that illustrates Kamerlingh Onnes's discomfort with the new quantum theory.

3.
4.
The term “analogy” stands for a variety of methodological practices all related in one way or another to the idea of proportionality. We claim that in his first substantial contribution to electromagnetism James Clerk Maxwell developed a methodology of analogy which was completely new at the time or, to borrow John North’s expression, Maxwell’s methodology was a “newly contrived analogue”. In his initial response to Michael Faraday’s experimental researches in electromagnetism, Maxwell did not seek an analogy with some physical system in a domain different from electromagnetism as advocated by William Thomson; rather, he constructed an entirely artificial one to suit his needs. Following North, we claim that the modification which Maxwell introduced to the methodology of analogy has not been properly appreciated. In view of our examination of the evidence, we argue that Maxwell gave a new meaning to analogy; in fact, it comes close to modeling in current usage.

5.
In spite of the paradigmatic status of the Big Bang model of the universe, the genesis of this idea has never been examined in detail. This paper investigates how the Belgian physicist and cosmologist Georges Lemaître in 1931 arrived at the hypothesis that the universe had begun in a Big Bang, or what he called a ‘primeval atom’. Four years earlier, he had suggested a closed expanding model in which the universe slowly inflated from an equilibrium Einstein state, but in 1931 he advocated an abrupt beginning from an initial, superdense concentration of nuclear matter. Why did Lemaître believe that the universe had a definite beginning a finite time ago? It turns out that the law of increase of entropy was one motivation, and that the existence of long-lived radioactive substances was another. Contrary to what is often stated, he most likely had the idea of an exploding universe before 1931. Among his chief inspirations to think about the origin of the universe, we draw attention to his persistent fascination with light as the primeval state of the world. Although this idea was originally seen in a theological perspective, religion played no direct role in Lemaître's hypothesis of 1931.

6.
Existing scholarship on animal models tends to foreground either of the two major roles research organisms play in different epistemic contexts, treating their representational and instrumental roles separately. Based on an empirical case study, this article explores the changing relationship between the two epistemic roles of a research organism over the span of a decade, while the organism was used to achieve various knowledge ends. This rat model was originally intended as a replica of human susceptibility to cardiac arrest. In a fortunate stroke of serendipity, however, the experimenters detected the way mother-infant interactions regulated the pups’ resting cardiac rate. This intriguing outcome thus became the model’s new representational target and began driving the development of an experimental system. Henceforth, the model acquired an instrumental function, serving to detect and measure system-specific differences. Its subsequent development involved creating stimulus-response measures to explain and theorize those differences. It was this instrumental use of the model that pushed the experimenters into uncharted territory and conferred on the model an ability to adapt to varied epistemic contexts. Despite the prominence of this instrumental role, however, the model’s representational power continued to guide research. The model’s representational target was widened beyond heart rate to reflect other functional phenomena, such as behavioral activity and sleep/wake rhythm. The rat model was thus transformed from an experimental organism designed to instantiate cardiac regulation to a model organism taken to represent the development of a whole, intact animal under the regulatory influence of maternal care. This article examines this multifaceted transformation within the context of the salient shifts in modeling practice and variations in the model’s representational power. It thus explores how the relationship between the representational and instrumental uses of the model changed with respect to the varying exigencies of the investigative context, foregrounding its contextual versatility.

7.
8.
James Geikie's Great Ice Age (1874) first presented to the geological public the modern interpretation of alternating mild and cold periods during the Pleistocene. Though it was supported by geological evidence, Geikie's view of the Ice Age was based on a theoretical framework supplied by the climatic physics of James Croll. Mid-nineteenth-century British geologists had encountered great difficulty in making sense out of the varied and complicated glacial deposits, or ‘drift’, and had formulated the ‘iceberg’ theory to account for the apparent chaos of the drift, an explanation which discouraged its stratigraphic study. The reaffirmation of faith in continental glaciation by several Scottish geologists in the 1850s brought with it a belief in an eventful Pleistocene, but it remained difficult to discover the events of Ice Age history from study of the glacial deposits. In 1864 Croll presented a detailed climatic history of the Ice Age deduced from astronomy and physical geography. By 1871 James Geikie was using Croll's scheme of Ice Age history as the basis for his impressive synthesis of Pleistocene data from throughout the world.

9.
In 1925 a debate erupted in the correspondence columns of the British Medical Journal concerning the effectiveness of eating raw pancreas as a treatment for diabetes. Enthusiasts were predominantly general practitioners (GPs), who claimed success for the therapy on the basis of their clinical impressions. Their detractors were laboratory-oriented ‘biochemist-physicians’, who considered that their own experiments demonstrated that raw pancreas therapy was ineffective. The biochemist-physicians consistently dismissed the GPs' observations as inadequately ‘controlled’. They did not define the meaning of ‘control’ in this context, although it clearly did not have the term's present-day meaning of a trial employing an untreated comparison group of patients. Rather, the biochemist-physicians' ‘properly controlled’ experiments involved careful regulation of their patients' diet and other environmental factors, and evaluation of the therapy's success through biochemical, rather than just clinical, criteria. However, my analysis suggests that these factors alone are inadequate to account for the biochemist-physicians' dismissal of the GPs' work as ‘uncontrolled’. I suggest that the biochemist-physicians were deliberately exploiting the powerful rhetorical connotations of the term ‘control’. Ultimately, they implied that only a trial which they themselves had conducted could be deemed ‘adequately controlled’.

10.
This paper revisits the debate between Harry Collins and Allan Franklin, concerning the experimenters' regress. Focusing my attention on a case study from recent psychology (regarding experimental evidence for the existence of a Mozart Effect), I argue that Franklin is right to highlight the role of epistemological strategies in scientific practice, but that his account does not sufficiently appreciate Collins's point about the importance of tacit knowledge in experimental practice. In turn, Collins rightly highlights the epistemic uncertainty (and skepticism) surrounding much experimental research. However, I will argue that his analysis of tacit knowledge fails to elucidate the reasons why scientists often are (and should be) skeptical of other researchers' experimental results. I will present an analysis of tacit knowledge in experimental research that not only answers to this desideratum, but also shows how such skepticism can in fact be a vital enabling factor for the dynamic processes of experimental knowledge generation.

11.
The first classification of general types of transition between phases of matter, introduced by Paul Ehrenfest in 1933, lies at a crossroads in the thermodynamical study of critical phenomena. It arose following the discovery in 1932 of a surprising new phase transition in liquid helium, the “lambda transition,” when W. H. Keesom and coworkers in Leiden, Holland observed a λ-shaped “jump” discontinuity in the curve giving the temperature dependence of the specific heat of helium at a critical value. This apparent jump led Ehrenfest to introduce a classification of phase transitions on the basis of jumps in derivatives of the free energy function. This classification was immediately applied by A.J. Rutgers to the study of the transition from the normal to superconducting state in metals. Eduard Justi and Max von Laue soon questioned the possibility of its class of “second-order phase transitions” -- of which the “lambda transition” was believed to be the archetype -- but C.J. Gorter and H.B.G. Casimir used an “order parameter” to demonstrate their existence in superconductors. As a crossroads of study, the Ehrenfest classification was forced to undergo a slow, adaptive evolution during subsequent decades. During the 1940s the classification was increasingly used in discussions of liquid-gas, order-disorder, paramagnetic-ferromagnetic and normal-superconducting phase transitions. Already in 1944, however, Lars Onsager's solution of the Ising model for two-dimensional magnets was seen to possess a derivative with a logarithmic divergence rather than a jump as the critical point was approached. In the 1950s, experiments further revealed the lambda transition in helium to exhibit similar behavior. Rather than being a prime example of an Ehrenfest phase transition, the lambda transition was seen to lie outside the Ehrenfest classification. The Ehrenfest scheme was then extended to include such singularities, most notably by A. Brian Pippard in 1957, with widespread acceptance. During the 1960s these logarithmic infinities were the focus of the investigation of “scaling” by Leo Kadanoff, B. Widom and others. By the 1970s, a radically simplified binary classification of phase transitions into “first-order” and “continuous” transitions was increasingly adopted.
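For reference, the Ehrenfest criterion that the abstract describes can be stated compactly; this is the standard textbook formulation rather than anything taken from the paper itself:

```latex
% A transition is of $n$th order in Ehrenfest's sense if the Gibbs free
% energy $G(T,p)$ and its derivatives up to order $n-1$ are continuous
% across the transition, while some $n$th derivative jumps, e.g.
\[
  S = -\left(\frac{\partial G}{\partial T}\right)_{p},
  \qquad
  C_p = -T\left(\frac{\partial^{2} G}{\partial T^{2}}\right)_{p}.
\]
% First order: the entropy $S$ (a first derivative) is discontinuous,
% giving a latent heat. Second order: $S$ is continuous but the specific
% heat $C_p$ (a second derivative) shows a finite jump -- the behaviour
% the lambda transition was initially believed to display.
```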

12.
Commentators often claim that the bodies of Spinoza’s physics resist the changes they undergo. But it’s not always clear what they mean when they say this, or whether they are entitled to say it. This article clarifies what it might mean for Spinoza’s bodies to resist change, and examines the evidence for such a view. In the first half, the author argues that there is some limited evidence for such a view, but not nearly as much as people think. In the second half, the author proposes looking for a mental analogue to collision in the realm of ideas and argues that adequacy amounts to a meaningful concept of resistance in Spinoza, albeit one that is incomplete.

13.
Well-known in his day, but overlooked since, Erasmus King lectured in natural and experimental philosophy from the 1730s until 1756 at his Westminster home and twenty other venues, publicizing his frequent courses exclusively in the Daily Advertiser. In 1739 he escorted Desaguliers's youngest son to Russia, hoping to demonstrate experimental philosophy to the Russian empress. En route, he conducted trials with a sea-gauge in the Baltic which were reported by Stephen Hales in his Statical Essays. Various sources testify to King's subsequent experimental research for Hales in the fields of anatomy, respiration and electricity. There is recorded evidence for the exceptional range and quality of King's scientific apparatus and models.

14.
Recent years have seen the development of an approach both to general philosophy and philosophy of science often referred to as ‘experimental philosophy’ or just ‘X-Phi’. Philosophers often make or presuppose empirical claims about how people would react to hypothetical cases, but their evidence for claims about what ‘we’ would say is usually very limited indeed. Philosophers of science have largely relied on their more or less intimate knowledge of their field of study to draw hypothetical conclusions about the state of scientific concepts and the nature of conceptual change in science. What they are lacking is some more objective quantitative data supporting their hypotheses. A growing number of philosophers (of science), along with a few psychologists and anthropologists, have tried to remedy this situation by designing experiments aimed at systematically exploring people’s reactions to philosophically important thought experiments or scientists’ use of their scientific concepts. Many of the results have been surprising and some of the conclusions drawn from them have been more than a bit provocative. This symposium attempts to provide a window into this new field of philosophical inquiry and to show how experimental philosophy provides crucial tools for the philosopher and encourages two-way interactions between scientists and philosophers.

15.
Mitochondria are crucial organelles owing to their role in the cellular energy production of eukaryotes. Because brain cells demand high energy to maintain their normal activities, disturbances in mitochondrial physiology may lead to the neuropathological events underlying neurodegenerative conditions such as Alzheimer’s disease, Parkinson’s disease and Huntington’s disease. Melatonin is an endogenous compound with a variety of physiological roles. In addition, it possesses potent antioxidant properties which effectively play protective roles in several pathological conditions. Several lines of evidence also reveal roles of melatonin in mitochondrial protection, which could prevent the development and progression of neurodegeneration. Since mitochondrial dysfunction is a primary event in neurodegeneration, the neuroprotection afforded by melatonin is thereby more effective in the early stages of these diseases. This article reviews the mechanisms by which melatonin exerts its protective roles on mitochondria as a potential therapeutic strategy against neurodegenerative disorders.

16.
Thomas Kuhn and Paul Feyerabend promote incommensurability as a central component of their conflicting accounts of the nature of science. This paper argues that in so doing, they both develop Albert Einstein's views, albeit in different directions. Einstein describes scientific revolutions as conceptual replacements, not mere revisions, endorsing ‘Kant-on-wheels’ metaphysics in light of ‘world change’. Einstein emphasizes underdetermination of theory by evidence, rational disagreement in theory choice, and the non-neutrality of empirical evidence. Einstein even uses the term ‘incommensurable’ specifically to apply to challenges posed to comparatively evaluating scientific theories in 1949, more than a decade before Kuhn and Feyerabend. This analysis shows how Einstein anticipates substantial components of Kuhn and Feyerabend's views, and suggests that there are strong reasons to suspect that Kuhn and Feyerabend were directly inspired by Einstein's use of the term ‘incommensurable’, as well as his more general methodological and philosophical reflections.

17.
In this paper we attempt to investigate the historical and methodological aspects of the developments related to superfluid helium, concentrating on the period between 1941 and 1955. During this period, the various developments constituted a series of steps towards redefining and refining the two-fluid concept devised to explain the unexpected macroscopic behaviour of superfluid helium. The idea that superfluids are essentially ‘quantum structures on a macroscopic scale’ functioned as a heuristic principle which guided the theoretical physicists engaged in the above research programme.

18.
Many philosophers who do not analyze laws of nature as the axioms and theorems of the best deductive systems nevertheless believe that membership in those systems is evidence for being a law. This raises the question, “If the best systems analysis fails, what explains the fact that being a member of the best systems is evidence for being a law?” In this essay I answer this question on behalf of Leibniz. I argue that although Leibniz’s philosophy of laws is inconsistent with the best systems analysis, his philosophy of nature’s perfection enables him to explain why membership in the best systems is evidence for being a law of nature.

19.
This paper provides a detailed account of the period of the complex history of British algebra and geometry between the publication of George Peacock's Treatise on Algebra in 1830 and William Rowan Hamilton's paper on quaternions of 1843. During these years, Duncan Farquharson Gregory and William Walton published several contributions on ‘algebraical geometry’ and ‘geometrical algebra’ in the Cambridge Mathematical Journal. These contributions enabled them not only to generalize Peacock's symbolical algebra on the basis of geometrical considerations, but also to initiate the attempts to question the status of Euclidean space as the arbiter of valid geometrical interpretations. At the same time, Gregory and Walton were bound by the limits of symbolical algebra that they themselves made explicit; their work was not and could not be the ‘abstract algebra’ and ‘abstract geometry’ of figures such as Hamilton and Cayley. The central argument of the paper is that an understanding of the contributions to ‘algebraical geometry’ and ‘geometrical algebra’ of the second generation of ‘scientific’ symbolical algebraists is essential for a satisfactory explanation of the radical transition from symbolical to abstract algebra that took place in British mathematics in the 1830s–1840s.

20.
Recent financial research has provided evidence on the predictability of asset returns. In this paper we consider the results contained in Pesaran and Timmermann (1995), which provided evidence on predictability of excess returns in the US stock market over the sample 1959–1992. We show that the extension of the sample to the nineties weakens considerably the statistical and economic significance of the predictability of stock returns based on earlier data. We propose an extension of their framework, based on the explicit consideration of model uncertainty under rich parameterizations for the predictive models. We propose a novel methodology to deal with model uncertainty based on ‘thick’ modelling, i.e. on considering a multiplicity of predictive models rather than a single predictive model. We show that portfolio allocations based on a thick modelling strategy systematically outperform thin modelling.
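As a rough sketch of the ‘thick modelling’ idea only (not the authors' implementation: the function name, the equal-weight pooling, the in-sample ranking criterion and the 20% retention share are all assumptions made for illustration), the fragment below fits many small linear predictive regressions over subsets of candidate predictors and averages their one-step-ahead forecasts instead of committing to the single best specification:

```python
import itertools

import numpy as np


def thick_forecast(X, y, max_predictors=3, top_share=0.2):
    """Pool one-step-ahead forecasts from many small predictive regressions
    ('thick' modelling) instead of relying on the single best one ('thin')."""
    n_obs, n_vars = X.shape
    forecasts, ssr = [], []
    for k in range(1, max_predictors + 1):
        for cols in itertools.combinations(range(n_vars), k):
            # Regress next-period returns on an intercept and lagged predictors.
            Z = np.column_stack([np.ones(n_obs - 1), X[:-1, list(cols)]])
            beta, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
            resid = y[1:] - Z @ beta
            # Forecast from the most recent observation of the predictors.
            z_next = np.concatenate(([1.0], X[-1, list(cols)]))
            forecasts.append(float(z_next @ beta))
            ssr.append(float(resid @ resid))  # crude in-sample ranking criterion
    order = np.argsort(ssr)                   # best-fitting specifications first
    keep = max(1, int(top_share * len(forecasts)))
    thick = float(np.mean(np.array(forecasts)[order[:keep]]))  # pooled forecast
    thin = forecasts[order[0]]                                  # single best model
    return thick, thin


# Example with simulated data: 120 periods of returns and 4 candidate predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 4))
y = 0.05 * X[:, 0] + rng.normal(scale=0.1, size=120)
print(thick_forecast(X, y))
```

The point the abstract makes carries over directly: the portfolio decision rests on the pooled (‘thick’) forecast rather than on any single best-fitting (‘thin’) specification, which is what makes the strategy robust to model uncertainty.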
