20 similar documents found; search time: 0 ms
1.
2.
《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》1999,30(2):237-259
Recent authors have raised objections to the counterfactual interpretation of the Aharonov–Bergmann–Lebowitz (ABL) rule of time-symmetrised quantum theory (TSQT). I distinguish between two different readings of the ABL rule, counterfactual and non-counterfactual, and confirm that TSQT advocate L. Vaidman is employing the counterfactual reading to which these authors object. Vaidman has responded to the objections by proposing a new kind of time-symmetrised counterfactual, which he has defined in two different ways. It is argued that neither definition succeeds in overcoming the objections, except in a limited special case previously noted by Cohen and Hiley. In addition, a connection is made between TSQT and Price’s concept of ‘advanced action’, which further supports the special case discussed.
3.
4.
5.
6.
7.
J. A. Maclaren 《Cellular and molecular life sciences : CMLS》1961,17(8):346-347
Summary The compound previously described as ‘cystine monosulfoxide’ (thiosulfinate II) behaves like an equimolar mixture of cystine (I) and the corresponding thiosulfonate (III). Oxidation of cystine with performic acid leads, particularly in the presence of HCl, via the intermediates thiosulfonate (III) and sulfinic acid (VI) to cysteic acid (VII).
8.
Eric Schliesser 《Studies in history and philosophy of science》2012,43(1):160-171
This paper aims to contribute to a better understanding of the formation of the so-called Chicago school of economics; it does so by focusing on (i) previously unpublished correspondence between George Stigler and Thomas Kuhn and (ii) Warren Nutter’s The Extent of Enterprise Monopoly in the United States, 1899–1939. Nutter’s book started out as a (1949) doctoral dissertation at The University of Chicago, part of Aaron Director’s Free Market Study. Besides Director, O.H. Brownlee and Milton Friedman were closely involved in supervising it. It was published by The University of Chicago Press in 1951. The book was explicitly understood as belonging to the “Chicago School” (Dow & Abernathy, 1963). But by the time of Reder’s well-known (1982) review paper, Nutter does not figure at all. I argue that the Stigler-Kuhn correspondence helps us better understand why Nutter disappeared from sight. More importantly, by contrasting the work of Nutter with that of Harberger, the episode reveals how Milton Friedman’s methodological statements became the rhetoric for a paradigm that was committed to a very different approach than the one advocated by Nutter or Friedman.
9.
In The Structure of Scientific Revolutions, Kuhn famously advanced the claim that scientists work in a different world after a scientific revolution. Kuhn's view has been at the center of a philosophical literature that has tried to make sense of his bold claim by placing it in the company of other seemingly constructivist proposals. The purpose of this paper is to take some steps towards clarifying what sort of constructivism (if any) is in fact at stake in Kuhn's view. To this end, I distinguish between two main (albeit not exclusive) notions of mind-dependence: a semantic notion and an ontological one. I point out that Kuhn's view should be understood as subscribing to a form of semantic mind-dependence, and conclude that semantic mind-dependence does not land us in any worrisome ontological mind-dependence, pace any constructivist reading of Kuhn.
10.
Lukas M. Verburgt 《Annals of science》2016,73(1):40-67
This paper provides a detailed account of the period of the complex history of British algebra and geometry between the publication of George Peacock's Treatise on Algebra in 1830 and William Rowan Hamilton's paper on quaternions of 1843. During these years, Duncan Farquharson Gregory and William Walton published several contributions on ‘algebraical geometry’ and ‘geometrical algebra’ in the Cambridge Mathematical Journal. These contributions enabled them not only to generalize Peacock's symbolical algebra on the basis of geometrical considerations, but also to initiate the attempts to question the status of Euclidean space as the arbiter of valid geometrical interpretations. At the same time, Gregory and Walton were bound by the limits of symbolical algebra that they themselves made explicit; their work was not and could not be the ‘abstract algebra’ and ‘abstract geometry’ of figures such as Hamilton and Cayley. The central argument of the paper is that an understanding of the contributions to ‘algebraical geometry’ and ‘geometrical algebra’ of the second generation of ‘scientific’ symbolical algebraists is essential for a satisfactory explanation of the radical transition from symbolical to abstract algebra that took place in British mathematics in the 1830s–1840s.
11.
My principal aims are to show that holding, adopting and endorsing (definitions of which I provide) are distinct cognitive attitudes that may be taken towards claims at different moments of scientific activities, and that none of them are reducible to acceptance (as defined by Jonathan Cohen); to explore in detail the differences between holding and accepting, using the controversies about GMOs to provide illustrations; and to draw some implications pertinent to democratic decision-making concerning public policies about science and technology, and to the responsibilities that scientists thereby incur.
12.
H. Tiedemann 《Cellular and molecular life sciences : CMLS》1976,32(8):1078-1081
Summary Based on observations and on analogical deductions from the organization of the cerebellum, a hypothesis of a passive process of thinking is described. According to it, we try to program the cerebrum for the handling of an idea; with regard to its solution, however, we are passively exposed to the brain.
13.
Synaptic target recognition is a complex molecular event. In a differentiating presynaptic terminal, relatively ‘rare’ molecules first detect the cell identity of the synaptic target. Subsequently, many ‘common’ molecules continue the process of synaptogenesis. We present a theoretical framework for understanding synaptic target recognition and discuss the features of its molecular components and their integration, drawing on the rapid progress made in recent studies.
14.
It is well documented that activation of calpain, a calcium-sensitive cysteine protease, marks the pathology of naturally and experimentally occurring neurodegenerative conditions. Calpain-mediated proteolysis of the major membrane-skeletal protein αII-spectrin results in the appearance of two unique and highly stable breakdown products, an early event in neural cell pathology. This review focuses on spectrin degradation by calpain within neurons induced by diverse conditions, emphasizing a current picture of multi-pattern neuronal death and recent success in the development of spectrin-based biomarkers. The issue is presented in the context of the major structural and functional properties of the two proteins.
Received 7 March 2005; received after revision 22 April 2005; accepted 13 May 2005
15.
16.
17.
Warren Alexander Dym 《Annals of science》2013,70(3):295-323
Mining companies after the Gold Rush depended heavily on foreign expertise, and yet historians of mining have glorified ‘German engineering’ in America. The application of German technology in America was fraught with difficulties, and most advances were micro- rather than macro-innovations, such as Philip Deidesheimer's famous square-set timbering on the Comstock Lode. The problem began at German mining schools, such as the Freiberg Mining Academy, where Americans like Louis and Henry Janin, while they acquired advanced training and adopted an engineering ethos, struggled to learn about Mexican and American mining. Having supplemented their course of study to remedy this deficiency, the brothers returned to the US intending to modernize mining on the frontier. Louis attempted the ‘Freiberg Process’ of amalgamation on the Comstock Lode, but locally developed methods proved more feasible, and the experiment failed. He came to apply his training instead toward the micro-level problem of how to reprocess amalgamation waste heaps.
18.
James A. Marcum 《Annals of science》2013,70(2):139-156
During the twentieth century, a controversy raged over the role of electrical forces and chemical substances in synaptic transmission. Although the story of the ‘main’ participants is well documented, the story of ‘lesser’ known participants is seldom told. For example, Alexander Forbes, who was a prominent member of the axonologists, played an active role in the controversy and yet is seldom mentioned in standard accounts of it. During the 1930s, Forbes incorporated chemical substances into his theory of synaptic transmission, advocating a complementarity model for the role of electrical forces and chemical substances. Focusing on Forbes and the axonologists shows that the controversy was more than simply a debate over ‘soup’ vs. ‘sparks’; it also involved the relative roles of electrical forces and chemical substances in synaptic transmission. The implications of this case study for the nature of scientific controversies are also discussed.
19.
Gabriele Gramelsberger 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2010,41(3):233-241
The scientific understanding of atmospheric processes has been rooted in the mechanical and physical view of nature ever since dynamic meteorology gained ground in the late 19th century. Conceiving the atmosphere as a giant ‘air mass circulation engine’ entails applying hydro- and thermodynamical theory to the subject in order to describe the atmosphere’s behaviour on small scales. But when it comes to forecasting, it turns out that this view is far too complex to be computed. The limitation of analytical methods precludes an exact solution, forcing scientists to make use of numerical simulation. However, simulation introduces two prerequisites to meteorology: first, the partitioning of the theoretical view into two parts—the large-scale behaviour of the atmosphere, and the effects of smaller-scale processes on this large-scale behaviour, so-called parametrizations; and second, the dependency on computational power in order to achieve a higher resolution. The history of today’s atmospheric circulation modelling can be reconstructed as the attempt to improve the handling of these basic constraints. It can be further seen as the old schism between theory and application under new circumstances, which triggers a new discussion about the question of how processes may be conceived in atmospheric modelling.
20.
Deep ocean currents are not accessible to direct human perception. Their insertion into global structures of circulation is even more profoundly removed from individual sensorial experience. But oceanographers tend to use wider concepts of experience that include instruments, traditions of observation and theoretical models. Historians and philosophers of science, as well as STS scholars, have also redefined scientific experience as operational and collective transformations of parts of the world around us into fragments of larger bodies of knowledge. This paper pursues this definition to follow the instrumental and epistemological resources available to those “observing” deep-water circulation at the Strait of Gibraltar in two very distinct moments, ca. 1870 and ca. 1985, respectively through the works of scientists like William B. Carpenter and the transnational team involved in the Gibraltar Experiment. Detecting and mapping the Gibraltar undercurrent necessitated taking temperature and salinity data as proxies for masses of water. Making the undercurrent relevant to world ocean currents required the use of models and moving across scales. In both contexts, empires of global reach provided the globalizing motivations and infrastructures.