Similar Articles (20 results)
1.
As a defender of the fundamental importance of Mendel’s experiments for understanding heredity, the English biologist William Bateson (1861–1926) did much to publicize the usefulness of Mendelian science for practical breeders. In the course of his campaigning, he not only secured a reputation among breeders as a scientific expert worth listening to but articulated a vision of the ideal relations between pure and applied science in the modern state. Yet historical writing about Bateson has tended to underplay these utilitarian elements of his program, to the extent of portraying him, notably in still-influential work from the 1960s and 1970s, as a type specimen of the scientist who could not care less about application. This paper offers a corrective view of Bateson himself—including the first detailed account of his role as an expert witness in a courtroom dispute over the identity of a commercial pea variety—and an inquiry into the historiographic fate of his efforts in support of Mendelism’s productivity. For all that a Marxian perspective classically brings applied science to the fore, in Bateson’s case, and for a range of reasons, it did the opposite during the Cold War.

2.
This paper traces the background to R. A. Fisher's multi-factorial theory of inheritance. It is argued that the traditional account is incomplete, and that Karl Pearson's well-known pre-Fisherian objections to the theory were in fact overcome by Pearson himself. It is further argued that Pearson's stated reasons for not accepting his own achievement have to be seen as a rationalization, standing in for deeper-seated metaphysical objections to the Mendelian paradigm of a type not readily discussed in a formal scientific paper. The apparent, post-Fisherian, continued acceptance of Pearson's objections is presented as an interesting problem for the historian and sociologist.

3.
A prevalent narrative locates the discovery of the statistical phenomenon of regression to the mean in the work of Francis Galton. It is claimed that after 1885, Galton came to explain the fact that offspring deviated less from the mean value of the population than their parents did as a population-level statistical phenomenon and not as the result of the processes of inheritance. Arguing against this claim, we show that Galton did not explain regression towards mediocrity statistically, and did not give up on his ideas regarding an inheritance process that caused offspring to revert to the mean. While the common narrative focuses almost exclusively on Galton’s statistics, our arguments emphasize the anthropological and biological questions that Galton addressed. Galton used regression towards mediocrity to support the claim that some biological types were more stable than others and hence were resistant to evolutionary change. This view had implications concerning both natural selection and eugenics. The statistical explanation attributed to Galton appeared later, during the biometrician-mutationist debate in the early 1900s. It was in the context of this debate, and specifically by the biometricians, that the development of the statistical explanation was originally attributed to Galton.
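The population-level statistical reading of regression to the mean that this abstract discusses can be illustrated with a short simulation. This is a minimal sketch, not drawn from Galton's data: the linear model and the coefficient below are assumptions chosen for illustration, not historical values.

```python
import random

random.seed(0)

# Assumed toy model: offspring deviation from the population mean is a
# damped copy of the mid-parent deviation plus independent noise.
# The coefficient r < 1 is what produces regression to the mean.
r = 0.6
parents = [random.gauss(0.0, 1.0) for _ in range(10_000)]
offspring = [r * p + random.gauss(0.0, 0.8) for p in parents]

# Offspring of exceptional parents (deviation > 1 sd) are, on average,
# still above the mean, but less exceptional than their parents were.
extreme = [(p, o) for p, o in zip(parents, offspring) if p > 1.0]
mean_parent = sum(p for p, _ in extreme) / len(extreme)
mean_child = sum(o for _, o in extreme) / len(extreme)
print(mean_parent > mean_child > 0.0)
```

No individual-level "pull" toward the mean is coded anywhere; the effect falls out of imperfect correlation plus selection of an extreme group, which is the purely statistical explanation the abstract says was later attributed to Galton.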

4.
It is argued that Hugo de Vries's conversion to Mendelism did not agree with his previous theoretical framework. De Vries regarded the number of offspring expressing a certain character as a hereditary quality, intrinsic to the state of the pangene involved. His was a short-lived conversion, since after the ‘rediscovery’ he failed to unify his older views with Mendelism. De Vries was never very much of a Mendelian. The usual stories of the Dutch ‘rediscovery’ need, therefore, a considerable reshaping.

5.
Epigenesis has become a far more exciting issue in Kant studies recently, especially with the publication of Jennifer Mensch's Kant’s Organicism. In my commentary, I propose to clarify my own position on epigenesis relative to that of Mensch and others by once again considering the discourse of epigenesis in the wider eighteenth century. Historically, I maintain that Kant was never fully an epigenesist because he feared its materialist implications. This makes it highly unlikely that he drew heavily, as other interpreters like Dupont and Huneman have suggested, on Caspar Friedrich Wolff for his ultimate theory of “generic preformation.” In order to situate more precisely what Kant made of epigenesis, I distinguish his metaphysical use, as elaborated by Mensch, from his view of it as a theory for life science. In that light, I raise questions about the scope and authority of philosophy vis-à-vis natural science.

6.
Work throughout the history and philosophy of biology frequently employs ‘chance’, ‘unpredictability’, ‘probability’, and many similar terms. One common way of understanding how these concepts were introduced in evolution focuses on two central issues: the first use of statistical methods in evolution (Galton), and the first use of the concept of “objective chance” in evolution (Wright). I argue that while this approach has merit, it fails to fully capture interesting philosophical reflections on the role of chance expounded by two of Galton's students, Karl Pearson and W.F.R. Weldon. Considering a question more familiar from contemporary philosophy of biology—the relationship between our statistical theories of evolution and the processes in the world those theories describe—is, I claim, a more fruitful way to approach both these two historical actors and the broader development of chance in evolution.

7.
The concept of phenomenotechnique has been regarded as Bachelard's most original contribution to the philosophy of science. Innovative as this neologism may seem, it benefited from a generation of debates on the nature and status of scientific facts, among conventionalist thinkers and their opponents. Granting that Bachelard stood among the opponents to conventionalism, this article nonetheless reveals deep similarities between his work and that of two conventionalist thinkers who insisted on what we call today the theory-ladenness of scientific experiment: Pierre Duhem and Édouard Le Roy. This article, therefore, compares Bachelard's notion of phenomenotechnique with Duhem's developments on the double character of scientific instruments, and with Le Roy's claim that scientific facts are fabricated to meet the requirements of theory. It shows how Bachelard retained Duhem and Le Roy's views on the interplay between theory and experiment but rejected their sceptical conclusions on the limitations of experimental control. It claims that this critical inheritance of conventionalism was made possible by a reflection on technology, which led Bachelard to re-evaluate the artificiality of scientific facts: instead of regarding this artificiality as a limitation of science, as Le Roy did, he presented it as a condition for objective knowledge.

8.
Otsubo S, Annals of Science, 2005, 62(2): 205–231
This paper explores the eugenic thought of Yamanouchi Shigeo (1876-1973), who was trained in plant cytology under the tutelage of botanist and eugenicist John Coulter (1851-1928) in the USA, and later became one of the early and important popularizers of eugenic ideas in Japan. His career demonstrates a direct link between Japanese and US eugenics. Despite his academic training and research at various internationally renowned institutions, numerous publications, and longevity, his life has received little scholarly attention. By the early twentieth century, most biologists in Japan, as in the USA, began accepting Mendelian evolutionary theory and rejecting the Lamarckian notion of inheritance of acquired characteristics. However, Yamanouchi Shigeo's eugenic view represents a paradox: he was a Mendelian cytologist sympathetic to Lamarckism. Was his 'nurture'-oriented eugenic view unscientific? Is that why he was largely ignored in the history of botany in Japan? This study attempts to answer these questions and to analyse the origins and distinct features of Yamanouchi's eugenic ideas by situating Yamanouchi's eugenic thought historically and culturally. After examining his scientific papers, popular writings, and documents of various organizations to which he belonged, I argue that Yamanouchi's 'softer' (or less biologically deterministic) perspective may have reflected the Japanese desire to catch up with the dominant 'race' by using eugenics without accepting permanent inferior status.

9.
In 1918, Henry de Dorlodot—priest, theologian, and professor of geology at the University of Louvain (Belgium)—published Le Darwinisme au point de vue de l'Orthodoxie Catholique (translated as Darwinism and Catholic Thought) in which he defended a reconciliation between evolutionary theory and Catholicism with his own particular kind of theistic evolutionism. He subsequently announced a second volume in which he would extend his conclusions to the origin of Man. Traditionalist circles in Rome reacted vehemently. Operating through the Pontifical Biblical Commission, they tried to force Dorlodot to withdraw his book and to publicly disown his ideas by threatening him with an official condemnation, a strategy that had been used against Catholic evolutionists since the late nineteenth century. The archival material on the ‘Dorlodot affair’ shows how this policy ‘worked’ in the early stages of the twentieth century but also how it would eventually reach the end of its logic. The growing popularity of theistic evolutionism among Catholic intellectuals, combined with Dorlodot's refusal to pull back amidst threats, made certain that the traditionalists did not get their way completely, and the affair ended in an uncomfortable status quo. Dorlodot did not receive the official condemnation that had been threatened, nor did he withdraw his theories, although he stopped short of publishing on the subject. With the decline of the traditionalists’ power and authority, the policy of denunciation towards evolutionists made way for a growing tolerance. The ‘Dorlodot affair’—which occurred in a pivotal era in the history of the Church—can be seen as exemplary with regard to the changing attitude of the Roman authorities towards evolutionism in the first half of the twentieth century.

10.
Nicolas-Auguste Tissot (1824–1897) published a series of papers on cartography in which he introduced a tool which became known later on, among geographers, under the name of the Tissot indicatrix. This tool was broadly used during the twentieth century in the theory and in the practical aspects of the drawing of geographical maps. The Tissot indicatrix is a graphical representation of a field of ellipses on a map that describes its distortion. Tissot studied extensively, from a mathematical viewpoint, the distortion of mappings from the sphere onto the Euclidean plane that are used in drawing geographical maps, and more generally he developed a theory for the distortion of mappings between general surfaces. His ideas are at the heart of the work on quasiconformal mappings that was developed several decades after him by Grötzsch, Lavrentieff, Ahlfors and Teichmüller. Grötzsch mentions the work of Tissot, and he uses the terminology related to his name (in particular, Grötzsch uses the Tissot indicatrix). Teichmüller mentions the name of Tissot in a historical section in one of his fundamental papers where he claims that quasiconformal mappings were used by geographers, but without giving any hint about the nature of Tissot’s work. The name of Tissot is missing from all the historical surveys on quasiconformal mappings. In the present paper, we report on this work of Tissot. We shall mention some related works on cartography, on the differential geometry of surfaces, and on the theory of quasiconformal mappings. This will place Tissot’s work in its proper context.
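The distortion ellipses the abstract describes have a simple computational reading: the semi-axes of the Tissot indicatrix at a point are the singular values of the projection's Jacobian taken with respect to unit ground distance on the sphere. The sketch below is my own illustration of that idea, not code from Tissot's papers or from the article; the function name and the finite-difference approach are assumptions for demonstration, using the unit sphere.

```python
import math

def tissot_axes(proj, lam, phi, eps=1e-6):
    """Semi-axes of the Tissot indicatrix at (lam, phi) for a projection
    proj(lam, phi) -> (x, y) of the unit sphere, via the singular values
    of the Jacobian scaled to unit ground distance (finite differences)."""
    x0, y0 = proj(lam, phi)
    x1, y1 = proj(lam + eps, phi)   # step east
    x2, y2 = proj(lam, phi + eps)   # step north
    # Derivatives per unit ground distance: east steps cover cos(phi)*eps.
    a11 = (x1 - x0) / (eps * math.cos(phi)); a12 = (x2 - x0) / eps
    a21 = (y1 - y0) / (eps * math.cos(phi)); a22 = (y2 - y0) / eps
    # Closed-form singular values of the 2x2 matrix [[a11,a12],[a21,a22]].
    e = (a11**2 + a12**2 + a21**2 + a22**2) / 2.0
    f = a11 * a22 - a12 * a21
    root = math.sqrt(max(e * e - f * f, 0.0))
    return math.sqrt(e + root), math.sqrt(max(e - root, 0.0))

# Equirectangular (plate carree) projection: x = lam, y = phi.
equirect = lambda lam, phi: (lam, phi)
a, b = tissot_axes(equirect, 0.0, math.pi / 3)  # latitude 60 degrees
```

At latitude 60° the equirectangular map stretches parallels by a factor of sec 60° = 2 while leaving meridians untouched, so the indicatrix there is an ellipse with semi-axes roughly 2 and 1; a conformal projection such as Mercator would instead return two equal axes (a circle) at every point.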

11.
This article traces the origins of Kenneth Wilson's conception of effective field theories (EFTs) in the 1960s. I argue that what really made the difference in Wilson's path to his first prototype of EFT are his long-standing pragmatic aspirations and methodological commitments. Wilson's primary interest was to work on mathematically interesting physical problems and he thought that progress could be made by treating them as if they could be analyzed in principle by a sufficiently powerful computer. The first point explains why he had no qualms about twisting the structure of field theories; the second why he divided the state-space of a toy model field theory into continuous slices by following a standard divide-and-conquer algorithmic strategy instead of working directly with a fully discretized and finite theory. I also show how Wilson's prototype bears the mark of these aspirations and commitments and clear up a few striking ironies along the way.

12.
The empiricism of eighteenth-century experimental science meant that the development of scientific instruments influenced the formulation of new concepts; it was a two-way process, for new theory also affected instrument design. This relationship between concept and instrumentation will be examined by tracing the development of electrical instruments and theory during this period. The different functions fulfilled by these devices will also be discussed. Empiricism was especially important in such a new field of research as electricity, for it gave rise to phenomena that could not have been predicted by theory alone. However, the interpretation of these phenomena, and what the natural philosopher thought he observed, were often unconsciously determined by current ideas and attitudes; the interaction between instrumentally induced phenomena and observation was more complex than was realized at the time. The shortcomings of this empirical approach will be discussed. In the case of electricity this became increasingly apparent during the latter part of the century. The many discoveries had to be placed in a unifying framework before new advances could be made. Instruments, however, continued to play an important role in scientific progress, for they made visible what was hidden in nature.

13.
Two works on hydrostatics, by Simon Stevin in 1586 and by Blaise Pascal in 1654, are analysed and compared. The contrast between the two serves to highlight aspects of the qualitative novelty involved in changes within science in the first half of the seventeenth century. Stevin attempted to derive his theory from unproblematic postulates drawn from common sense but failed to achieve his goal insofar as he needed to incorporate assumptions involved in his engineering practice but not sanctioned by his postulates. Pascal's theory went beyond common sense by introducing a novel concept, pressure. Theoretical reflection on novel experiments was involved in the construction of the new concept and experiment also provided important evidence for the theory that deployed it. The new experimental reasoning was qualitatively different from the Euclidean style of reasoning adopted by Stevin. The fact that a conceptualization of a technical sense of pressure adequate for hydrostatics was far from obvious is evident from the work of those, such as Galileo and Descartes, who did not make significant moves in that direction.

14.
15.
In 1668 Robert Hooke recognised the utility of a barometer which could foretell storms at sea, but neither he nor his contemporaries in Britain or elsewhere in Europe succeeded in constructing such an instrument which would work reliably on a moving ship. Theorists and instrument makers, including Hooke, Amontons, De Luc, Passement, Magellan and Blondeau proposed novel forms of tube, but at the time it was not possible to work glass to the suggested shape. The competition between France and England was won by Edward Nairne, who devised the constricted-tube barometer for Captain Cook's second voyage of 1772-75. Nairne barometers were soon taken on other British exploring voyages, but French ships were slow to follow the pattern, possibly in consequence of naval disruption following the Revolution. The earliest Nairne examples were adapted from the domestic barometer, with the tube mounted on a flat back, but within the lifetime of Nairne & Blunt marine barometers adopted the form common for most of the nineteenth century, with the tube enclosed within a square or round-section wooden frame.

16.
Euler invented integral transforms in the context of second order differential equations. He used them in a fragment published in 1763 and in a chapter of Institutiones Calculi Integralis (1769). In introducing them he made use of earlier work in which a concept akin to the integral transform is implicit. It would, however, be reading too much into that earlier work to see it as contributing to the theory of the integral transform. Other work sometimes cited in this context in fact has different concerns.

17.
In the first decades of the nineteenth century the French mechanicians—Cauchy and Poisson amongst them—developed a theory of linear elasticity according to which matter is composed of material points. They believed that these points interact by means of opposite central forces, whose magnitude depends on the length of the segment joining the particles. This theory suggested that homogeneous isotropic materials were characterized by a unique elastic constant. Later experiments, however, showed that two elastic constants were necessary. These results undermined the corpuscular model of matter as well as the interpretation of elasticity in terms of central intermolecular actions. The continuous theory of Green, based on the postulate that a potential function exists, gained fresh consensus in light of these experiments. These opposite views continued throughout the nineteenth century until Woldemar Voigt proposed a molecular model confirmed by experiments. This article presents the theories of each of these scientists and describes the contrasting views of nineteenth-century mechanicians.

18.
In the second half of the eighteenth century a lively debate was going on in Germany about the nature of light. One important contribution to this discussion, namely a paper by Nicolas Béguelin, is studied in this article. In his essay, Béguelin compared the Newtonian emission theory of light and the wave theory of Leonhard Euler. Whereas others opted for one of the two theories by invoking arguments or authorities, Béguelin made a systematic search for experiments which he hoped would settle the dispute. Two of these experiments were most original. The first, which Béguelin himself performed, concerned light rays grazing a glass surface. For several reasons it did not have the impact it deserved. The second one was a thought experiment which was meant to illustrate a major tenet of the wave theory, that is, the analogy between light and sound. An analysis is given of these two experiments, and it is shown that neither of them brought the debate to an end.

19.
The nineteenth-century American scientist, philosopher and teacher Joseph LeConte (1823–1901) is well-known for his writings on geology and the reconciliation of evolutionary theory and religion, but he has not been properly recognized for his contributions to the physiology and psychology of vision. This study explores and assesses his work in the latter field, showing the nature of his original investigations into human vision and the influence of his book Sight: an exposition of the principles of monocular and binocular vision, which served as the major textbook on the subject in the United States from its publication in 1881 until after the turn of the century. Grounded in neo-Lamarckian evolutionary theory, LeConte's publications on vision had a strong impact upon subsequent studies of the phenomenon of human sight.

20.
Historians have explored the continuities between science and the arts in the Industrial Revolution, with much recent historiography emphasizing the hybrid nature of the activities of men of science around 1800. Chemistry in particular displayed this sort of hybridity between the philosophical and practical because the materials under investigation were important across the research spectrum. Inflammable gases were an example of such hybrid objects: pneumatic chemists through the eighteenth century investigated them, and in the process created knowledge, processes and instruments essential for the creation of a new gaslight industry from 1800. Once this industry began to expand and mature, the interests and experiments of the gas industry stimulated new research work which in turn had relevance for theoretical debates.

This paper explores how the emergence of the gas industry from 1800 provided an impetus for new work in theoretical chemistry. Boulton & Watt, important pioneers of the gas industry, explored the compositions of inflammable gases for practical purposes: the composition of these gases had an important effect on the luminosity of gaslights, and hence the economics of the new technology compared to older forms of lighting. As they explored these questions in their engineering work, they stimulated their friend William Henry to explore the nature of these gases further, and he carried out a series of experiments to determine their composition more exactly than Boulton & Watt had done. Henry published a series of papers between 1805 and 1820 in which he made arguments about the compositions of inflammable airs, and further related these to contemporary debates about the laws of multiple and definite proportions, as well as John Dalton's atomism. Henry's research was also a hybrid of the theoretical and practical in that he tried to develop results useful for the fledgling gas industry. Specifically, he suggested the best kind of coal to use, and showed how gas quality varied with distillation time and temperature.

