Similar Literature
Found 20 similar documents (search time: 328 ms)
1.
While philosophers have subjected Galileo's classic thought experiments to critical analysis, they have tended to largely ignore the historical and intellectual context in which they were deployed, and the specific role they played in Galileo's overall vision of science. In this paper I investigate Galileo's use of thought experiments by focusing on the epistemic and rhetorical strategies he employed in attempting to answer the question of how one can know what would happen in an imaginary scenario. I argue that we can find three different answers to this question in Galileo's later dialogues, answers which reflect the changing meanings of ‘experience' and ‘knowledge' (scientia) in the early modern period. Galileo's thought experiments sometimes drew on the power of memory and an explicit appeal to ‘common experience'; at other times they took the form of demonstrative arguments intended to have the status of necessary truths; and on still other occasions they were extrapolations, or probable guesses, drawn from a carefully planned series of controlled experiments. Once we recognise this, it becomes evident that no single account of the epistemological relationship between thought experiment, experience and experiment can adequately capture the epistemic variety we find in Galileo's use of imaginary scenarios. To this extent, we cannot neatly classify Galileo's thought experiments as either ‘medieval' or ‘early modern'; rather, we should see them as indicative of the complex epistemological transformations of the early seventeenth century.

2.
Two works on hydrostatics, by Simon Stevin in 1586 and by Blaise Pascal in 1654, are analysed and compared. The contrast between the two serves to highlight aspects of the qualitative novelty involved in changes within science in the first half of the seventeenth century. Stevin attempted to derive his theory from unproblematic postulates drawn from common sense but failed to achieve his goal insofar as he needed to incorporate assumptions involved in his engineering practice but not sanctioned by his postulates. Pascal's theory went beyond common sense by introducing a novel concept, pressure. Theoretical reflection on novel experiments was involved in the construction of the new concept, and experiment also provided important evidence for the theory that deployed it. The new experimental reasoning was qualitatively different from the Euclidean style of reasoning adopted by Stevin. The fact that a conceptualization of a technical sense of pressure adequate for hydrostatics was far from obvious is evident from the work of those, such as Galileo and Descartes, who did not make significant moves in that direction.

3.
John D. Norton is responsible for a number of influential views in contemporary philosophy of science. This paper will discuss two of them. The material theory of induction claims that inductive arguments are ultimately justified by their material features, not their formal features. Thus, while a deductive argument can be valid irrespective of the content of the propositions that make up the argument, an inductive argument about, say, apples, will be justified (or not) depending on facts about apples. The argument view of thought experiments claims that thought experiments are arguments, and that they function epistemically however arguments do. These two views have generated a great deal of discussion, although there hasn't been much written about their combination. I argue that despite some interesting harmonies, there is a serious tension between them. I consider several options for easing this tension, before suggesting a set of changes to the argument view that I take to be consistent with Norton's fundamental philosophical commitments, and which retain what seems intuitively correct about the argument view. These changes require that we move away from a unitary epistemology of thought experiments and towards a more pluralist position.

4.
The paper examines the relevance of the nomological view of nature to three discussions of the tide in the thirteenth century. A nomological conception of nature assumes that the basic explanatory units of natural phenomena are universally binding rules stated in quantitative terms. (1) Robert Grosseteste introduced an account of the tide based on the mechanism of rarefaction and condensation, stimulated by the Moon's rays and their angle of incidence. He considered the Moon's action over the sea an example of the general efficient causality exerted through the universal activity of light or species. (2) Albert the Great posited a plurality of causes which cannot be reduced to a single cause. The connaturality of the Moon and the water is the only principle of explanation which he considered universal. Connaturality, however, renders neither formulation nor quantification possible. While Albert stressed the variety of causes of the tide, (3) Roger Bacon emphasized regularity and reduced the various causes producing tides to forces. He replaced the terminology of ‘natures' with one of ‘forces'. Force, which in principle can be accurately described and measured, thus becomes a commensurable aspect of a diverse cosmos. In reasoning about why the waters return to their place after the tide, Grosseteste argued that waters return in order to prevent a vacuum, Albert claimed that waters ‘follow their own nature', while Bacon held that the ‘proper force' of the water prevails over the distant force of the first heaven. I exhibit, for the thirteenth century, moments of the move away from Aristotelian concerns, whose basic elements were essences and natures which reflect specific phenomena and did not allow for an image of nature as a unified system. In the new perspective of the thirteenth century the key was a causal link between the position of the Moon and the tide cycle, a link which is universal and still qualitative, yet expressed as susceptible to quantification.

5.
Summary: Bees are able to indicate direction to their hive comrades by means of a waggling dance of two kinds: in the horizontal plane, with regard to the sun, they point directly towards the goal by a waggling walk at the same angle to the sun as they took in their flight. Inside the dark hive, on the vertical honeycomb, they transpose the angle between goal and sun to the field of gravity, whereby the sun's direction is shown by a waggling walk upwards, and an angle to the right or left of the sun's position is given by a dance direction at the corresponding angle to the right or left of the zenith. If a piece of blue sky is made visible in an observation hive to bees which are dancing in orientation by gravity, they recognise the position of the sun by this polarisation sample, and the effort to orientate themselves directly by the sun (as in the horizontal plane) comes into conflict with the orientation by gravity. The result is a dance direction which corresponds remarkably well to the halving of the angle between what the dance direction should have been by gravity and what it should have been by light orientation (Figure 1). This is also true when the bee is orientating itself by polarised sky light over its back while the sun is on the other side of the honeycomb beneath its front (Figure 2), a situation which does not occur during flight but which is important for the dance in the swarm. The bees receiving the information compensate for the deviation of the angle determined by light, and fly to the right goal. When the sun itself, as well as the piece of blue sky, was made visible to the dancers, its influence dominated and they orientated themselves by its light (Figure 3).
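The angle-halving compromise reported in this abstract can be illustrated with a short numerical sketch. This is not code from the source; the function name and the shorter-arc convention are my own, and the sketch simply computes the direction midway between the dance angle prescribed by gravity and the one prescribed by light.

```python
# Illustrative sketch of the reported compromise: the observed dance
# direction roughly bisects the angle between the direction gravity
# orientation would dictate and the direction light orientation would
# dictate. Angles are in degrees; all names are hypothetical.

def compromise_angle(gravity_deg: float, light_deg: float) -> float:
    """Midpoint of two dance directions, taken along the shorter arc."""
    diff = (light_deg - gravity_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0  # step along the shorter arc between the two cues
    return (gravity_deg + diff / 2.0) % 360.0

# A dance that gravity alone would place at 40 degrees and light alone
# at 100 degrees comes out near the 70-degree bisector.
print(compromise_angle(40.0, 100.0))  # 70.0
```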

6.
This study considers the contribution of Francesco Patrizi da Cherso (1529–1597) to the development of the concepts of void space and an infinite universe. Patrizi plays a greater role in the development of these concepts than any other single figure in the sixteenth century, and yet his work has been almost totally overlooked. I have outlined his views on space in terms of two major aspects of his philosophical attitude: on the one hand, he was a devoted Platonist and sought always to establish Platonism, albeit his own version of it, as the only correct philosophy; on the other hand, he was more determinedly anti-Aristotelian than any other philosopher of his time. Patrizi's concept of space has its beginnings in Platonic notions, but is extended and refined in the light of a vigorous critique of Aristotle's position. Finally, I consider the influence of Patrizi's ideas in the seventeenth century, when various thinkers were seeking to overthrow the Aristotelian concept of place and the equivalence of dimensionality with corporeality. Pierre Gassendi (1592–1652), for example, needed a coherent concept of void space in which his atoms could move, while Henry More (1614–1687) sought to demonstrate the reality of incorporeal entities by reference to an incorporeal space. Both men could find the arguments they needed in Patrizi's comprehensive treatment of the subject.

7.
Pierre Duhem's (1861–1916) lifelong opposition to 19th-century atomic theories of matter has been traditionally attributed to his conventionalist and/or positivist philosophy of science. Relatively recently, this traditional view has been challenged by the claim that Duhem's opposition to atomism was due to the precarious state of atomic theories at the beginning of the 20th century. In this paper I present some of the difficulties with both the traditional and the new interpretation of Duhem's opposition to atomism and provide a new framework in which to understand his rejection of atomic hypotheses. I argue that although not positivist, instrumentalist, or conventionalist, Duhem's philosophy of physics was not compatible with belief in unobservable atoms and molecules. The key to understanding Duhem's resistance to atomism during the final phase of his career is the historicist arguments he presented in support of his ideal of physics.

8.
9.
I claim that one way thought experiments contribute to scientific progress is by increasing scientific understanding. Understanding does not have a currently accepted characterization in the philosophical literature, but I argue that we already have ways to test for it. For instance, current pedagogical practice often requires that students demonstrate being in either or both of the following two states: 1) Having grasped the meaning of some relevant theory, concept, law or model, 2) Being able to apply that theory, concept, law or model fruitfully to new instances. Three thought experiments are presented which have been important historically in helping us pass these tests, and two others that cause us to fail. Then I use this operationalization of understanding to clarify the relationships between scientific thought experiments, the understanding they produce, and the progress they enable. I conclude that while no specific instance of understanding (thus conceived) is necessary for scientific progress, understanding in general is.

10.
There are two roles that association played in 18th–19th century associationism. The first dominates modern understanding of the history of the concept: association is a causal link posited to explain why ideas come in the sequence they do. The second has been ignored: association is merely regularity in the trains of thought, and the target of explanation. The view of association as regularity arose in several forms throughout the tradition, but Thomas Brown (1778–1820) makes the distinction explicit. He argues that there is no associative link, and association is mere sequence. I trace this view of association through the tradition, and consider its implications: Brown's views, in particular, motivate a rethinking of the associationist tradition in psychology. Associationism was a project united by a shared explanandum phenomenon, rather than a theory united by a shared theoretical posit.

11.
John Norton's Material Theory of Induction (Norton, 2003, 2005, 2008, forthcoming) has a two-fold, negative and positive, goal. The negative goal is to establish that formal logics of induction fail if they are understood as universally applicable schemas of induction. The positive goal is to establish that it is material facts that enable and justify inductive inferences. I argue in this paper that Norton is more successful with his negative than with his positive ambition. While I do not deny that facts constitute an important type of enabler and justifier of inductions, they are by no means the only type. This paper suggests that there are no fewer than six other types of background information scientists need and use to fuel and warrant inductions. The discussion of additional enablers and justifiers of inductions will further show that there are practically important and intellectually challenging methodological issues Norton's theory prevents us from seeing because it leaves out this or that type of enabler and justifier.

12.
Autophagy is a constitutive lysosomal catabolic pathway that degrades damaged organelles and protein aggregates. Stem cells are characterized by self-renewal, pluripotency, and quiescence; their long life span, limited capacity to dilute cellular waste and spent organelles due to quiescence, along with their requirement for remodeling in order to differentiate, all suggest that they require autophagy more than other cell types. Here, we review the current literature on the role of autophagy in embryonic and adult stem cells, including hematopoietic, mesenchymal, and neuronal stem cells, highlighting the diverse and contrasting roles autophagy plays in their biology. Furthermore, we review the few studies on stem cells, lysosomal activity, and autophagy. Novel techniques to detect autophagy in primary cells are required to study autophagy in different stem cell types. These will help to elucidate the importance of autophagy in stem cells during transplantation, a promising therapeutic approach for many diseases.

13.
Psychologists in the early years of the discipline were much concerned with the stimulus-error. Roughly, this is the problem encountered in introspective experiments when subjects are liable to frame their perceptual reports in terms of what they know of the stimulus, instead of just drawing on their perceptual experiences as they are supposedly felt. “Introspectionist” psychologist E. B. Titchener and his student E. G. Boring both argued in the early 20th century that the stimulus-error is a serious methodological pitfall. While many of the theoretical suppositions motivating Titchener and Boring have been unfashionable since the rise of behaviourism, the stimulus-error brings our attention to one matter of perennial importance to psychophysics and the psychology of perception. This is the fact that subjects are liable to give different kinds of perceptual reports in response to the same stimulus. I discuss attempts to control for variable reports in recent experimental work on colour and lightness constancy, and the disputes that have arisen over which kinds of reports are legitimate. Some contemporary psychologists do warn us against a stimulus-error, even though they do not use this terminology. I argue that concern over the stimulus-error is diagnostic of psychologists' deep theoretical commitments, such as their conception of sensation, or their demarcation of perception from cognition. I conclude by discussing the relevance of this debate to current philosophy of perception.

14.
The aim of this article is to provide a historical response to Michel Janssen’s (2009) claim that the special theory of relativity establishes that relativistic phenomena are purely kinematical in nature, and that the relativistic study of such phenomena is completely independent of dynamical considerations regarding the systems displaying such behavior. This response will be formulated through a historical discussion of one of Janssen's cases, the experiments carried out by Walter Kaufmann on the velocity-dependence of the electron's mass. Through a discussion of the different responses formulated by early adherents of the principle of relativity (Albert Einstein, Max Planck, Hermann Minkowski and Max von Laue) to these experiments, it will be argued that the historical development of the special theory of relativity argues against Janssen's historical presentation of the case, and that this raises questions about his general philosophical claim. It will be shown, more specifically, that Planck and Einstein developed a relativistic response to the Kaufmann experiments on the basis of their study of the dynamics of radiation phenomena, and that this response differed significantly from the response formulated by Minkowski and Laue. In this way, it will be argued that there were, at the time, two different approaches to the theory of relativity, which differed with respect to its relation to theory, experiment, and history: Einstein's and Planck's heuristic approach, and Minkowski's and Laue's normative approach. This indicates that it is difficult to say, historically speaking, that the special theory of relativity establishes the kinematical nature of particular phenomena. Instead, it will be argued that the theory of relativity should not be seen as a theory but rather as outlining an approach, and that the nature of particular scientific phenomena is something that is open to scientific debate and dispute.

15.
Olfactory navigation in birds
Summary: Many bird species rely on an osmotactic mechanism to find food sources, even at a considerable distance. Pigeons also rely on local odours for homeward orientation, and they integrate those perceived during passive transportation with those at the release site. It is possible to design experiments in which birds are given false olfactory information, and predictions about the effects of this can be made and tested. Pigeons build up their olfactory map by associating wind-borne odours with the directions from which they come; this was shown by experiments which aimed at preventing, limiting or altering this association. Some objections have been made to this conclusion: namely, that even anosmic pigeons are sometimes homeward oriented, that they may be demotivated in flying or disturbed in their general behaviour, and that olfactory cues may be only one component of the pigeon's navigational repertoire. The most recent experiments, however, confirm that pigeons derive directional information from atmospheric odours. The lack of any knowledge about the chemical nature and distribution of the odorants which allow pigeons to navigate hinders progress in this area of research.

16.
In this paper, I introduce a new historical case study into the scientific realism debate. During the late-eighteenth century, the Scottish natural philosopher James Hutton made two important successful novel predictions. The first concerned granitic veins intruding from granite masses into strata. The second concerned what geologists now term “angular unconformities”: older sections of strata overlain by younger sections, the two resting at different angles, the former typically more inclined than the latter. These predictions, I argue, are potentially problematic for selective scientific realism in that constituents of Hutton's theory that would not be considered even approximately true today played various roles in generating them. The aim here is not to provide a full philosophical analysis but to introduce the case into the debate by detailing the history and showing why, at least prima facie, it presents a problem for selective realism. First, I explicate Hutton's theory. I then give an account of Hutton's predictions and their confirmations. Next, I explain why these predictions are relevant to the realism debate. Finally, I consider which constituents of Hutton's theory are, according to current beliefs, true (or approximately true), which are not (even approximately) true, and which were responsible for these successes.

17.
The emergence of dimensional analysis in the early nineteenth century involved a redefinition of the pre-existing concepts of homogeneity and dimensions, which entailed a shift from a qualitative to a quantitative conception of these notions. Prior to the nineteenth century, these concepts had been used as criteria to assess the soundness of operations and relations between geometrical quantities. Notably, the terms in such relations were required to be homogeneous, which meant that they needed to have the same geometrical dimensions. The latter reflected the nature of the quantities in question, such as volume vs area. As natural philosophy came to encompass non-geometrical quantities, the need arose to generalize the concept of homogeneity. In 1822, Jean Baptiste Fourier consequently redefined it to be the condition an equation must satisfy in order to remain valid under a change of units, and the ‘dimension' correspondingly became the power of a conversion factor. When these innovations eventually found an echo in France and Great Britain, in the second half of the century, tensions arose between the former, qualitative understanding of dimensions as reflecting the nature of physical quantities, and the new, quantitative conception based on unit conversion and measurement. The emergence of dimensional analysis thus provides a case study of how existing rules and concepts can find themselves redefined in the context of wider conceptual changes; in the present case this redefinition involved a generalization, but also a shift in meaning which led to conceptual tensions.
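Fourier's redefinition — homogeneity as invariance of an equation under a change of units, with the ‘dimension' of a quantity given by the powers of the conversion factors — lends itself to a small sketch. This is an illustrative reconstruction, not notation from the source; the tuple encoding and all names below are assumptions.

```python
# A quantity's dimension is recorded as integer exponents of the base
# conversion factors (length L, mass M, time T). Under a change of units
# every term rescales by the product of those factors raised to these
# powers, so an equation survives the change only if all its added terms
# share the same exponents.
from typing import Tuple

Dim = Tuple[int, int, int]  # exponents of (L, M, T)

def mul(a: Dim, b: Dim) -> Dim:
    """Dimension of a product of two quantities: exponents add."""
    return tuple(x + y for x, y in zip(a, b))

def homogeneous(*terms: Dim) -> bool:
    """True iff every term rescales by the same factor under a unit change."""
    return all(t == terms[0] for t in terms)

LENGTH, TIME = (1, 0, 0), (0, 0, 1)
VELOCITY = mul(LENGTH, (0, 0, -1))  # L T^-1
ACCEL = mul(VELOCITY, (0, 0, -1))   # L T^-2

# s = v*t + (1/2)*a*t^2: both right-hand terms reduce to a length.
print(homogeneous(mul(VELOCITY, TIME), mul(ACCEL, mul(TIME, TIME))))  # True
```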

18.
We all know that, nowadays, physics and philosophy are housed in separate departments on university campuses. They are distinct disciplines with their own journals and conferences, and in general they are practiced by different people, using different tools and methods. We also know that this was not always the case: up until the early 17th century (at least), physics was a part of philosophy. So what happened? And what philosophical lessons should we take away? We argue that the split took place long after Newton's Principia (rather than before, as many standard accounts would have it), and offer a new account of the philosophical reasons that drove the separation. We argue that one particular problem, dating back to Descartes and persisting long into the 18th century, played a pivotal role. The failure to solve it, despite repeated efforts, precipitates a profound change in the relationship between physics and philosophy. The culprit is the problem of collisions. Innocuous though it may seem, this problem becomes the bellwether of deeper issues concerning the nature and properties of bodies in general. The failure to successfully address the problem led to a reconceptualization of the goals and subject-matter of physics, a change in the relationship between physics and mechanics, and a shift in who had authority over the most fundamental issues in physics.

19.
We consider computational modeling in two fields: chronobiology and cognitive science. In circadian rhythm models, variables generally correspond to properties of parts and operations of the responsible mechanism. A computational model of this complex mechanism is grounded in empirical discoveries and contributes a more refined understanding of the dynamics of its behavior. In cognitive science, on the other hand, computational modelers typically advance de novo proposals for mechanisms to account for behavior. They offer indirect evidence that a proposed mechanism is adequate to produce particular behavioral data, but typically there is no direct empirical evidence for the hypothesized parts and operations. Models in these two fields differ in the extent of their empirical grounding, but they share the goal of achieving dynamic mechanistic explanation. That is, they augment a proposed mechanistic explanation with a computational model that enables exploration of the mechanism’s dynamics. Using exemplars from circadian rhythm research, we extract six specific contributions provided by computational models. We then examine cognitive science models to determine how well they make the same types of contributions. We suggest that the modeling approach used in circadian research may prove useful in cognitive science as researchers develop procedures for experimentally decomposing cognitive mechanisms into parts and operations and begin to understand their nonlinear interactions.
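To make the contrast concrete, here is a minimal sketch of the kind of circadian model this abstract describes, in which each variable maps onto a mechanism part (mRNA, protein, repressor). The Goodwin-type equations and every parameter value below are textbook toy choices, not a model taken from the paper.

```python
# Goodwin-type negative-feedback loop, integrated with explicit Euler
# steps: z represses transcription of x, which is translated into y,
# which matures into the repressor z. All parameters are hypothetical.

def goodwin_step(x, y, z, dt=0.01, n=9):
    dx = 1.0 / (1.0 + z ** n) - 0.1 * x  # repressible transcription
    dy = x - 0.1 * y                     # translation of protein y
    dz = y - 0.1 * z                     # maturation into repressor z
    return x + dt * dx, y + dt * dy, z + dt * dz

state = (0.1, 0.2, 0.3)
for _ in range(5000):  # simulate 50 time units
    state = goodwin_step(*state)
# Plotting x over time would show whether this feedback loop sustains
# oscillations for these parameter choices.
```

Exploring such a model's dynamics, rather than treating it as a fixed prediction machine, is what the authors call dynamic mechanistic explanation.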

20.