Similar documents (20 found)
1.
In 1925 a debate erupted in the correspondence columns of the British Medical Journal concerning the effectiveness of eating raw pancreas as a treatment for diabetes. Enthusiasts were predominantly general practitioners (GPs), who claimed success for the therapy on the basis of their clinical impressions. Their detractors were laboratory-oriented 'biochemist-physicians', who considered that their own experiments demonstrated that raw pancreas therapy was ineffective. The biochemist-physicians consistently dismissed the GPs' observations as inadequately 'controlled'. They did not define the meaning of 'control' in this context, although it clearly did not have the term's present-day meaning of a trial employing an untreated comparison group of patients. Rather, the biochemist-physicians' 'properly controlled' experiments involved careful regulation of their patients' diet and other environmental factors, and evaluation of the therapy's success through biochemical, rather than just clinical, criteria. However, my analysis suggests that these factors alone are inadequate to account for the biochemist-physicians' dismissal of the GPs' work as 'uncontrolled'. I suggest that the biochemist-physicians were deliberately exploiting the powerful rhetorical connotations of the term 'control'. Ultimately, they implied that only a trial which they themselves had conducted could be deemed 'adequately controlled'.

2.
While philosophers have subjected Galileo's classic thought experiments to critical analysis, they have tended largely to ignore the historical and intellectual context in which those experiments were deployed, and the specific role they played in Galileo's overall vision of science. In this paper I investigate Galileo's use of thought experiments by focusing on the epistemic and rhetorical strategies that he employed in attempting to answer the question of how one can know what would happen in an imaginary scenario. I argue that we can find three different answers to this question in Galileo's later dialogues, which reflect the changing meanings of 'experience' and 'knowledge' (scientia) in the early modern period. Once we recognise that Galileo's thought experiments sometimes drew on the power of memory and the explicit appeal to 'common experience'; that at other times they took the form of demonstrative arguments intended to have the status of necessary truths; and that on still other occasions they were extrapolations, or probable guesses, drawn from a carefully planned series of controlled experiments, it becomes evident that no single account of the epistemological relationship between thought experiment, experience and experiment can adequately capture the epistemic variety we find in Galileo's use of imaginary scenarios. To this extent, we cannot neatly classify Galileo's thought experiments as either 'medieval' or 'early modern'; rather, we should see them as indicative of the complex epistemological transformations of the early seventeenth century.

3.
Discussions on the relation between Mach and Planck usually focus on their famous controversy, a conflict between 'instrumentalist' and realist philosophies of science that revolved around the specific issue of the existence of atoms. This article approaches their relation from a different perspective, comparing their analyses of energy and energy conservation. It is argued that this reveals a number of striking similarities and differences. Both Mach and Planck agreed that the law was valid, and they sought to purge energy of its anthropomorphic elements. They did so in radically different ways, however, illustrating the differences between Mach's 'historical' and Planck's 'rationalistic' accounts of knowledge. Planck's attempt to de-anthropomorphize energy was part of his attempt to demarcate theoretical physics from other disciplines. Mach's attempt to de-anthropomorphize energy is placed in the context of fin-de-siècle Vienna. By doing so, this article also proposes a new interpretation of Mach as a philosopher, historian and sociologist of science.

4.
5.
I reappraise in detail Hertz's cathode ray experiments. I show that, contrary to Buchwald's (1995) evaluation, the core experiment establishing the electrostatic properties of the rays was successfully replicated by Perrin (probably) and Thomson (certainly). Buchwald's discussion of 'current purification' is shown to be a red herring. My investigation of the origin of Buchwald's misinterpretation of this episode reveals that he was led astray by a focus on what Hertz 'could do'—his experimental resources. I argue that one should focus instead on what Hertz wanted to achieve—his experimental goals. Focusing on these goals, I find that his explicit and implicit requirements for a successful investigation of the rays' properties are met by Perrin and Thomson. Thus, even by Hertz's standards, they did indeed replicate his experiment.

6.
This paper provides a detailed account of the period of the complex history of British algebra and geometry between the publication of George Peacock's Treatise on Algebra in 1830 and William Rowan Hamilton's paper on quaternions of 1843. During these years, Duncan Farquharson Gregory and William Walton published several contributions on 'algebraical geometry' and 'geometrical algebra' in the Cambridge Mathematical Journal. These contributions enabled them not only to generalize Peacock's symbolical algebra on the basis of geometrical considerations, but also to initiate the attempts to question the status of Euclidean space as the arbiter of valid geometrical interpretations. At the same time, Gregory and Walton were bound by the limits of symbolical algebra that they themselves made explicit; their work was not and could not be the 'abstract algebra' and 'abstract geometry' of figures such as Hamilton and Cayley. The central argument of the paper is that an understanding of the contributions to 'algebraical geometry' and 'geometrical algebra' of the second generation of 'scientific' symbolical algebraists is essential for a satisfactory explanation of the radical transition from symbolical to abstract algebra that took place in British mathematics in the 1830s–1840s.

7.
Initial applications of prediction markets (PMs) indicate that they provide good forecasting instruments in many settings, such as elections, the box office, or product sales. One particular characteristic of these 'first-generation' (G1) PMs is that they link the payoff value of a stock's share to the outcome of an event. Recently, 'second-generation' (G2) PMs have introduced alternative mechanisms to determine payoff values which allow them to be used as preference markets for determining preferences for product concepts or as idea markets for generating and evaluating new product ideas. Three different G2 payoff mechanisms appear in the existing literature, but they have never been compared. This study conceptually and empirically compares the forecasting accuracy of the three G2 payoff mechanisms and investigates their influence on participants' trading behavior. We find that G2 payoff mechanisms perform almost as well as their G1 counterpart, and trading behavior is very similar in both markets (i.e. trading prices and trading volume), except during the very last trading hours of the market. These results indicate that G2 PMs are valid instruments and support their applicability shown in previous studies for developing new product ideas or evaluating new product concepts.
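To make the G1 mechanism concrete, here is a minimal Python sketch (my own illustration, not taken from the paper; the function name and prices are assumptions) of the standard winner-take-all contract, in which a share's payoff is fixed entirely by the event outcome:

```python
def g1_payoff(event_occurred: bool, shares: float, unit_payoff: float = 1.0) -> float:
    """Winner-take-all contract of a 'first-generation' prediction market:
    each share pays unit_payoff if the underlying event occurs, else nothing."""
    return shares * unit_payoff if event_occurred else 0.0

# A trader who bought 100 shares at a market price of 0.62 per share:
cost = 100 * 0.62
print(g1_payoff(True, 100) - cost)    #  38.0 profit if the event occurs
print(g1_payoff(False, 100) - cost)   # -62.0 loss if it does not
```

Under this contract the pre-outcome trading price of a share can be read as the market's probability estimate for the event, which is what makes G1 markets usable as forecasting instruments.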

8.
The analytical notions of 'thought style', 'paradigm', 'episteme' and 'style of reasoning' are some of the most popular frameworks in the history and philosophy of science. Although their proponents, Ludwik Fleck, Thomas Kuhn, Michel Foucault, and Ian Hacking, are all part of the same philosophical tradition that closely connects history and philosophy, the extent to which they share similar assumptions and objectives is still under debate. In the first part of the paper, I shall argue that, despite the fact that these four thinkers disagree on certain assumptions, their frameworks have the same explanatory goal – to understand how objectivity is possible. I shall present this goal as a necessary element of a common project – that of historicising Kant's a priori. In the second part of the paper, I shall make instrumental use of the insights of these four thinkers to form a new model for studying objectivity. I shall also propose a layered diagram that allows the differences between the frameworks to be mapped, while acknowledging their similarities. This diagram will show that the frameworks of style of reasoning and episteme illuminate conditions of possibility that lie at a deeper level than those considered by thought styles and paradigms.

9.
Thomas Kuhn and Paul Feyerabend promote incommensurability as a central component of their conflicting accounts of the nature of science. This paper argues that in so doing, they both develop Albert Einstein's views, albeit in different directions. Einstein describes scientific revolutions as conceptual replacements, not mere revisions, endorsing 'Kant-on-wheels' metaphysics in light of 'world change'. Einstein emphasizes underdetermination of theory by evidence, rational disagreement in theory choice, and the non-neutrality of empirical evidence. Einstein even used the term 'incommensurable' in 1949, more than a decade before Kuhn and Feyerabend, specifically in connection with the challenges of comparatively evaluating scientific theories. This analysis shows how Einstein anticipates substantial components of Kuhn and Feyerabend's views, and suggests that there are strong reasons to suspect that Kuhn and Feyerabend were directly inspired by Einstein's use of the term 'incommensurable', as well as by his more general methodological and philosophical reflections.

10.
The track record of a 20-year history of density forecasts of state tax revenue in Iowa is studied, and potential improvements sought through a search for better-performing 'priors' similar to that conducted three decades ago for point forecasts by Doan, Litterman and Sims (Econometric Reviews, 1984). Comparisons of the point and density forecasts produced under the flat prior are made to those produced by the traditional (mixed estimation) 'Bayesian VAR' methods of Doan, Litterman and Sims, as well as to fully Bayesian 'Minnesota Prior' forecasts. The actual record and, to a somewhat lesser extent, the record of the alternative procedures studied in pseudo-real-time forecasting experiments, share a characteristic: subsequently realized revenues are in the lower tails of the predicted distributions 'too often'. An alternative empirically based prior is found by working directly on the probability distribution for the vector autoregression parameters—the goal being to discover a better-performing entropically tilted prior that minimizes out-of-sample mean squared error subject to a Kullback–Leibler divergence constraint that the new prior not differ 'too much' from the original. We also study the closely related topic of robust prediction appropriate for situations of ambiguity. Robust 'priors' are competitive in out-of-sample forecasting; despite the freedom afforded the entropically tilted prior, it does not perform better than the simple alternatives.
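Entropic tilting, as invoked above, re-weights draws from a baseline distribution so that a chosen moment hits a target while staying as close as possible to the baseline in Kullback–Leibler divergence. The following Python sketch is my own minimal illustration of that standard construction (the function name, toy draws, and target are assumptions, not the paper's code):

```python
import numpy as np
from scipy.optimize import minimize

def entropic_tilt(draws, target_mean):
    """Re-weight equally weighted draws from a baseline predictive density so
    the re-weighted mean equals target_mean, while minimizing the
    Kullback-Leibler divergence of the new weights from the uniform ones.
    The solution is exponential tilting; its convex dual is solved below."""
    def dual(lam):
        # log of the moment generating function, centred at the target
        return np.log(np.mean(np.exp(lam * (draws - target_mean))))
    lam = minimize(dual, x0=np.array([0.0]), method="BFGS").x[0]
    w = np.exp(lam * (draws - target_mean))
    return w / w.sum()

rng = np.random.default_rng(0)
draws = rng.normal(size=10_000)              # stand-in for flat-prior predictive draws
w = entropic_tilt(draws, target_mean=0.25)   # pull the predictive mean up to 0.25
print(np.sum(w * draws))                     # ~0.25: the tilted mean hits the target
```

With several moment constraints the same log-mean-exp objective is minimized over a vector of tilting parameters; the scalar case here keeps the dual search one-dimensional.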

11.
In recent years an impressive array of publications has appeared claiming considerable successes of neural networks in modelling financial data, but sceptical practitioners and statisticians are still raising the question of whether neural networks really are 'a major breakthrough or just a passing fad'. A major reason for this is the lack of procedures for performing tests for misspecified models, and tests of statistical significance for the various parameters that have been estimated, which makes it difficult to assess the model's significance and the possibility that any short-term successes that are reported might be due to 'data mining'. In this paper we describe a methodology for neural model identification which facilitates hypothesis testing at two levels: model adequacy and variable significance. The methodology includes a model selection procedure to produce consistent estimators, a variable selection procedure based on statistical significance and a model adequacy procedure based on residuals analysis. We propose a novel, computationally efficient scheme for estimating sampling variability of arbitrarily complex statistics for neural models and apply it to variable selection. The approach is based on sampling from the asymptotic distribution of the neural model's parameters ('parametric sampling'). Controlled simulations are used for the analysis and evaluation of our model identification methodology. A case study in tactical asset allocation is used to demonstrate how the methodology can be applied to real-life problems in a way analogous to stepwise forward regression analysis. Neural models are contrasted to multiple linear regression. The results indicate the presence of non-linear relationships in modelling the equity premium.
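The 'parametric sampling' idea lends itself to a compact illustration. The sketch below is my own minimal Python rendering of the general scheme, not the paper's implementation (the single-tanh-unit statistic, parameter values, and covariance are toy assumptions): draw parameter vectors from the estimator's asymptotic normal distribution and re-evaluate the statistic of interest at each draw.

```python
import numpy as np

def parametric_sampling(theta_hat, cov_hat, statistic, n_draws=1000, seed=0):
    """Estimate the sampling variability of an arbitrary statistic of a fitted
    model by drawing parameter vectors from the estimator's asymptotic normal
    distribution N(theta_hat, cov_hat) and re-evaluating the statistic."""
    rng = np.random.default_rng(seed)
    thetas = rng.multivariate_normal(theta_hat, cov_hat, size=n_draws)
    values = np.array([statistic(t) for t in thetas])
    return values.mean(), values.std(ddof=1)

# Toy statistic: the response of a single tanh unit at a fixed input x0.
x0 = np.array([1.0, -0.5])
def unit_response(theta):
    w, b = theta[:2], theta[2]
    return np.tanh(w @ x0 + b)

theta_hat = np.array([0.8, 0.3, 0.1])  # pretend parameter estimates
cov_hat = 0.01 * np.eye(3)             # pretend asymptotic covariance
mean, se = parametric_sampling(theta_hat, cov_hat, unit_response)
print(mean, se)                        # point value and its standard error
```

Because only the fitted parameters are resampled, the scheme avoids refitting the network for every draw, which is what makes it cheap relative to a full bootstrap.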

12.
The apothecary occupied a liminal position in early modern society, between profit and healing. Finding ways to distance their public image from trade was a common problem for apothecaries across Europe. This article uses the case of a Bolognese apothecary, Filippo Pastarino, to address the question of how early modern apothecaries chose to represent themselves to political authorities and to the wider public. 'Mercy', alongside 'craft', was a pillar of apothecaries' social identity. By contrast, no matter how central financial transactions ('money') were to their activity, apothecaries did not want to be perceived as merchants. Thus, the assistance and advice apothecaries provided to patients and customers emerged as central aspects of their social role. In this context, Bolognese apothecaries aimed to defend their current status, which had been challenged by the naturalist Ulysses Aldrovandi, city authorities and local monasteries. However, Pastarino's claims can also be seen as antecedents to the self-legitimizing strategy that seventeenth-century artisans deployed when faced with the need to enhance their new status as natural philosophers. The present study attributes a name, a date of birth and a shop to Filippo Pastarino, revising previous interpretations. More broadly, by focusing on how these artisans defended their position in the city, it enriches our understanding of the self-representation of apothecaries.

13.
14.
In 1918, Henry de Dorlodot—priest, theologian, and professor of geology at the University of Louvain (Belgium)—published Le Darwinisme au point de vue de l'Orthodoxie Catholique (translated as Darwinism and Catholic Thought), in which he defended a reconciliation between evolutionary theory and Catholicism through his own particular kind of theistic evolutionism. He subsequently announced a second volume in which he would extend his conclusions to the origin of Man. Traditionalist circles in Rome reacted vehemently. Operating through the Pontifical Biblical Commission, they tried to force Dorlodot to withdraw his book and publicly disown his ideas by threatening him with an official condemnation, a strategy that had been used against Catholic evolutionists since the late nineteenth century. The archival material on the 'Dorlodot affair' shows how this policy 'worked' in the early stages of the twentieth century, but also how it would eventually reach the end of its logic. The growing popularity of theistic evolutionism among Catholic intellectuals, combined with Dorlodot's refusal to pull back amidst threats, made certain that the traditionalists did not get their way completely, and the affair ended in an uncomfortable status quo. Dorlodot did not receive the official condemnation that had been threatened, nor did he withdraw his theories, although he stopped short of publishing further on the subject. With the decline of the traditionalists' power and authority, the policy of denunciation towards evolutionists made way for a growing tolerance. The 'Dorlodot affair'—which occurred in a pivotal era in the history of the Church—can be seen as exemplary of the changing attitude of the Roman authorities towards evolutionism in the first half of the twentieth century.

15.
Has the rise of data-intensive science, or 'big data', revolutionized our ability to predict? Does it imply a new priority for prediction over causal understanding, and a diminished role for theory and human experts? I examine four important cases where prediction is desirable: political elections, the weather, GDP, and the results of interventions suggested by economic experiments. These cases suggest caution. Although big data methods are indeed very useful sometimes, in this paper's cases they improve predictions only to a limited extent or not at all, and their prospects of doing so in the future are limited too.

16.
Mining companies after the Gold Rush depended heavily on foreign expertise, and yet historians of mining have glorified 'German engineering' in America. The application of German technology in America was fraught with difficulties, and most advances were micro-innovations rather than macro-innovations such as Philip Deidesheimer's famous square-set timbering on the Comstock Lode. The problem began at German mining schools, such as the Freiberg Mining Academy, where Americans like Louis and Henry Janin, even as they acquired advanced training and adopted an engineering ethos, struggled to learn about Mexican and American mining. Having supplemented their course of study to remedy this deficiency, the brothers returned to the US intending to modernize mining on the frontier. Louis attempted the 'Freiberg Process' of amalgamation on the Comstock Lode, but locally developed methods proved more feasible, and the experiment failed. He came instead to apply his training to the micro-level problem of how to reprocess amalgamation waste heaps.

17.
Early geological investigations in the St David's area (Pembrokeshire) are described, particularly the work of Murchison. In a reconnaissance survey in 1835, he regarded a ridge of rocks at St David's as intrusive in unfossiliferous Cambrian; and the early Survey mapping (chiefly the work of Aveline and Ramsay) was conducted on that assumption, leading to the publication of maps in 1845 and 1857. The latter represented the margins of the St David's ridge as 'Altered Cambrian'. So the supposedly intrusive 'syenite' was regarded as younger, and there was no Precambrian. These views were challenged by a local doctor, Henry Hicks, who developed an idea of the ex-Survey palaeontologist John Salter that the rocks of the ridge were stratified and had formed a Precambrian island, round which Cambrian sediments (now confirmed by fossil discoveries) had been deposited. Hicks subsequently proposed subdivision of his Precambrian into 'Dimetian', 'Pebidian', and (later) 'Arvonian', and he attempted correlations with rocks in Shropshire, North Wales, and even North America, seeking to develop the neo-Neptunist ideas of Sterry Hunt. The challenge to the Survey's work was countered in the 1880s by the Director General, Geikie, who showed that Hicks's idea of stratification in the Dimetian was mistaken. A heated controversy developed, with several amateur geologists, supported by a group of Cambridge Sedgwickians, forming a coalition of 'Archaeans' against the Survey. Geikie was supported by Lloyd Morgan. Attention focused particularly on Ogof Lle-sugn Cave and St Non's Arch, with theory/controversy-ladenness of observations evident on both sides. Evidence from an eyewitness student record of a Geological Society meeting reveals the 'sanitized' nature of the official summary of the debate in QJGS. Field mapping early in the twentieth century by J. F. N. Green allowed a compromise consensus to be achieved, but Green's evidence for unconformity between the Cambrian and the Dimetian, obtained by excavation, can no longer be verified, and his consensual history of the area may need revision. Unconformity between the Cambrian and the Pebidian tuffs is not in doubt, however, and Precambrian at St David's is accepted. The study exhibits features of geological controversy and of the British geological community in the nineteenth century. It also furnishes a further instance of the great influence of Murchison in nineteenth-century British geology and of the side-effects of his controversy with Sedgwick.

18.
Proposed by Einstein, Podolsky, and Rosen (EPR) in 1935, the entangled state has played a central part in exploring the foundations of quantum mechanics. At the end of the twentieth century, however, some physicists and mathematicians set aside the epistemological debates associated with EPR and turned entanglement from a philosophical puzzle into a practical resource for information processing. This paper examines the origin of what is known as quantum information. Scientists had long considered making quantum computers and employing entanglement in communications. But the real breakthrough only occurred in the 1980s, when they shifted focus from general-purpose systems such as Turing machines to algorithms and protocols that solved particular problems, including quantum factorization, quantum search, superdense coding, and teleportation. Key to their development were two groups of mathematical manipulations and deformations of entanglement—quantum parallelism and 'feedback EPR'—that served as conceptual templates. The early success of quantum parallelism and feedback EPR owed much to the idealized formalism of entanglement that researchers had prepared for philosophical discussions. Yet such idealization is difficult to maintain when the physical implementation of quantum information processors is at stake. A major challenge for today's quantum information scientists and engineers is thus to move from Einstein et al.'s well-defined scenarios to realistic models.
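As a concrete instance of the kind of manipulation of entanglement the paper describes, the standard superdense coding identity can be stated in a few lines. This worked example is my own illustration of the textbook protocol, not drawn from the paper:

```latex
% Shared EPR (Bell) state between sender and receiver:
\[
  |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\,(|00\rangle + |11\rangle)
\]
% By applying one of I, X, Z, XZ to her half alone, the sender steers the
% joint state to one of the four orthogonal Bell states (up to global phase):
\[
  (I \otimes I)\,|\Phi^{+}\rangle = |\Phi^{+}\rangle, \qquad
  (X \otimes I)\,|\Phi^{+}\rangle = |\Psi^{+}\rangle,
\]
\[
  (Z \otimes I)\,|\Phi^{+}\rangle = |\Phi^{-}\rangle, \qquad
  (XZ \otimes I)\,|\Phi^{+}\rangle = -|\Psi^{-}\rangle
\]
% A joint Bell-basis measurement by the receiver then recovers two classical
% bits, even though only one qubit was transmitted.
```

The protocol works precisely because local operations on one half of an idealized, maximally entangled pair reach all four orthogonal Bell states, which is the sort of formal property the paper identifies as a conceptual template.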

19.
20.
The paper examines the relevance of the nomological view of nature to three discussions of the tide in the thirteenth century. A nomological conception of nature assumes that the basic explanatory units of natural phenomena are universally binding rules stated in quantitative terms. (1) Robert Grosseteste introduced an account of the tide based on the mechanism of rarefaction and condensation, stimulated by the Moon's rays and their angle of incidence. He considered the Moon's action over the sea an example of the general efficient causality exerted through the universal activity of light or species. (2) Albert the Great posited a plurality of causes which cannot be reduced to a single cause. The connaturality of the Moon and the water is the only principle of explanation which he considered universal. Connaturality, however, renders neither formulation nor quantification possible. While Albert stressed the variety of causes of the tide, (3) Roger Bacon emphasized regularity and reduced the various causes producing tides to forces. He replaced the terminology of 'natures' with one of 'forces'. Force, which in principle can be accurately described and measured, thus becomes a commensurable aspect of a diverse cosmos. When they reasoned about why waters return to their place after the tide, Grosseteste argued that waters return in order to prevent a vacuum, Albert claimed that waters 'follow their own nature', while Bacon held that the 'proper force' of the water prevails over the distant force of the first heaven. I exhibit, for the thirteenth century, moments of the move away from Aristotelian concerns, whose basic elements were essences and natures that reflect specific phenomena and that did not allow for an image of nature as a unified system. In the new perspective of the thirteenth century the key was a causal link between the position of the Moon and the tide cycle, a link which is universal and still qualitative, yet expressed as susceptible to quantification.
