Similar Documents
A total of 20 similar documents were found (search time: 15 ms).
1.
Category management—a relatively new function in marketing—involves large-scale, real-time forecasting of multiple data series in complex environments. In this paper, we illustrate how Bayesian Vector Autoregression (BVAR) fulfils the category manager's decision-support requirements by providing accurate forecasts of a category's state variables (prices, volumes and advertising levels), incorporating management interventions (merchandising events such as end-aisle displays), and revealing competitive dynamics through impulse response analyses. Using 124 weeks of point-of-sale scanner data comprising 31 variables for four brands, we compare the out-of-sample forecasts from BVAR to forecasts from exponential smoothing, univariate and multivariate Box-Jenkins transfer function analyses, and multivariate ARMA models. Theil U statistics indicate that BVAR forecasts are superior to those from the alternative approaches. In large-scale forecasting applications, BVAR's ease of identification and parsimonious use of degrees of freedom are particularly valuable.
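The method comparison above turns on Theil's U, which benchmarks a model's forecast errors against those of a naive no-change forecast. The sketch below is a minimal illustration of one common definition (U2) applied to hypothetical weekly sales data; the series, forecasts and variable names are invented for this example and are not taken from the study.

```python
import numpy as np

def theil_u2(actual, forecast):
    """Theil's U2: ratio of the model's forecast errors to those of a
    naive no-change forecast. Values below 1 indicate the model beats
    the naive benchmark. (Illustrative definition; conventions vary.)"""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    # Relative one-step-ahead errors of the model forecast
    model_err = (forecast[1:] - actual[1:]) / actual[:-1]
    # Relative errors of the naive "no change" forecast
    naive_err = (actual[1:] - actual[:-1]) / actual[:-1]
    return np.sqrt(np.sum(model_err**2) / np.sum(naive_err**2))

# Hypothetical weekly sales series and two competing forecasts
actual = np.array([100, 104, 98, 110, 107, 115, 112, 120])
bvar_fc = np.array([100, 103, 100, 108, 108, 113, 113, 118])
smooth_fc = np.array([100, 100, 104, 98, 110, 107, 115, 112])

print("BVAR U2:", theil_u2(actual, bvar_fc))
print("Exp. smoothing U2:", theil_u2(actual, smooth_fc))
```

A value below 1 means the forecast beats the no-change benchmark; the study's exact Theil U convention may differ from this sketch.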

2.
At the beginning of the 1730s René Antoine Ferchault de Réaumur published two long memoirs on a new type of thermometer equipped with a specially calibrated scale — known ever since as the Réaumur scale. It became one of the most common ‘standardized’ thermometers in Europe until the late nineteenth century. What made this thermometer so successful? What, exactly, was it? I will first argue that the real Réaumur thermometer as an instrument was a fiction, a ghost — an idealized instrument. On paper, it was theoretically flawless. In reality, the standardized Réaumur thermometer was most likely never achieved. This article shows that its success was essentially due to a recontextualisation from theoretical natural philosophy — Réaumur's principle of uniformity — to: 1) the context of artisanal knowledge and practices and 2) the context of making actual measurements in the field (in Paris at the Observatory, in provincial France, in the colonies, and in the rest of Europe) and reporting them in the Mémoires de l'Académie royale des sciences. Réaumur's thermometer was essentially a theoretical method with which a particular scale was associated. It was the instrument's reification for market consumption and fieldwork that gave this specific type of thermometer materiality and authority. Although most Réaumur thermometers ever made were strikingly different from one another, over time the thermomètre de Réaumur designation became a brand, a seal of approval born of customary artisanal practices and cultural habits.

3.
When Bouvet discovered the relationship between the binary arithmetic of Leibniz and the hexagrams of the I ching—in reality only a purely formal correspondence—he sent to Leibniz a woodcut diagram of the Fu-Hsi arrangement, which provides the key to the analogy. This diagram, in a re-drawn version, was first published by Gorai Kinzō in a study of Leibniz's interpretation of the I ching and Confucianism, a study which has been influential in providing, indirectly, the principal source for the accounts of Wilhelm and Needham. Yet this pioneering study of Leibniz's interpretation of the hexagrams is virtually unknown. Even the account of Needham, who saved it from complete obscurity, contains one or two inaccuracies about it, and these are repeated by Zacher in his otherwise excellent monograph on Leibniz's binary arithmetic.
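The ‘purely formal correspondence’ can be stated computationally: read each hexagram's six lines as binary digits and the 64 figures enumerate the integers 0 to 63. The sketch below is illustrative only; the reading convention (bottom line as least significant bit, broken line = 0, solid line = 1) is an assumption of this example, not a claim about Bouvet's diagram or the Fu-Hsi ordering.

```python
# Illustrative mapping between 6-bit binary numbers and hexagram figures.
# Convention assumed here: bottom line = least significant bit,
# broken (yin) line = 0, solid (yang) line = 1.
def hexagram_lines(n):
    assert 0 <= n < 64
    bits = [(n >> i) & 1 for i in range(6)]          # bottom to top
    return ["———" if b else "—  —" for b in bits]    # solid vs broken

for n in (0, 1, 2, 63):
    print(f"{n:2d} = {n:06b}")
    for line in reversed(hexagram_lines(n)):         # display top line first
        print("    ", line)
```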

4.
This paper is an account of Kepler's explicit awareness of the problem of experimental error. As a study of the Astronomia nova shows, Kepler exploited his awareness of the occurrence of experimental errors to guide him to the right conclusion. Errors were thus employed, so to speak, perhaps for the first time, to bring about a major physical discovery: Kepler's laws of planetary motion. ‘Know then’, to use Kepler's own words, ‘that errors show us the way to truth.’ With a survey of Kepler's revolutionary contribution to optics, the paper demonstrates that Kepler's awareness of the problem of experimental error extended beyond discrepancies between calculations and observations to types of error which pertain to observations and instruments. It emerges that Kepler's belief in the unity of knowledge and physical realism facilitated—indeed created—the right philosophical posture for comprehending the problem of error in an entirely novel way.

5.
I reappraise in detail Hertz's cathode ray experiments. I show that, contrary to Buchwald's (1995) evaluation, the core experiment establishing the electrostatic properties of the rays was successfully replicated by Perrin (probably) and Thomson (certainly). Buchwald's discussion of ‘current purification’ is shown to be a red herring. My investigation of the origin of Buchwald's misinterpretation of this episode reveals that he was led astray by a focus on what Hertz ‘could do’—his experimental resources. I argue that one should focus instead on what Hertz wanted to achieve—his experimental goals. Focusing on these goals, I find that his explicit and implicit requirements for a successful investigation of the rays’ properties are met by Perrin and Thomson. Thus, even by Hertz's standards, they did indeed replicate his experiment.

6.
In the early years of the nineteenth century, the English chemist John Dalton (1766–1844) developed his atomic theory, a set of theoretical commitments describing the nature of atoms and the rules guiding their interactions and combinations. In this paper, I examine a set of conceptual and illustrative tools used by Dalton in developing his theory and in presenting it to the public, both in print and in his many public lectures. These tools—the concept of ‘atmosphere’, the pile-of-shot analogy, and Dalton's system of chemical notation—served not just to guide Dalton's own thinking and to make his theories clear to his various audiences, but also to bind these theories together into a coherent system, presented in its definitive form in the three volumes of A New System of Chemical Philosophy (1808, 1810, and 1827). Despite these links, Dalton's contemporaries tended to pick and choose which of his theories to accept; his system of notation failed to be adopted in part because it embodied the whole of his system indivisibly.

7.
In this paper I take a sceptical view of the standard cosmological model and its variants, mainly on the following grounds: (i) The method of mathematical modelling that characterises modern natural philosophy—as opposed to Aristotle's—goes well with the analytic, piecemeal approach to physical phenomena adopted by Galileo, Newton and their followers, but it is hardly suited for application to the whole world. (ii) Einstein's first cosmological model (1917) was not prompted by the intimations of experience but by a desire to satisfy Mach's Principle. (iii) The standard cosmological model—a Friedmann–Lemaître–Robertson–Walker spacetime expanding with or without end from an initial singularity—is supported by the phenomena of redshifted light from distant sources and very nearly isotropic thermal background radiation provided that two mutually inconsistent physical theories are jointly brought to bear on these phenomena, viz. the quantum theory of elementary particles and Einstein's theory of gravity. (iv) While the former is certainly corroborated by high-energy experiments conducted under conditions allegedly similar to those prevailing in the early world, precise tests of the latter involve applications of the Schwarzschild solution or the PPN formalism for which there is no room in a Friedmann–Lemaître–Robertson–Walker spacetime.
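For reference, the Friedmann–Lemaître–Robertson–Walker spacetime invoked in (iii) and (iv) is described, in standard textbook notation (not drawn from the paper itself), by the line element

```latex
ds^{2} = -c^{2}\,dt^{2}
       + a(t)^{2}\left[\frac{dr^{2}}{1-kr^{2}}
       + r^{2}\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right)\right],
```

where a(t) is the scale factor whose growth expresses the expansion and k ∈ {−1, 0, +1} fixes the spatial curvature. The homogeneity and isotropy built into this form are what the author has in mind in (iv) when noting that Schwarzschild-type solutions and the PPN formalism have no room within it.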

8.
9.
10.
In this paper, I compare Pierre-Simon Laplace's celebrated formulation of the principle of determinism in his 1814 Essai philosophique sur les probabilités with the formulation of the same principle offered by Roger Joseph Boscovich in his Theoria philosophiae naturalis, published 56 years earlier. This comparison discloses a striking general similarity between the two formulations of determinism as well as certain important differences. Regarding their similarities, both Boscovich's and Laplace's conceptions of determinism involve two mutually interdependent components—ontological and epistemic—and they are both intimately linked with the principles of causality and continuity. Regarding their differences, however, Boscovich's formulation of the principle of determinism turns out not only to be temporally prior to Laplace's but also—being founded on fewer metaphysical principles and more rooted in and elaborated by physical assumptions—to be more precise, complete and comprehensive than Laplace's somewhat parenthetical statement of the doctrine. A detailed analysis of these similarities and differences, so far missing in the literature on the history and philosophy of the concept of determinism, is the main goal of the present paper.

11.
In 1918, Henry de Dorlodot—priest, theologian, and professor of geology at the University of Louvain (Belgium)—published Le Darwinisme au point de vue de l'Orthodoxie Catholique (translated as Darwinism and Catholic Thought), in which he defended a reconciliation between evolutionary theory and Catholicism with his own particular kind of theistic evolutionism. He subsequently announced a second volume in which he would extend his conclusions to the origin of Man. Traditionalist circles in Rome reacted vehemently. Operating through the Pontifical Biblical Commission, they tried to force Dorlodot to withdraw his book and to publicly disown his ideas by threatening him with an official condemnation, a strategy that had been used against Catholic evolutionists since the late nineteenth century. The archival material on the ‘Dorlodot affair’ shows how this policy ‘worked’ in the early stages of the twentieth century but also how it would eventually reach the end of its logic. The growing popularity of theistic evolutionism among Catholic intellectuals, combined with Dorlodot's refusal to pull back amidst threats, made certain that the traditionalists did not get their way completely, and the affair ended in an uncomfortable status quo. Dorlodot did not receive the official condemnation that had been threatened, nor did he withdraw his theories, although he stopped short of publishing further on the subject. With the decline of the traditionalists' power and authority, the policy of denunciation towards evolutionists made way for a growing tolerance. The ‘Dorlodot affair’—which occurred in a pivotal era in the history of the Church—can be seen as exemplary with regard to the changing attitude of the Roman authorities towards evolutionism in the first half of the twentieth century.

12.
Between the 1860s and the 1910s, British acoustics was transformed from an area of empirical research into a mathematically organized field. Musical motives—improving musical scales and temperaments, making better musical instruments, and understanding the nature of musical tones—were among the major driving forces for acoustical researchers in nineteenth-century Britain. The German acoustician Helmholtz had a major impact on British acousticians, who also had extensive interactions with American and French acousticians. Rayleigh's acoustics, reflecting all these features, bore remarkable fruit in his treatise The Theory of Sound, which successfully subjected empirical acoustics to analytical mathematics. His accomplishments made British acoustics a subfield of physics, thus distinguishing it from the ‘new acoustics’ in early twentieth-century America.

13.
By now, the story of T. D. Lysenko's phantasmagoric career in the Soviet life sciences is widely familiar. While Lysenko's attempts to identify I. V. Michurin, the horticulturist, as the source of his own inductionist ideas about heredity are recognized as a gambit calculated to enhance his legitimacy, the real roots of those ideas are still shrouded in mystery. This paper suggests those roots may be found in a tradition in Russian biology that stretches back to the 1840s—a tradition inspired by the doctrines of Jean-Baptiste Lamarck and Etienne and Isidore Geoffroy Saint-Hilaire. The enthusiastic reception in Russia of those doctrines and of their practical application—the acclimatization of exotic life forms—gave rise to the durable scientific preoccupation with transforming nature, which now seems implicated in creating the context for Lysenko's successful bid to become an arbiter of the biological sciences.

14.
15.
In this paper, three theories of progress and of the aim of science are discussed: (i) the theory of progress as increasing explanatory power, advocated by Popper in The logic of scientific discovery (1935/1959); (ii) the theory of progress as approximation to the truth, introduced by Popper in Conjectures and refutations (1963); (iii) the theory of progress as a steady increase of competing alternatives, which Feyerabend put forward in the essay “Reply to criticism. Comments on Smart, Sellars and Putnam” (1965) and defended as late as the last edition of Against method (1993). It is argued that, contrary to what Feyerabend scholars have predominantly assumed, Feyerabend's changing attitude towards falsificationism—which he often advocated at the beginning of his career and vociferously attacked in the 1970s and 1980s—must be explained by taking into account not only Feyerabend's very peculiar view of the aim of science but also Popper's changing account of progress.

16.
Proposed by Einstein, Podolsky, and Rosen (EPR) in 1935, the entangled state has played a central part in exploring the foundations of quantum mechanics. At the end of the twentieth century, however, some physicists and mathematicians set aside the epistemological debates associated with EPR and turned entanglement from a philosophical puzzle into a practical resource for information processing. This paper examines the origin of what is known as quantum information. Scientists had long considered making quantum computers and employing entanglement in communications. But the real breakthrough only occurred in the 1980s and 1990s, when they shifted focus from general-purpose systems such as Turing machines to algorithms and protocols that solved particular problems, including quantum factorization, quantum search, superdense coding, and teleportation. Key to their development were two groups of mathematical manipulations and deformations of entanglement—quantum parallelism and ‘feedback EPR’—that served as conceptual templates. The early success of quantum parallelism and feedback EPR was owed to the idealized formalism of entanglement that researchers had prepared for philosophical discussions. Yet such idealization is difficult to maintain when the physical implementation of quantum information processors is at stake. A major challenge for today's quantum information scientists and engineers is thus to move from Einstein et al.'s well-defined scenarios to realistic models.
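As an illustration of the kind of protocol discussed above, the sketch below implements superdense coding in its idealized, noiseless form using plain linear algebra. It is not drawn from the paper; the conventions (Alice holding the first qubit of a shared |Phi+> pair, the specific encoding gates) are assumptions of this example.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# Shared Bell state |Phi+> = (|00> + |11>)/sqrt(2); Alice holds the first qubit.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Alice's local encoding: two classical bits -> one of four local operations
ENCODE = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

# Bob's decoder: the four Bell states and the bit pairs they correspond to
BELL_BASIS = {
    (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),   # |Phi+>
    (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),   # |Psi+>
    (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),  # |Phi->
    (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),  # |Psi->
}

def superdense_send(bits):
    """Alice acts only on her own qubit, yet transmits two classical bits."""
    op = np.kron(ENCODE[bits], I)   # Alice's gate, identity on Bob's qubit
    return op @ phi_plus

def bell_measure(state):
    """Bob measures both qubits in the Bell basis and reads off the bits."""
    probs = {b: abs(np.vdot(v, state))**2 for b, v in BELL_BASIS.items()}
    return max(probs, key=probs.get)

for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert bell_measure(superdense_send(bits)) == bits
print("All four two-bit messages recovered from a single transmitted qubit.")
```

The example makes concrete the point stressed in the abstract: the protocol is exact only in the idealized formalism, while any physically implemented version must contend with noise and imperfect Bell measurements.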

17.
We show that contrasting results on trading volume's predictive role for short-horizon reversals in stock returns can be reconciled by conditioning on different investor types' trading. Using unique trading data by investor type from Korea, we provide explicit evidence of three distinct mechanisms leading to contrasting outcomes: (i) informed buying—price increases accompanied by high institutional buying volume are less likely to reverse; (ii) liquidity selling—price declines accompanied by high institutional selling volume in institutional investor habitat are more likely to reverse; (iii) attention-driven speculative buying—price increases accompanied by high individual buying volume in individual investor habitat are more likely to reverse. Our approach of predicting which mechanism will prevail improves reversal forecasts following return shocks: an augmented contrarian strategy utilizing our ex ante formulation increases short-horizon reversal strategy profitability by 40–70% in the US and Korean stock markets.
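The three mechanisms suggest a simple conditioning rule: fade a return shock only when the accompanying volume points to liquidity selling or attention-driven buying, not to informed buying. The pandas sketch below is a toy illustration of that logic on invented data; the thresholds, variable names, and the use of buying-volume fractions as proxies for investor-type order flow are assumptions of the example, not the paper's formulation.

```python
import numpy as np
import pandas as pd

def reversal_signal(returns, inst_buy_frac, indiv_buy_frac, ret_thresh=0.03):
    """Toy volume-conditioned reversal signal (illustrative only).

    returns        : pd.Series of daily returns
    inst_buy_frac  : fraction of volume that was institutional buying
    indiv_buy_frac : fraction of volume that was individual buying
    """
    signal = pd.Series(0.0, index=returns.index)

    up_shock = returns > ret_thresh
    down_shock = returns < -ret_thresh

    # (i) informed buying: up-moves with heavy institutional buying -> do not fade
    # (iii) speculative buying: up-moves with heavy individual buying -> fade (short)
    signal[up_shock & (indiv_buy_frac > 0.6)] = -1.0
    # (ii) liquidity selling: down-moves dominated by institutional selling
    #      (low institutional buying fraction) -> bet on a reversal (buy)
    signal[down_shock & (inst_buy_frac < 0.4)] = +1.0
    return signal

# Hypothetical data for one stock
idx = pd.date_range("2024-01-01", periods=6, freq="B")
rets = pd.Series([0.05, -0.04, 0.01, 0.06, -0.05, 0.00], index=idx)
inst = pd.Series([0.70, 0.20, 0.50, 0.30, 0.35, 0.50], index=idx)
indiv = pd.Series([0.20, 0.60, 0.40, 0.70, 0.50, 0.40], index=idx)

print(reversal_signal(rets, inst, indiv))
```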

18.
There is a substantial literature on Feyerabend's relativism—including a few papers in this collection—but there are fewer specific studies of the ways that his writings and ideas have been taken up by the non-academic public. This is odd, given his obvious interest in the lives and concerns of persons who were not ‘intellectuals’—a term that, for him, had a pejorative ring to it. It is also odd, given the abundance of evidence of how Feyerabend's relativism played a role in a specific national and cultural context—namely, contemporary Italian debates about relativism. This paper offers a study of how Feyerabend's ideas have been deployed by Italian intellectuals and cultural commentators—including the current Pope—and critically assesses these uses.

19.
Sydney Chapman is unanimously considered to have played a founding role in modern geomagnetism and to have opened up new lines of research in geophysics generally. Nevertheless, Chapman's conviction regarding the synthesis of the explanatory mechanisms of the atmosphere has gone practically unnoticed in the historiography of geophysics. This paper examines Chapman's contribution to ionospheric physics. It aims to understand Chapman's theory of ionospheric layer formation, and particularly its link to his theory of ozone formation. It deals first with the traits that characterized Chapman's personality, as a way of explaining—and perhaps even justifying—his quest for the integration and synthesis of geophysical knowledge. It then analyses Chapman's model of ionospheric layers and his suggestions regarding its use as an operational tool (without ontological connotations), before continuing with his account of the formation of the ozone layer, which seemed to constitute the missing link for understanding ionospheric layer formation. The paper concludes with Chapman's attempt to reconcile geomagnetic and radio evidence.

20.