Similar Documents
20 similar documents found (search time: 31 ms)
1.
We present a forecasting model based on fuzzy pattern recognition and weighted linear regression. In this model fuzzy pattern recognition is used to find homogeneous fuzzy classes in a heterogeneous data set. It is assumed that the classes represent typical situations. For each class a weighted regression analysis is conducted. The forecasting results obtained by the class regression analysis are aggregated to obtain the ‘overall’ estimation of the regression model. We apply the model to the forecasting of economic data of the USA. Copyright © 2001 John Wiley & Sons, Ltd.
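A minimal Python sketch of the classify-then-regress idea described above. KMeans centers with inverse-distance memberships stand in for the paper's fuzzy pattern recognition step, and all data are synthetic:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] > 0, 2.0, -1.0) * X[:, 1] + rng.normal(scale=0.1, size=200)

# Step 1: find class centers (KMeans stands in for fuzzy pattern recognition).
k = 2
centers = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).cluster_centers_

# Step 2: soft memberships from inverse squared distance to each center.
d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
u = (1.0 / d2) / (1.0 / d2).sum(axis=1, keepdims=True)

# Step 3: one weighted regression per class, memberships as sample weights.
models = [LinearRegression().fit(X, y, sample_weight=u[:, j]) for j in range(k)]

# Step 4: aggregate class forecasts by membership to get the overall estimate.
y_hat = sum(u[:, j] * models[j].predict(X) for j in range(k))
print("in-sample RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```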

2.
The model presented in this paper integrates two distinct components of the demand for durable goods: adoptions and replacements. The adoption of a new product is modeled as an innovation diffusion process, using price and population as exogenous variables. Adopters are expected to eventually replace their old units of the product, with a probability which depends on the age of the owned unit and other random factors such as overload, style changes, etc. It is shown that the integration of adoption and replacement demand components in our model yields quality sales forecasts, not only under conditions where detailed data on replacement sales is available, but also when the forecaster's access is limited to total sales data and educated guesses on certain elements of the replacement process.
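A toy simulation of the two demand components, assuming a Bass-style adoption process and an age-dependent replacement probability. All parameter values here are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

# Illustrative parameters: Bass-style first-purchase adoption combined with
# age-dependent replacement of the installed base.
m, p, q = 1_000_000, 0.03, 0.38        # market potential, innovation, imitation
T = 15
ages = np.arange(T + 2)
repl_prob = np.minimum(0.05 * ages, 0.9)   # chance a unit of age a is replaced

cum = 0.0
stock = np.zeros(T + 2)                # installed units, indexed by age in years
sales = []
for t in range(T):
    adoptions = (p + q * cum / m) * (m - cum)   # Bass first-purchase demand
    cum += adoptions
    replaced = stock * repl_prob               # replacement demand by age class
    survivors = stock - replaced
    stock = np.zeros_like(stock)
    stock[1:] = survivors[:-1]                 # surviving units age one year
    stock[0] = adoptions + replaced.sum()      # new units enter at age zero
    sales.append(adoptions + replaced.sum())

print("total sales by year:", [int(s) for s in sales])
```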

3.
When evaluating the launch of a new product or service, forecasts of the diffusion path and the effects of the marketing mix are critically important. Currently no unified framework exists to provide guidelines on the inclusion and specification of marketing mix variables in models of innovation diffusion. The objective of this research is to examine empirically the role of prices in diffusion models, in order to establish whether price can be incorporated effectively into the simpler time-series models. Unlike existing empirical research, which examines the models' fit to historical data, we examine the predictive validity of alternative models. Only if the incorporation of prices improves the predictive performance of diffusion models can it be argued that these models have validity. A series of diffusion models which include prices are compared against a number of well-accepted diffusion models, including the Bass (1969) model and more recently developed ‘flexible’ diffusion models. For short data series and long lead-time forecasting, the conditions typical in practice, price rarely added to the forecasting capability of the simpler time-series models. Copyright © 1998 John Wiley & Sons, Ltd.
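For reference, the Bass (1969) model mentioned above has a closed-form cumulative adoption curve that can be fitted by nonlinear least squares and extrapolated for long-lead forecasts. A sketch on hypothetical data (the price-augmented variants compared in the paper are not reproduced):

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    """Closed-form cumulative adoptions of the Bass (1969) model."""
    e = np.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

# Hypothetical yearly cumulative adoption data (units: thousands).
t = np.arange(1, 11)
y = np.array([8, 25, 60, 120, 210, 320, 430, 520, 580, 615], dtype=float)

(m_hat, p_hat, q_hat), _ = curve_fit(
    bass_cumulative, t, y, p0=[700, 0.01, 0.4],
    bounds=([100, 1e-4, 1e-3], [5000, 1.0, 2.0]))
print(f"m={m_hat:.0f}, p={p_hat:.4f}, q={q_hat:.4f}")

# Long-lead forecast: extrapolate the fitted curve beyond the sample.
print("forecast t=11..15:", bass_cumulative(np.arange(11, 16), m_hat, p_hat, q_hat))
```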

4.
Summary: The humanity of the Upper Palæolithic is represented exclusively by Homo sapiens; already at that time a remarkable diversity within the species was evident. Only through intensive study of present and future skeletal discoveries can the question be resolved of whether this remarkable variety reflects the individual variability of a homogeneous population or corresponds to different races (up to six).

5.
This paper introduces a methodology for estimating the likelihood of private information usage amongst earnings analysts. This is achieved by assuming that one group of analysts generates forecasts based on the underlying dynamics of earnings, while all other analysts issue forecasts based on the prevailing consensus forecast. Given this behavioural dichotomy, we are able to derive (and estimate) a structural econometric model of forecast behaviour, which has implications regarding the determinants of analysts' private information endowments and forecast accuracy over the forecast horizon. Copyright © 2010 John Wiley & Sons, Ltd.

6.
In this paper we consider a population where the state of each individual follows a Markov chain. If the population is recorded for only a few periods, it is still possible to estimate the transition matrix and to make projections into the far future. These forecasts are sensible if the chains are time homogeneous, but this is difficult to check when only a few periods are observed. We suggest a simple method to check this assumption, and obtain an upper bound on the length of time the process can have been time homogeneous. The method is also applied to a second-order Markov chain and to the mover-stayer model. AMS subject classification (1980): Primary 62M05, Secondary 62M20.
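A minimal sketch of the estimate-and-project step under the time-homogeneity assumption, on a toy panel (the paper's homogeneity check and upper bound are not reproduced here):

```python
import numpy as np

# Hypothetical panel: state of each individual observed over a few periods.
# States: 0, 1, 2. Rows are individuals, columns are consecutive periods.
panel = np.array([[0, 0, 1],
                  [1, 2, 2],
                  [0, 1, 1],
                  [2, 2, 2],
                  [1, 1, 0]])

n_states = 3
counts = np.zeros((n_states, n_states))
for row in panel:                      # pool transitions across individuals/periods
    for a, b in zip(row[:-1], row[1:]):
        counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)   # MLE of the transition matrix

# Projecting far into the future relies on time homogeneity: the state
# distribution after k more periods is pi0 @ P^k.
pi0 = np.bincount(panel[:, -1], minlength=n_states) / len(panel)
print("10-period projection:", pi0 @ np.linalg.matrix_power(P, 10))
```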

7.
Diffusion of new products may be deterred by consumers' uncertainties about how they will perform. This paper introduces a decision-theoretic framework for modeling the diffusion of consumables, in which consumers choose between a current and new product so as to maximize expected utility. Consumers that are sufficiently risk-averse delay adoption, and change their prior uncertainties in a Bayesian fashion using information generated by early adopters. Under certain assumptions about the underlying consumer choice process and the market dynamics, the result is logistic growth in the share of consumers that choose the new product. The model can be generalized by allowing for consumer heterogeneity with respect to performance of the new product. The paper concludes with a discussion of applications for market forecasting, design of market trials and other extensions.
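A rough illustration of the mechanism described above: consumers with heterogeneous risk aversion compare a certainty equivalent under a normal posterior against the current product, and a public signal from early adopters updates beliefs in Bayesian fashion. The utility specification and all parameters are assumptions for this sketch, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 5000, 25
true_quality, current_utility = 1.0, 0.8
risk_aversion = rng.uniform(0.1, 3.0, N)   # heterogeneous consumers

mu, tau2 = 0.9, 1.0     # shared normal prior on new-product quality
sigma2 = 0.5            # variance of the public signal from early adopters
adopted = np.zeros(N, dtype=bool)
share = []
for t in range(T):
    # Certainty equivalent of the risky new product under a normal posterior.
    ce = mu - 0.5 * risk_aversion * tau2
    adopted |= ce > current_utility      # sufficiently confident consumers switch
    share.append(adopted.mean())
    if adopted.any():
        # Early adopters' experience generates a noisy public signal;
        # everyone updates the common belief in Bayesian fashion.
        signal = true_quality + rng.normal(scale=np.sqrt(sigma2))
        prec = 1 / tau2 + 1 / sigma2
        mu = (mu / tau2 + signal / sigma2) / prec
        tau2 = 1 / prec

print("share path:", [round(s, 3) for s in share])
```

As beliefs sharpen, progressively more risk-averse consumers cross the adoption threshold, producing the S-shaped share path the abstract describes.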

8.
Agricultural productivity depends heavily on the cost of the energy required for cultivation, so prior knowledge of energy consumption is an important step for energy planning and policy development in agriculture. The aim of the present study is to evaluate the application potential of multiple linear regression (MLR) and machine learning tools such as support vector regression (SVR) and Gaussian process regression (GPR) for forecasting the agricultural energy consumption of Turkey. In the development of the models, widespread indicators such as agricultural value added, total arable land, the gross domestic product share of agriculture, and population were used as input parameters. Twenty-eight years of historical data, from 1990 to 2017, were utilized for the training and testing stages of the models. A Bayesian optimization method was applied to improve the prediction capability of the SVR and GPR models. The performance of the models was measured by various statistical tools. The results indicated that the Bayesian optimized GPR (BGPR) model with an exponential kernel function showed superior prediction capability over the MLR and Bayesian optimized SVR models. The root mean square error, mean absolute deviation, mean absolute percentage error, and coefficient of determination (R²) values for the BGPR model were 0.0022, 0.0005, 0.2041, and 0.9999 in the training phase and 0.0452, 0.0310, 7.7152, and 0.9677 in the testing phase, respectively. It can therefore be concluded that the proposed BGPR model is an efficient technique with the potential to predict agricultural energy consumption with high accuracy.
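A minimal scikit-learn sketch of such a GPR setup on stand-in data. Matern(nu=0.5) is the exponential kernel, and scikit-learn's built-in marginal-likelihood tuning of kernel hyperparameters stands in for the paper's Bayesian optimization; the data and split are hypothetical:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in data with the paper's input structure: value added, arable land,
# GDP share of agriculture, population -> energy consumption (all synthetic).
X = rng.normal(size=(28, 4))
y = X @ np.array([0.5, 0.2, 0.1, 0.3]) + 0.05 * rng.normal(size=28)

X_train, X_test, y_train, y_test = X[:22], X[22:], y[:22], y[22:]
scaler = StandardScaler().fit(X_train)

# Matern(nu=0.5) is the exponential kernel; hyperparameters are tuned by
# maximizing the marginal likelihood during fit().
kernel = 1.0 * Matern(nu=0.5) + WhiteKernel()
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=0)
gpr.fit(scaler.transform(X_train), y_train)

pred = gpr.predict(scaler.transform(X_test))
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print("test RMSE:", round(rmse, 4))
```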

9.
How do scientific innovations spread within and across scientific communities? In this paper, we propose a general account of the diffusion of scientific innovations. This account acknowledges that novel ideas must be elaborated on and conceptually translated before they can be adopted and applied to field-specific problems. We motivate our account by examining an exemplary case of knowledge diffusion, namely, the early spread of theories of rational decision-making. These theories were grounded in a set of novel mathematical tools and concepts that originated in John von Neumann and Oskar Morgenstern's Theory of Games and Economic Behavior (1944/1947) and subsequently spread widely across the social and behavioral sciences. Introducing a network-based diffusion measure, we trace the spread of those tools and concepts into distinct research areas. We furthermore present an analytically tractable typology for classifying publications according to their roles in the diffusion process. The proposed framework allows for a systematic examination of the conditions under which scientific innovations spread within and across preexisting and newly emerging scientific communities.

10.
The emergence of dimensional analysis in the early nineteenth century involved a redefinition of the pre-existing concepts of homogeneity and dimensions, which entailed a shift from a qualitative to a quantitative conception of these notions. Prior to the nineteenth century, these concepts had been used as criteria to assess the soundness of operations and relations between geometrical quantities. Notably, the terms in such relations were required to be homogeneous, which meant that they needed to have the same geometrical dimensions. The latter reflected the nature of the quantities in question, such as volume vs area. As natural philosophy came to encompass non-geometrical quantities, the need arose to generalize the concept of homogeneity. In 1822, Jean Baptiste Fourier consequently redefined it to be the condition an equation must satisfy in order to remain valid under a change of units, and the ‘dimension’ correspondingly became the power of a conversion factor. When these innovations eventually found an echo in France and Great Britain, in the second half of the century, tensions arose between the former, qualitative understanding of dimensions as reflecting the nature of physical quantities, and the new, quantitative conception based on unit conversion and measurement. The emergence of dimensional analysis thus provides a case study of how existing rules and concepts can find themselves redefined in the context of wider conceptual changes; in the present case this redefinition involved a generalization, but also a shift in meaning which led to conceptual tensions.

11.
Daily electricity consumption data, available almost in real time, can be used in Italy to estimate the level of industrial production in any given month before the month is over. We present a number of procedures that do this using electricity consumption in the first 14 days of the month. (This is an extension of a previous model that used monthly electricity data.) We show that, with a number of adjustments, a model using half-monthly electricity data generates acceptable estimates of the monthly production index. More precisely, these estimates are more accurate than univariate forecasts but less accurate than estimates based on monthly electricity data. A further improvement can be obtained by combining ‘half-monthly’ electricity-based estimates with univariate forecasts. We also present quarterly estimates and discuss confidence intervals for various types of forecasts.
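A toy example of the forecast-combination step mentioned above, weighting the electricity-based estimate and a univariate forecast by inverse mean squared error. All numbers are made up, and in practice the weights would be estimated on a separate evaluation window:

```python
import numpy as np

# Hypothetical monthly production index values and two competing estimates.
actual = np.array([100.2, 101.5, 99.8, 102.3, 103.1])
elec_based = np.array([100.0, 101.9, 99.5, 102.0, 103.5])   # electricity model
univariate = np.array([99.5, 100.8, 100.6, 101.2, 102.0])   # e.g. an ARIMA

mse_e = np.mean((elec_based - actual) ** 2)
mse_u = np.mean((univariate - actual) ** 2)
w = (1 / mse_e) / (1 / mse_e + 1 / mse_u)    # inverse-MSE combination weight
combined = w * elec_based + (1 - w) * univariate

print("weight on electricity model:", round(w, 3))
print("combined MSE:", round(np.mean((combined - actual) ** 2), 4))
```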

12.
13.
This is a case study of a closely managed product. Its purpose is to determine whether time-series methods can be appropriate for business planning. By appropriate, we mean two things: whether these methods can model and estimate the special events or features that are often present in sales data; and whether they can forecast accurately enough one, two and four quarters ahead to be useful for business planning. We use two time-series methods, Box-Jenkins modeling and Holt-Winters adaptive forecasting, to obtain forecasts of shipments of a closely managed product. We show how Box-Jenkins transfer-function models can account for the special events in the data. We develop criteria for choosing a final model which differ from the usual methods and are specifically directed towards maximizing the accuracy of next-quarter, next-half-year and next-full-year forecasts. We find that the best Box-Jenkins models give forecasts which are clearly better than those obtained from Holt-Winters forecast functions, and are also better than the judgmental forecasts of IBM's own planners. In conclusion, we judge that Box-Jenkins models can be appropriate for business planning, in particular for determining at the end of the year baseline business-as-usual annual and monthly forecasts for the next year, and in mid-year for resetting the remaining monthly forecasts.
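A condensed statsmodels sketch of the two-method comparison on synthetic quarterly data. The ARIMA order is illustrative rather than the transfer-function model identified in the paper, and the holdout mirrors the next-four-quarters accuracy criterion:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
# Hypothetical quarterly shipments with trend and seasonality.
n = 40
t = np.arange(n)
y = pd.Series(
    100 + 1.5 * t + 10 * np.sin(2 * np.pi * t / 4) + rng.normal(scale=3, size=n),
    index=pd.period_range("2015Q1", periods=n, freq="Q"))

train, test = y[:-4], y[-4:]          # hold out the last four quarters

# Box-Jenkins: a seasonal ARIMA chosen by the analyst (order is illustrative).
bj = ARIMA(train, order=(1, 1, 1), seasonal_order=(0, 1, 1, 4)).fit()
bj_fc = bj.forecast(steps=4)

# Holt-Winters: additive trend and seasonality.
hw = ExponentialSmoothing(train, trend="add", seasonal="add",
                          seasonal_periods=4).fit()
hw_fc = hw.forecast(4)

for name, fc in [("Box-Jenkins", bj_fc), ("Holt-Winters", hw_fc)]:
    rmse = np.sqrt(np.mean((fc.values - test.values) ** 2))
    print(f"{name} 4-quarter RMSE: {rmse:.2f}")
```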

14.
15.
Explanations implicitly end with something that makes sense, and begin with something that does not make sense. A statistical relationship, for example, a numerical fact, does not make sense; an explanation of this relationship adds something, such as causal information, which does make sense, and provides an endpoint for the sense-making process. Does social science differ from natural science in this respect? One difference is that in the natural sciences, models are what need “understanding.” In the social sciences, matters are more complex. There are models, such as causal models, which need to be understood, but also depend on background knowledge that goes beyond the model and the correlations that make it up, which produces a regress. The background knowledge is knowledge of in-filling mechanisms, which are normally made up of elements that involve the direct understanding of the acting and believing subjects themselves. These models, and social science explanations generally, are satisfactory only when they end the regress in this kind of understanding or use direct understanding evidence to decide between alternative mechanism explanations.

16.
Stemming from human accident, error, or neglect, technological disasters, such as chemical spills, toxic waste contamination, nuclear radiation, transportation accidents, and factory explosions, are products of the modern industrial complex. Toxic contamination of the land can permanently displace people from their homes and erase places from the landscape. Commemoration provides an opportunity to remember the past and celebrate culturally significant place attachments while contributing to the recovery process by aiding in community healing after devastating events. We focus on two key components regarding commemoration after technological disaster, namely the acknowledgement of wrongdoing and the celebration of a resilient population and landscape. We argue that a combination of ecofeminist philosophy and environmental justice frameworks allows for a better understanding of the cycle of disaster and mitigation as it pertains to targeted groups, and that commemorative acts and artifacts following human-made disasters often fail to successfully reform this cycle. Moreover, the combination of ecofeminist philosophy and environmental justice allows us to examine the complex relationship between responsibility and targeted groups through disaster commemoration, which serves as an important way to communicate wrongdoing to both the local and greater population. Through engagement with ecofeminist philosophy and environmental justice frameworks, we explicate how commemoration after technological disaster can disrupt or reinforce systematic inequalities.

17.
We introduce a new methodology for forecasting, which we call signal diffusion mapping. Our approach accommodates features of real‐world financial data which existing forecasting methodologies have historically ignored. Our method builds upon well‐established and accepted methods from other areas of statistical analysis, which we develop and adapt for use in forecasting. We also present tests of our model on data which demonstrate the efficacy of our approach. Copyright © 2015 John Wiley & Sons, Ltd.

18.
In 1904 Joachim published an influential paper dealing with ‘Aristotle's Conception of Chemical Combination’,1 which has provided the basis of much more recent studies.2 About the same time, Duhem3 developed what he regarded as an essentially Aristotelian view of chemistry, based on his understanding of phenomenological thermodynamics. He does not present a detailed textual analysis, but rather emphasises certain general ideas. Joachim's classic paper contains obscurities which I have been unable to fathom, and theses which do not seem to be fully explained, or which at least seem difficult for the modern reader to understand. An attempt is made here to provide a systematic account of the Aristotelian theory of the generation of substances by the mixing of elements, reconsidering Joachim's treatment in the light of the sort of points which most interested Duhem.

The work described in this paper was undertaken with a view to providing a basis for presenting, evaluating and criticising Duhem's understanding of what was for him modern (i.e. 19th-century) chemistry. This latter project will be taken up on another occasion. I hope the present paper will be of some value to a broader philosophical readership in so far as it provides a fairly clear conception of matter which might be called Aristotelian, even if it is not precisely Aristotle's, and raises certain clear problems of interpretation. It may also be of interest to historians of chemistry in suggesting an analysis of the old chemical notion of a mixt independent of atomic theories.

19.
This is the last in a series of three papers on the history of the Lenz–Ising model from 1920 to the early 1970s. In the first paper, I studied the invention of the model in the 1920s, while in the second paper, I documented a quite sudden change in the perception of the model in the early 1960s, when it was realized that the Lenz–Ising model is actually relevant for the understanding of phase transitions. In this article, which is self-contained, I study how this realization affected attempts to understand critical phenomena, which can be understood as limiting cases of (first-order) phase transitions, in the epoch from circa 1965 to 1970, when these phenomena were recognized as a research field in their own right. I focus on two questions: What kinds of insight into critical phenomena was the employment of the Lenz–Ising model thought to give? And how could a crude model, which the Lenz–Ising model was thought to be, provide this understanding? I document that the model played several roles: At first, it played a role analogous to experimental data: hypotheses about real systems, in particular relations between critical exponents and what is now called the hypothesis of scaling, which was advanced by Benjamin Widom and others, were confronted with numerical results for the model, in particular the model's so-called critical exponents. A positive result of a confrontation was seen as positive evidence for the hypothesis. The model was also used to gain insight into specific aspects of critical phenomena, for example that diverse physical systems exhibit similar behavior close to a critical point. Later, a more systematic program of understanding critical phenomena emerged that involved an explicit formulation of what it means to understand critical phenomena, namely, the elucidation of what features of the Hamiltonian of models lead to what kinds of behavior close to critical points. Attempts to accomplish this program culminated with the so-called hypothesis of universality, put forward independently by Robert B. Griffiths and Leo P. Kadanoff in 1970, which divided critical phenomena into classes with similar critical behavior. I also study the crucial role of the Lenz–Ising model in the development and justification of these ideas.

20.
The hedging of weather risks has become extremely relevant in recent years, promoting the diffusion of weather‐derivative contracts. The pricing of such contracts requires the development of appropriate models for the prediction of the underlying weather variables. Within this framework, a commonly used specification is the ARFIMA‐GARCH. We provide a generalization of such a model, introducing time‐varying memory coefficients. Our model satisfies the empirical evidence of the changing memory level observed in average temperature series, and provides useful improvements in the forecasting, simulation, and pricing issues related to weather derivatives. We present an application related to the forecast and simulation of a temperature index density, which is then used for the pricing of weather options. Copyright © 2011 John Wiley & Sons, Ltd.
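For intuition, the memory parameter d of an ARFIMA process enters through the fractional differencing filter (1 - L)^d, whose weights follow a simple recursion; a time-varying d, as proposed above, would let these weights change over the sample. A small sketch:

```python
import numpy as np

def fracdiff_weights(d, n):
    """First n weights of the fractional differencing filter (1 - L)^d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k   # standard binomial-expansion recursion
    return w

# The memory parameter d controls how slowly the filter weights decay.
# Illustrative values only:
for d in (0.1, 0.3, 0.45):
    print(f"d={d}: first weights", np.round(fracdiff_weights(d, 6), 4))
```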
