20 similar documents found; search time: 15 ms
1.
A distinction is made between theory-driven and phenomenological models. It is argued that phenomenological models are significant means by which theory is applied to phenomena. They act both as sources of knowledge about their target systems and as explanations of the latter's behavior. A version of the shell model of nuclear structure is analyzed, and it is explained why such a model cannot be understood as being subsumed under the theory structure of Quantum Mechanics. Thus its representational capacity does not stem from its close link to theory. It is shown that the shell model yields knowledge about the target and is explanatory of certain behaviors of nuclei. Aspects of the process by which the shell model acquires its representational capacity are analyzed. It is argued that these point to the conclusion that the representational status of the model is a function of its capacity to function as a source of knowledge and its capacity to postulate and explain underlying mechanisms that give rise to the observed behavior of its target.
2.
In this paper, we develop and refine the idea that understanding is a species of explanatory knowledge. Specifically, we defend the idea that S understands why p if and only if S knows that p, and, for some q, S’s true belief that q correctly explains p is produced/maintained by reliable explanatory evaluation. We then show how this model explains the reception of James Bjorken’s explanation of scaling by the broader physics community in the late 1960s and early 1970s. The historical episode is interesting because Bjorken’s explanation initially did not provide understanding to other physicists, but was subsequently deemed intelligible when Feynman provided a physical interpretation that led to experimental tests that vindicated Bjorken’s model. Finally, we argue that other philosophical models of scientific understanding are best construed as limiting cases of our more general model.
3.
During the 1930s and 1940s, American physical organic chemists employed electronic theories of reaction mechanisms to construct models offering explanations of organic reactions. But two molecular rearrangements presented enormous challenges to model construction. The Claisen and Cope rearrangements were largely inaccessible to experimental investigation, and they confounded explanation in theoretical terms. Drawing on the idea that models can be autonomous agents in the production of scientific knowledge, I argue that one group of models in particular was functionally autonomous from the Hughes–Ingold theory. Cope and Hardy’s models of the Claisen and Cope rearrangements were resources for the exploration of the Hughes–Ingold theory, which otherwise lacked explanatory power. By generating ‘how-possibly’ explanations, these models explained how these rearrangements could happen rather than why they did happen. Furthermore, although these models were apparently closely connected to theory in terms of their construction, I argue that their partial autonomy arose from extra-logical factors concerning the attitudes of American chemists to the Hughes–Ingold theory. In the absence of a complete theoretical hegemony, a degree of consensus was reached concerning modelling of the Claisen rearrangement mechanism.
4.
Uskali Mäki 《Studies in history and philosophy of science》2009,40(2):185-195
A newly emerged field within economics, known as geographical economics, claims to have provided a unified approach to the study of spatial agglomerations at different spatial scales by showing how these can be traced back to the same basic economic mechanisms. We analyse this contemporary episode of explanatory unification in relation to major philosophical accounts of unification. In particular, we examine the role of argument patterns in unifying derivations, the role of ontological convictions and mathematical structures in shaping unification, the distinction between derivational and ontological unification, the issue of how explanation and unification relate, and finally the idea that unification comes in degrees.
5.
When evaluating the launch of a new product or service, forecasts of the diffusion path and the effects of the marketing mix are critically important. Currently no unified framework exists to provide guidelines on the inclusion and specification of marketing mix variables in models of innovation diffusion. The objective of this research is to examine empirically the role of prices in diffusion models, in order to establish whether price can be incorporated effectively into the simpler time-series models. Unlike existing empirical research, which examines the models' fit to historical data, we examine the predictive validity of alternative models. Only if the incorporation of prices improves the predictive performance of diffusion models can it be argued that these models have validity. A series of diffusion models which include prices are compared against a number of well-accepted diffusion models, including the Bass (1969) model and more recently developed ‘flexible’ diffusion models. For short data series and long lead-time forecasting, as is typical in practical applications, price rarely added to the forecasting capability of the simpler time-series models. Copyright © 1998 John Wiley & Sons, Ltd.
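For reference, the Bass (1969) model named above has a closed-form cumulative-adoption curve; a minimal sketch, where the parameter values for p (innovation), q (imitation), and m (market potential) are illustrative assumptions rather than estimates from the paper:

```python
import math

def bass_cumulative(t, p, q, m):
    """Cumulative adopters at time t under the Bass (1969) diffusion model:
    N(t) = m * (1 - exp(-(p+q)t)) / (1 + (q/p) * exp(-(p+q)t))."""
    e = math.exp(-(p + q) * t)
    return m * (1 - e) / (1 + (q / p) * e)

# Illustrative parameter values only (typical orders of magnitude).
p, q, m = 0.03, 0.38, 100_000
adopters = [bass_cumulative(t, p, q, m) for t in range(11)]
```

Price (or other marketing-mix variables) is usually grafted onto this curve by scaling the hazard p + q * N(t)/m with a price index, which is the kind of extension whose predictive payoff the paper tests.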
6.
The role of p53 in tumour suppression: lessons from mouse models
The use of mouse models has greatly contributed to our understanding of the role of p53 in tumour suppression. Mice homozygous for a deletion in the p53 gene develop tumours at high frequency, providing essential evidence for the importance of p53 as a tumour suppressor. Additionally, crossing these knockout mice, or transgenic mice expressing dominant-negative p53 alleles, with other tumour-prone mouse strains has allowed the effect of p53 loss on tumour development to be examined further. In a variety of mouse models, absence of p53 facilitates tumorigenesis, thus providing a means to study how the lack of p53 enhances tumour development and to define genetic pathways of p53 action. Depending on the particular model system, loss of p53 results in either deregulated cell-cycle entry or aberrant apoptosis (programmed cell death), confirming results found in cell culture systems and providing insight into the in vivo function of p53. Finally, as p53-null mice rapidly develop tumours, they are useful for evaluating agents for either chemopreventative or therapeutic activity.
7.
M. A. Persinger 《Cellular and molecular life sciences : CMLS》1987,43(1):39-48
Research concerning the complex relation between weather and psychological processes has emphasized three important issues: methodological problems, the determination of the major behavioral factors, and the isolation of neurobiological mechanisms. This paper reviews the current status of each issue. Weather changes are most frequently associated with behaviors that are the endpoints of inferred psychological processes, including mood, subclinical pain, anxiety, and the correlates of schedule shifts. Learning and conditioning appear to mediate a powerful influence over weather-related responses, which may explain the large individual variability in these behaviors. The best-known group effects associated with weather changes involve psychiatric populations. Clinical subpopulations may respond in different ways to different aspects of the same weather system, as well as to different types of air masses. Likely neurobiological mechanisms through which meteorogenic stimuli may mediate whole-organism effects include the locus coeruleus and limbic systems. Expected psychobiological consequences are examined in detail. The magnitude and temporal-spatial characteristics of weather effects indicate that they are the subject matter of behavioral epidemiology.
8.
Behavioral economics is a field of study that is often thought of as interdisciplinary, insofar as it uses psychological insights to inform economic models. Yet the level of conceptual and methodological exchange between the two disciplines is disputed in the literature. On the one hand, behavioral economic models are often presented as psychologically informed models of individual decision-making (Camerer & Loewenstein, 2003). On the other hand, these models have often been criticized for being merely more elaborated “as if” economic models (Berg & Gigerenzer, 2010). The aim of this paper is to contribute to this debate by looking at a central topic in behavioral economics: the case of social preferences. Have findings or research methods been exchanged between psychology and economics in this research area? Have scientists with different backgrounds “travelled” across domains, thus transferring their expertise from one discipline to another? By addressing these and related questions, this paper will assess the level of knowledge transfer between psychology and economics in the study of social preferences.
9.
From sunspots to the Southern Oscillation: confirming models of large-scale phenomena in meteorology
Christopher Pincock 《Studies in history and philosophy of science》2009,40(1):45-56
The epistemic problem of assessing the support that some evidence confers on a hypothesis is considered using an extended example from the history of meteorology. In this case, and presumably in others, the problem is to develop techniques of data analysis that will link the sort of evidence that can be collected to hypotheses of interest. This problem is solved by applying mathematical tools to structure the data and connect them to the competing hypotheses. I conclude that mathematical innovations provide crucial epistemic links between evidence and theories precisely because the evidence and theories are mathematically described.
10.
S. G. Hall 《Journal of forecasting》1986,5(4):205-215
This paper considers the consequences of the stochastic error process in large non-linear forecasting models. As such models are non-linear, the deterministic forecast is neither the mean nor the mode of the density function of the endogenous variables. Under a specific assumption as to the class of the non-linearity it is shown that the deterministic forecast is actually the vector of marginal medians of the density function. Stochastic simulation techniques are then used to test whether one large forecasting model actually lies within this class.
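The point that a deterministic forecast of a non-linear model need not be the mean can be illustrated with a toy model, y = exp(x) with Gaussian x; the functional form, parameters, and seed are assumptions for illustration, not the forecasting model studied in the paper:

```python
import math
import random
import statistics

random.seed(0)
mu, sigma = 1.0, 0.5

# Deterministic forecast: solve the model with the error term set to zero.
deterministic = math.exp(mu)

# Stochastic simulation: draw the error term and solve repeatedly.
draws = [math.exp(random.gauss(mu, sigma)) for _ in range(20_000)]

sim_mean = statistics.fmean(draws)
sim_median = statistics.median(draws)
# Here the deterministic forecast tracks the median of the forecast
# density, while the mean exceeds it by roughly a factor exp(sigma**2 / 2).
```

For this log-normal case the deterministic forecast coincides with the median, exactly the marginal-median property the paper establishes for its class of non-linearities.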
11.
This paper compares the properties of a structural model—the London Business School model of the U.K. economy—with those of a time series model. Information provided by this type of comparison is a useful diagnostic tool for detecting model misspecification. This is a more meaningful way of proceeding than attempting to establish the superiority of one type of model over the other. In lieu of a better structural model, the effects of inappropriate dynamic specification can be reduced by combining the forecasts of the structural and time series models. For many of the variables considered here, the combined forecasts are more accurate than those of either model type alone.
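The combination idea can be sketched with the simplest pooling rule, an equal-weight average of two forecast series; the numbers below are hypothetical, not output of the London Business School or any time series model:

```python
import statistics

# Hypothetical outturns and two competing forecast series (illustrative).
actual     = [2.1, 1.8, 2.5, 3.0, 2.2]
structural = [2.5, 1.5, 2.9, 2.6, 2.6]
timeseries = [1.6, 2.2, 2.0, 3.5, 1.9]

# Equal-weight combination of the two forecasts.
combined = [(s + t) / 2 for s, t in zip(structural, timeseries)]

def mse(forecast, outturn):
    """Mean squared forecast error."""
    return statistics.fmean((f - a) ** 2 for f, a in zip(forecast, outturn))

struct_mse = mse(structural, actual)
ts_mse = mse(timeseries, actual)
combined_mse = mse(combined, actual)
```

The gain here comes from the two error series offsetting each other; as the abstract notes, in practice the improvement holds for many but not all variables.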
12.
David Gabauer 《Journal of forecasting》2020,39(5):788-796
This study introduces volatility impulse response functions (VIRF) for dynamic conditional correlation–generalized autoregressive conditional heteroskedasticity (DCC-GARCH) models. In addition, the implications for network analysis—using the connectedness approach of Diebold and Yılmaz (Journal of Econometrics, 2014, 182(1), 119–134)—are discussed. The main advantages of this framework are (i) that the time-varying dynamics do not rely on a rolling-window approach and (ii) that it allows us to test whether the propagation mechanism is time varying or not. An empirical analysis of the volatility transmission mechanism across foreign exchange rate returns is illustrated. The results indicate that the Swiss franc and the euro are net transmitters of shocks, whereas the British pound and the Japanese yen are net receivers of volatility shocks. Finally, the findings suggest a high degree of comovement across European currencies, which has important portfolio and risk management implications.
13.
Icaro Romolo Sousa Agostino, Wesley Vieira da Silva, Claudimar Pereira da Veiga, Adriano Mendonça Souza 《Journal of forecasting》2020,39(7):1043-1056
The purpose of this paper is to present the result of a systematic literature review regarding the application and development of forecasting models in the industrial context, especially the context of manufacturing processes and operations management. The study was conducted considering the preparation of an established research protocol to know, discuss, and analyze the main approaches adopted by researchers in the field. To achieve this objective, we analyzed 354 recent papers published in periodicals between 2008 and 2018. This paper makes three main contributions to the field: (i) it presents an updated portfolio of prediction models in the industrial context, providing a reference point for researchers and industrial managers; (ii) it presents a characterization of the field of study through the identification of publication vehicles, frequency, and the principal authors and countries related to the development of research on the theme; (iii) it proposes a unified framework, listing the characteristics of the prediction models with their respective application contexts, identifying the current research directions to provide theoretical aids for the development of new approaches to forecasting in industry. The results of this study provide an empirical base for further discussions on studies that focus on forecasting in the industrial context.
14.
Stephen K. McNees 《Journal of forecasting》1982,1(1):37-48
This article stresses how little is known about the quality, particularly the relative quality, of macroeconometric models. Most economists make a strict distinction between the quality of a model per se and the accuracy of solutions based on that model. While this distinction is valid, it leaves unanswered how to compare the ‘validity’ of conditional models. The standard test, the accuracy of ex post simulations, is not definitive when models with differing degrees of exogeneity are compared. In addition, it is extremely difficult to estimate the relative quantitative importance of conceptual problems of models, such as parameter instability across ‘policy regimes’. In light of the difficulty of comparing conditional macroeconometric models, many model-builders and users assume that the best models are those that have been used to make the most accurate forecasts, and that the most accurate forecasts are those made with the best models. Forecasting experience indicates that forecasters using macroeconometric models have produced more accurate macroeconomic forecasts than either naive or sophisticated unconditional statistical models. It also suggests that judgementally adjusted forecasts have been more accurate than model-based forecasts generated mechanically. The influence of econometrically based forecasts is now so pervasive that it is difficult to find examples of ‘purely judgemental’ forecasts.
15.
I analyse the case of three Japanese–Portuguese interpreters who have given support to technology transfer from a steel company in Japan to one in Brazil for more than thirty years. Their job requires them to be ‘interactional experts’ in steel-making. The Japanese–Portuguese interpreters are immersed in more than the language of steel-making, as their job involves a great deal of ‘physical contiguity’ with steel-making practice. Physical contiguity undoubtedly makes the acquisition of interactional expertise easier. This draws attention to the lack of empirical work on the exact way that the physical and the linguistic interact in the acquisition of interactional expertise, or any other kind of expertise.
16.
Sarita D. Lee, Andy A. Shen, Junhyung Park, Ryan J. Harrigan, Nicole A. Hoff, Anne W. Rimoin, Frederic Paik Schoenberg 《Journal of forecasting》2022,41(1):201-210
Point process models, such as Hawkes and recursive models, have recently been shown to offer improved accuracy over more traditional compartmental models for the purposes of modeling and forecasting the spread of disease epidemics. To explicitly test the performance of these two models in a real-world and ongoing epidemic, we compared the fit of Hawkes and recursive models to outbreak data on Ebola virus disease (EVD) in the Democratic Republic of the Congo in 2018–2020. The models were estimated, and the forecasts were produced, time-stamped, and stored in real time, so that their prospective value can be assessed and to guard against potential overfitting. The fit of the two models was similar, with both models resulting in much smaller errors in the beginning and waning phases of the epidemic and with slightly smaller error sizes on average for the Hawkes model compared with the recursive model. Our results suggest that both Hawkes and recursive point process models can be used in near real time during the course of an epidemic to help predict future cases and inform management and mitigation strategies.
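The Hawkes models compared above are built on a self-exciting conditional intensity, in which each observed case raises the rate of future cases; a minimal sketch with an exponential triggering kernel, where the parameter values and event times are illustrative assumptions rather than estimates from the EVD data:

```python
import math

def hawkes_intensity(t, history, mu, alpha, beta):
    """Conditional intensity of a Hawkes process with exponential kernel:
    lambda(t) = mu + sum_i alpha * beta * exp(-beta * (t - t_i)),
    where alpha is the expected number of secondary events per event
    and beta controls how fast each event's influence decays."""
    excitation = sum(
        alpha * beta * math.exp(-beta * (t - ti)) for ti in history if ti < t
    )
    return mu + excitation

# Illustrative: baseline rate mu, subcritical branching ratio alpha < 1,
# kernel decay rate beta, and three past event times.
mu, alpha, beta = 0.5, 0.8, 1.2
events = [1.0, 2.5, 3.0]
lam = hawkes_intensity(4.0, events, mu, alpha, beta)
```

In the epidemic setting alpha plays the role of a reproduction number; the recursive model the paper compares against lets that productivity vary with the current rate of cases instead of keeping it fixed.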
17.
On-line prediction of electric load in the buses of the EHV grid of a power generation and transmission system is basic information required by on-line procedures for centralized advanced dispatching of power generation. This paper presents two alternative approaches to on-line short term forecasting of the residual component of the load obtained after the removal of the base load from a time series of total load. The first approach involves the use of stochastic ARMA models with time-varying coefficients. The second consists in the use of an extension of Wiener filtering due to Zadeh and Ragazzini. Real data representing a load process measured in an area of Northern Italy and simulated data reproducing a non-stationary process with known characteristics constitute the basis of a numerical comparison allowing one to determine under which conditions each method is more appropriate.
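The first approach, autoregressive coefficients that drift over time, can be caricatured by recursive least squares with a forgetting factor that discounts old observations; a minimal AR(1) sketch, where the data-generating process, seed, and forgetting factor are illustrative assumptions, not the paper's load data or estimator:

```python
import random

def rls_ar1(series, lam=0.99, phi0=0.0, p0=10.0):
    """Track a (possibly time-varying) AR(1) coefficient phi by recursive
    least squares; the forgetting factor lam < 1 discounts old data so the
    estimate can follow a drifting coefficient."""
    phi, p = phi0, p0
    path = []
    for x_prev, x in zip(series, series[1:]):
        gain = p * x_prev / (lam + p * x_prev ** 2)
        phi = phi + gain * (x - phi * x_prev)   # prediction-error update
        p = (p - gain * x_prev * p) / lam       # discounted covariance
        path.append(phi)
    return path

# Illustrative data: AR(1) with a constant true coefficient of 0.6.
random.seed(42)
xs = [0.0]
for _ in range(500):
    xs.append(0.6 * xs[-1] + random.gauss(0.0, 0.5))

phi_path = rls_ar1(xs)
```

With lam = 0.99 the effective memory is on the order of 100 observations, which is the knob that trades tracking speed against estimation noise when the coefficients genuinely vary.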
18.
Georgios Tsiotas 《Journal of forecasting》2020,39(2):296-312
Value at risk (VaR) is a risk measure widely used by financial institutions in allocating risk. VaR forecast estimation involves the conditional evaluation of quantiles based on the currently available information. Recent advances in VaR evaluation incorporate a proxy for conditional variance, yielding the conditional autoregressive VaR (CAViaR) models. However, early work in the finance literature has shown that the introduction of power transformations results in improvements in volatility forecasting. Given the direct association between volatility and conditional VaR, we adopt power-transformed CAViaR models. We investigate whether the flexible conditional VaR dynamics associated with power-transformed CAViaR models can produce better forecasting results than those of the nontransformed CAViaR models. Estimation in CAViaR models is based on an early-rejection Markov chain Monte Carlo algorithm. We illustrate our forecasting evaluation results using simulated and financial daily return data series. The results provide strong evidence supporting the use of power-transformed CAViaR models when forecasting VaR.
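The symmetric absolute value (SAV) recursion from Engle and Manganelli's CAViaR family, and one simple power-transformed variant, can be sketched as follows; the coefficients and return series are illustrative assumptions, and the transform shown is just one way to nest SAV at delta = 1, not necessarily this paper's exact specification:

```python
def caviar_sav(returns, beta0, beta1, beta2, var0):
    """Symmetric absolute value (SAV) CAViaR recursion:
    VaR_t = beta0 + beta1 * VaR_{t-1} + beta2 * |r_{t-1}|."""
    var = [var0]
    for r in returns[:-1]:
        var.append(beta0 + beta1 * var[-1] + beta2 * abs(r))
    return var

def caviar_power(returns, beta0, beta1, beta2, var0, delta):
    """A power-transformed variant: run the recursion on VaR_t**delta and
    |r_{t-1}|**delta, then map back; it reduces to SAV at delta = 1."""
    var = [var0]
    for r in returns[:-1]:
        v = beta0 + beta1 * var[-1] ** delta + beta2 * abs(r) ** delta
        var.append(v ** (1.0 / delta))
    return var

# Illustrative coefficients and daily returns (not estimates from data).
returns = [0.5, -1.2, 0.8, -0.3]
var_sav = caviar_sav(returns, 0.1, 0.8, 0.3, 1.0)
```

Freeing delta is what gives the power-transformed family its extra flexibility; the paper's question is whether that flexibility pays off out of sample.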
19.
This paper presents the system of analysis used by the Xerox Corporation to relate the external environment to company decisions. The system is sophisticated and elaborate, and comes to grips with such issues as product forecasting, market monitoring, activity monitoring, materials and labour cost analysis, and product price analysis. In addition, the system examines the longer-term issues associated with corporate strategy, with the more recent initiatives directed toward the strategic focus. The Xerox case illustrates very well how externally provided forecasts of economic environments, both at home and abroad, can be used as inputs to a variety of econometric products to serve the individual corporation. The challenge in this work is to build the bridges from the external forces to the critical company decisions. That is a task which requires sophisticated tools and skilled professionals to accomplish. This case study shows what can be done.
20.
The development of nineteenth-century geodetic measurement challenges the dominant coherentist account of metric success. Coherentists argue that measurements of a parameter are successful if their numerical outcomes converge across varying contextual constraints. Aiming at numerical convergence, in turn, offers an operational aim for scientists to solve problems of coordination. Geodesists faced such a problem of coordination between two indicators of the earth's polar flattening, both of which were based on imperfect ellipsoid models. While not achieving numerical convergence, their measurements produced novel data that grounded valuable theoretical hypotheses. Consequently, they ought to be regarded as epistemically successful. This insight warrants a dynamic revision of coherentism, which allows one to judge the success of a metric based on both its coherence and its fruitfulness. On that view, scientific measurement aims to coordinate theoretical definitions and to produce novel data and theoretical insights.