Similar Articles
A total of 20 similar articles were found.
1.
The recent discussion on scientific representation has focused on models and their relationship to the real world. It has been assumed that models give us knowledge because they represent their supposed real target systems. However, here agreement among philosophers of science has tended to end as they have presented widely different views on how representation should be understood. I will argue that the traditional representational approach is too limiting as regards the epistemic value of modelling given the focus on the relationship between a single model and its supposed target system, and the neglect of the actual representational means with which scientists construct models. I therefore suggest an alternative account of models as epistemic tools. This amounts to regarding them as concrete artefacts that are built by specific representational means and are constrained by their design in such a way that they facilitate the study of certain scientific questions, and learning from them by means of construction and manipulation.

2.
The present paper argues that ‘mature mathematical formalisms’ play a central role in achieving representation via scientific models. A close discussion of two contemporary accounts of how mathematical models apply—the DDI account (according to which representation depends on the successful interplay of denotation, demonstration and interpretation) and the ‘matching model’ account—reveals shortcomings of each, which, it is argued, suggests that scientific representation may be ineliminably heterogeneous in character. In order to achieve a degree of unification that is compatible with successful representation, scientists often rely on the existence of a ‘mature mathematical formalism’, where the latter refers to a—mathematically formulated and physically interpreted—notational system of locally applicable rules that derive from (but need not be reducible to) fundamental theory. As mathematical formalisms undergo a process of elaboration, enrichment, and entrenchment, they come to embody theoretical, ontological, and methodological commitments and assumptions. Since these are enshrined in the formalism itself, they are no longer readily obvious to either the novice or the proficient user. At the same time as formalisms constrain what may be represented, they also function as inferential and interpretative resources.

3.
The Quantum Hall Effects offer a rich variety of theoretical and experimental advances. They provide interesting insights on such topics as gauge invariance, strong interactions in Condensed Matter physics, and the emergence of new paradigms. This paper focuses on some related philosophical questions. Various brands of positivism or agnosticism are confronted with the physics of the Quantum Hall Effects. Hacking's views on Scientific Realism and Chalmers' on Non-Figurative Realism are discussed. It is argued that the difficulties with those versions of realism may be resolved within a dialectical materialist approach. The latter is argued to provide a rational approach to the phenomena, theory and ontology of the Quantum Hall Effects.

4.
The purpose of this paper is to investigate the applicability of a contemporary time series forecasting technique, transfer function modeling, to the problem of forecasting sectoral employment levels in small regional economies. The specific sectoral employment levels to be forecast are manufacturing, durable manufacturing, non-durable manufacturing and non-manufacturing employment. Due to data constraints at the small region level, construction of traditional causal econometric models is often very difficult; thus time series approaches become particularly attractive. The results suggest that transfer function models using readily available national indicator series as drivers can provide more accurate forecasts of small region sectoral employment levels than univariate time series models.
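A minimal sketch of the idea described in this abstract, using an ARIMAX-style model from statsmodels as a simple stand-in for a full transfer-function specification; the series names, simulated data, and ARIMA order are illustrative assumptions, not the paper's setup:

```python
# Hypothetical example: local sectoral employment driven by a national
# indicator series; all names, data and orders are illustrative assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 120
national_indicator = pd.Series(np.cumsum(rng.normal(0.1, 1.0, n)))
local_employment = 50 + 0.8 * national_indicator + rng.normal(0, 1.0, n)

# ARIMA(1,1,1) with the national series entering as an exogenous driver.
fit = SARIMAX(local_employment, exog=national_indicator, order=(1, 1, 1)).fit(disp=False)

# Forecast four periods ahead, supplying assumed future values of the driver.
future_driver = pd.Series([national_indicator.iloc[-1]] * 4)
print(fit.forecast(steps=4, exog=future_driver))
```

In a full Box-Jenkins transfer-function treatment, the driver would enter through an estimated lag polynomial rather than the single contemporaneous term used in this sketch.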

5.
“Colligation”, a term first introduced in philosophy of science by William Whewell (1840), today sparks a renewed interest beyond Whewell scholarship. In this paper, we argue that adopting the notion of colligation in current debates in philosophy of science can contribute to our understanding of scientific models. Specifically, studying colligation allows us to have a better grasp of how integrating diverse model components (empirical data, theory, useful idealization, visual and other representational resources) in a creative way may produce novel generalizations about the phenomenon investigated. Our argument is built both on the theoretical appraisal of Whewell’s philosophy of science and the historical rehabilitation of his scientific work on tides. Adopting a philosophy of science in practice perspective, we show how colligation emerged from Whewell’s empirical work on tides. The production of idealized maps (“cotidal maps”) illustrates the unifying and creative power of the activity of colligating in scientific practice. We show the importance of colligation in modelling practices more generally by looking at its epistemic role in the construction of the San Francisco Bay Model.

6.
It has been acknowledged that wavelets can constitute a useful tool for forecasting in economics. Through a wavelet multi‐resolution analysis, a time series can be decomposed into different timescale components and a model can be fitted to each component to improve the forecast accuracy of the series as a whole. Up to now, the literature on forecasting with wavelets has mainly focused on univariate modelling. On the other hand, in a context of growing data availability, a line of research has emerged on forecasting with large datasets. In particular, the use of factor‐augmented models has become quite widespread in the literature and among practitioners. The aim of this paper is to bridge the two strands of the literature. A wavelet approach for factor‐augmented forecasting is proposed and put to the test for forecasting GDP growth for the major euro area countries. The results show that the forecasting performance is enhanced when wavelets and factor‐augmented models are used together. Copyright © 2010 John Wiley & Sons, Ltd.
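A minimal sketch of how the two strands might be combined, on synthetic data; the wavelet choice ('db4'), decomposition level, number of factors, and the simple linear component models are illustrative assumptions, not the paper's method:

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
T, N = 128, 40
gdp_growth = rng.normal(0.4, 1.0, T)   # hypothetical target series
panel = rng.normal(size=(T, N))        # hypothetical large panel of predictors

# 1. Decompose the target into timescale components (multi-resolution analysis):
#    keep one block of wavelet coefficients at a time and reconstruct.
coeffs = pywt.wavedec(gdp_growth, 'db4', level=3)
components = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(kept, 'db4')[:T])

# 2. Extract common factors from the large dataset.
factors = PCA(n_components=3).fit_transform(panel)

# 3. Fit a factor-augmented model to each component and recombine the fits.
fitted = sum(LinearRegression().fit(factors, comp).predict(factors)
             for comp in components)
print(fitted[:5])
```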

7.
I propose a distinct type of robustness, which I suggest can support a confirmatory role in scientific reasoning, contrary to the usual philosophical claims. In model robustness, repeated production of the empirically successful model prediction or retrodiction against a background of independently-supported and varying model constructions, within a group of models containing a shared causal factor, may suggest how confident we can be in the causal factor and predictions/retrodictions, especially once supported by a variety-of-evidence framework. I present climate models of greenhouse gas global warming of the 20th Century as an example, and emphasize climate scientists' discussions of robust models and causal aspects. The account is intended as applicable to a broad array of sciences that use complex modeling techniques.

8.
We propose a framework to describe, analyze, and explain the conditions under which scientific communities organize themselves to do research, particularly within large-scale, multidisciplinary projects. The framework centers on the notion of a research repertoire, which encompasses well-aligned assemblages of the skills, behaviors, and material, social, and epistemic components that a group may use to practice certain kinds of science, and whose enactment affects the methods and results of research. This account provides an alternative to the idea of Kuhnian paradigms for understanding scientific change in the following ways: (1) it does not frame change as primarily generated and shaped by theoretical developments, but rather takes account of administrative, material, technological, and institutional innovations that contribute to change and explicitly questions whether and how such innovations accompany, underpin, and/or undercut theoretical shifts; (2) it thus allows for tracking of the organization, continuity, and coherence in research practices which Kuhn characterized as ‘normal science’ without relying on the occurrence of paradigmatic shifts and revolutions to be able to identify relevant components; and (3) it requires particular attention be paid to the performative aspects of science, whose study Kuhn pioneered but which he did not extensively conceptualize. We provide a detailed characterization of repertoires and discuss their relationship with communities, disciplines, and other forms of collaborative activities within science, building on an analysis of historical episodes and contemporary developments in the life sciences, as well as cases drawn from social and historical studies of physics, psychology, and medicine.

9.
In this paper, we argue that, contra Strevens (2013), understanding in the sciences is sometimes partially constituted by the possession of abilities; hence, it is not (in such cases) exhausted by the understander's bearing a particular psychological or epistemic relationship to some set of structured propositions. Specifically, the case will be made that one does not really understand why a modeled phenomenon occurred unless one has the ability to actually work through (meaning run and grasp at each step) a model simulation of the underlying dynamic.

10.
In this response, doubts are expressed relating to the treatment by Hoyningen-Huene and Oberheim of the relation between incommensurability and content comparison. A realist response is presented to their treatment of ontological replacement. Further questions are raised about the coherence of the neo-Kantian idea of the world-in-itself as well as the phenomenal worlds hypothesis. The notion of common sense is clarified. Meta-incommensurability is dismissed as a rhetorical device which obstructs productive discussion.

11.
This article addresses knowledge transfer dynamics in agent-based computational social science. The goal of the text is twofold. First, it describes the tensions arising from the convergence of different disciplinary traditions in the emergence of this new area of study and, second, it shows how these tensions are dealt with through the articulation of distinctive practices of knowledge production and transmission. To achieve this goal, three major instances of knowledge transfer dynamics in agent-based computational social science are analysed. The first instance is the emergence of the research field. Relations of knowledge transfer and cross-fertilisation between agent-based computational social science and wider, more established disciplinary areas (complexity science, computational science and social science) are discussed. The second instance is the approach to scientific modelling in the field. It is shown how the practice of agent-based modelling is affected by the conflicting coexistence of shared methodological commitments transferred from both empirical and formal disciplines. Lastly, the third instance pertains to internal practices of knowledge production and transmission. Through the discussion of these practices, the tensions arising from the convergence of dissimilar disciplinary traditions in agent-based computational social science are highlighted.

12.
Cumulative Sum techniques are widely used in quality control and model monitoring. A single-sided cusum may be regarded essentially as a sequence of sequential tests which, in many cases, such as those for the Exponential Family, is equivalent to a Sequence of Sequential Probability Ratio Tests. The relationship between cusums and Bayesian decisions is difficult to establish using conventional methods. An alternative approach is proposed which not only reveals a relation but also offers a very simple formulation of the decision process involved in model monitoring. This is first illustrated for a Normal mean and then extended to other important practical cases including Dynamic Models. For V-mask cusum graphs a particular feature is the interpretation of the distance of the V vertex from the latest plotted point in terms of the prior precision as measured in ‘equivalent’ observations.
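A minimal sketch of a single-sided cusum for detecting an upward shift in a Normal mean; the reference value k and decision threshold h are illustrative choices, and the Bayesian and V-mask formulations discussed in the paper are not reproduced here:

```python
import numpy as np

def one_sided_cusum(x, target, k, h):
    """Accumulate evidence of an upward mean shift; return the cusum path
    and the first index at which it crosses the threshold h (or None)."""
    s, path = 0.0, []
    for t, xt in enumerate(x):
        s = max(0.0, s + (xt - target - k))   # reset at zero, add excess over target + k
        path.append(s)
        if s > h:
            return np.array(path), t
    return np.array(path), None

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(0.0, 1.0, 50),   # in-control segment
                       rng.normal(1.0, 1.0, 50)])  # mean shifts upward at t = 50
path, alarm = one_sided_cusum(data, target=0.0, k=0.5, h=5.0)
print("signal raised at observation:", alarm)
```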

13.
The history of modern economics abounds with pleas for more pluralism as well as pleas for more unification. These seem to be contradictory goals, suggesting that pluralism and unification are mutually exclusive, or at least that they involve trade-offs with more of one necessarily being traded off against less of the other. This paper will use the example of Paul Samuelson's Foundations of Economic Analysis (1947) to argue that the relationship between pluralism and unification is often more complex than this simple dichotomy suggests. In particular, Samuelson's Foundations is invariably presented as a key text in the unification of modern economics during the middle of the twentieth century; and in many ways that is entirely correct. But Samuelson's unification was not at the theoretical (causal and explanatory) level, but rather at the purely mathematical derivational level. Although this fact is recognized in the literature on Samuelson, what seems to be less recognized is that for Samuelson, much of the motivation for this unification was pluralist in spirit: not to narrow scientific economics into one single theory, but rather to allow for more than one theory to co-exist under a single unified derivational technique. This hidden pluralism will be discussed in detail. The paper concludes with a discussion of the implications for more recent developments in economics.

14.
As a discipline distinct from ecology, conservation biology emerged in the 1980s as a rigorous science focused on protecting biodiversity. Two algorithmic breakthroughs in information processing made this possible: place-prioritization algorithms and geographical information systems. They provided defensible, data-driven methods for designing reserves to conserve biodiversity that obviated the need for largely intuitive and highly problematic appeals to ecological theory at the time. But the scientific basis of these achievements has been criticized, as has the claim that they constitute genuine scientific progress. We counter by pointing out important inaccuracies about the science and rejecting the apparent theory-first focus. More broadly, the case study reveals significant limitations of the predominant epistemic-semantic conceptions of scientific progress and the considerable merits of pragmatic, practically-oriented accounts.

15.
This paper uses a meta‐analysis to survey existing factor forecast applications for output and inflation and assesses what causes large factor models to perform better or more poorly at forecasting than other models. Our results suggest that factor models tend to outperform small models, whereas factor forecasts are slightly worse than pooled forecasts. Factor models deliver better predictions for US variables than for UK variables, for US output than for euro‐area output and for euro‐area inflation than for US inflation. The size of the dataset from which factors are extracted positively affects the relative factor forecast performance, whereas pre‐selecting the variables included in the dataset did not improve factor forecasts in the past. Finally, the factor estimation technique may matter as well. Copyright © 2008 John Wiley & Sons, Ltd.

16.
This is a case study of a closely managed product. Its purpose is to determine whether time-series methods can be appropriate for business planning. By appropriate, we mean two things: whether these methods can model and estimate the special events or features that are often present in sales data; and whether they can forecast accurately enough one, two and four quarters ahead to be useful for business planning. We use two time-series methods, Box-Jenkins modeling and Holt-Winters adaptive forecasting, to obtain forecasts of shipments of a closely managed product. We show how Box-Jenkins transfer-function models can account for the special events in the data. We develop criteria for choosing a final model which differ from the usual methods and are specifically directed towards maximizing the accuracy of next-quarter, next-half-year and next-full-year forecasts. We find that the best Box-Jenkins models give forecasts which are clearly better than those obtained from Holt-Winters forecast functions, and are also better than the judgmental forecasts of IBM's own planners. In conclusion, we judge that Box-Jenkins models can be appropriate for business planning, in particular for determining at the end of the year baseline business-as-usual annual and monthly forecasts for the next year, and in mid-year for resetting the remaining monthly forecasts.
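A minimal sketch of the kind of comparison the study performs, on simulated quarterly data rather than the IBM shipment series; the ARIMA orders and Holt-Winters settings are illustrative assumptions, and the special-event (transfer-function) terms discussed in the paper are omitted:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(3)
quarters = pd.period_range("2015Q1", periods=40, freq="Q").to_timestamp()
seasonal = np.tile([5.0, -2.0, 1.0, -4.0], 10)
shipments = pd.Series(100 + 0.5 * np.arange(40) + seasonal + rng.normal(0, 2, 40),
                      index=quarters)

train = shipments[:-4]   # hold out the last four quarters

# Box-Jenkins-style seasonal ARIMA versus Holt-Winters additive smoothing.
arima = SARIMAX(train, order=(1, 1, 1), seasonal_order=(0, 1, 1, 4)).fit(disp=False)
hw = ExponentialSmoothing(train, trend="add", seasonal="add",
                          seasonal_periods=4).fit()

print("ARIMA forecast:       ", arima.forecast(4).values)
print("Holt-Winters forecast:", hw.forecast(4).values)
print("Held-out actuals:     ", shipments[-4:].values)
```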

17.
Many publications on tourism forecasting have appeared during the past twenty years. The purpose of this article is to organize and summarize that scattered literature. General conclusions are also drawn from the studies to help those wishing to develop tourism forecasts of their own. The forecasting techniques discussed include time series models, econometric causal models, the gravity model and expert-opinion techniques. The major conclusions are that time series models are the simplest and least costly (and therefore most appropriate for practitioners); the gravity model is best suited to handle international tourism flows (and will be most useful to governments and tourism agencies); and expert-opinion methods are useful when data are unavailable. Further research is needed on the use of economic indicators in tourism forecasting, on the development of attractivity and emissiveness indexes for use in gravity and econometric models and on empirical comparisons among the different methods.

18.
The problem of establishing intensional criteria to demarcate science from non-science, and in particular science from pseudoscience, received a great deal of attention in 20th-century philosophy of science. It remains unsolved. This article compares demarcation criteria found in Marcus Tullius Cicero’s rejection of genethliac astrology and other pseudo-divinatory techniques in his De divinatione (44 BCE) with criteria advocated by a broad selection of modern philosophers of science and other specialists in science studies. Remarkable coincidences across two millennia are found on five basic criteria, which hints at a certain historical stability of some of the most fundamental features of a concept of “science” broadly construed.

19.
Brand choice decisions with multiple alternatives have been successfully modelled for more than a decade using the Multinomial Logit model. Recently, neural network modelling has received increasing attention and has been applied to an array of marketing problems such as market response or segmentation. We show that a Feedforward Neural Network with Softmax output units and shared weights can be viewed as a generalization of the Multinomial Logit model. The main difference between the two approaches lies in the ability of neural networks to model non‐linear preferences with few (if any) a priori assumptions about the nature of the underlying utility function, while the Multinomial Logit can suffer from a specification bias. Being complementary, these approaches are combined into a single framework. The neural network is used as a diagnostic and specification tool for the Logit model, which will provide interpretable coefficients and significance statistics. The method is illustrated on an artificial dataset where the market is heterogeneous. We then apply the approach to panel scanner data of purchase records, using the Logit to analyse the non‐linearities detected by the neural network. Copyright © 2000 John Wiley & Sons, Ltd.
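A minimal sketch of the relationship described above, on a synthetic choice dataset; the covariates, utilities, and network size are illustrative assumptions, and scikit-learn stands in for the paper's estimation framework:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
n = 1000
X = np.column_stack([rng.normal(size=n),        # e.g. relative price (hypothetical)
                     rng.integers(0, 2, n)])    # e.g. promotion flag (hypothetical)

# Random-utility data-generating process: linear utilities plus Gumbel noise,
# which yields multinomial-logit choice probabilities among three brands.
utility = np.column_stack([X[:, 0], -X[:, 0] + 0.5 * X[:, 1], np.zeros(n)])
brand = (utility + rng.gumbel(size=(n, 3))).argmax(axis=1)

# Multinomial logit: a softmax over linear-in-parameters utilities
# (the default lbfgs solver fits the multinomial form directly).
mnl = LogisticRegression(max_iter=1000).fit(X, brand)

# Feedforward network with one hidden layer; its softmax output layer makes
# it a non-linear generalization of the logit above.
nn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, brand)

print("Multinomial logit accuracy:", mnl.score(X, brand))
print("Neural network accuracy:   ", nn.score(X, brand))
```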

20.
In a recent paper, Otávio Bueno (2012) introduced a narrower understanding of Hacking's concept of styles of scientific reasoning. Although its ultimate goal is to serve a pluralist view of science, Bueno's proposal is a thought-provoking attempt at outlining a concept of style that would keep most of the original understanding's heuristic value, while providing some analytical grip on the specific details of particular scientific practices. In this reply, I consider solely this latter more proximate goal. More precisely, I assess whether or not Bueno's narrower understanding of styles could provide historians and philosophers of science with a workable unit to investigate particular transformations in scientific practices. While the author's proposal is certainly interesting overall, the usefulness of the unit it describes may be compromised by three shortcomings: (1) the extent to which the unit is meant to be narrower is indeterminate; (2) it does not improve much on the analytical capabilities of Hacking's concept; and (3) like Hacking's concept, it is rather powerless to capture the dynamical character of particular scientific practices.
