Similar Literature
20 similar documents found.
1.
The critics of rational choice theory (RCT) frequently build on the contrast between so-called thick and thin applications of RCT to argue that thin RCT lacks the potential to explain the choices of real-world agents. In this paper, I draw on often-cited RCT applications in several decision sciences to demonstrate that despite this prominent critique there are at least two different senses in which thin RCT can explain real-world agents’ choices. I then defend this thesis against the most influential objections put forward by the critics of RCT. In doing so, I explicate the implications of my thesis for the ongoing philosophical debate concerning the explanatory potential of RCT and the comparative merits of widely endorsed accounts of explanation.

2.
Scientists often diverge widely when choosing between research programs. This can seem to be rooted in disagreements about which of several theories, competing to address shared questions or phenomena, is currently the most epistemically or explanatorily valuable—i.e. most successful. But many such cases are actually more directly rooted in differing judgments of pursuit-worthiness, concerning which theory will be best down the line, or which addresses the most significant data or questions. Using case studies from 16th-century astronomy and 20th-century geology and biology, I argue that divergent theory choice is thus often driven by considerations of scientific process, even where direct epistemic or explanatory evaluation of its final products appears more relevant. Broadly following Kuhn's analysis of theoretical virtues, I suggest that widely shared criteria for pursuit-worthiness function as imprecise, mutually-conflicting values. However, even Kuhn and others sensitive to pragmatic dimensions of theory ‘acceptance’, including the virtue of fruitfulness, still commonly understate the role of pursuit-worthiness—especially by exaggerating the impact of more present-oriented virtues, or failing to stress how ‘competing’ theories excel at addressing different questions or data. This framework clarifies the nature of the choice and competition involved in theory choice, and the role of alternative theoretical virtues.

3.
Claims that the standard procedure for testing scientific theories is inapplicable to Everettian quantum theory, and hence that the theory is untestable, are due to misconceptions about probability and about the logic of experimental testing. Refuting those claims by correcting those misconceptions leads to an improved theory of scientific methodology (based on Popper's) and testing, which allows various simplifications, notably the elimination of everything probabilistic from the methodology (‘Bayesian’ credences) and from fundamental physics (stochastic processes).

4.
A question at the intersection of scientific modeling and public choice is how to deal with uncertainty about model predictions. This “high-level” uncertainty is necessarily value-laden, and thus must be treated as irreducibly subjective. Nevertheless, formal methods of uncertainty analysis should still be employed for the purpose of clarifying policy debates. I argue that such debates are best informed by models which integrate objective features (which model the world) with subjective ones (modeling the policy-maker). This integrated subjectivism is illustrated with a case study from the literature on monetary policy. The paper concludes with some morals for the use of models in determining climate policy.
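As an illustrative sketch only (none of the numbers or parameter names below come from the paper), the "integrated subjectivism" described in the abstract can be pictured as coupling an objective component (an uncertain model of how a policy instrument affects the world) with a subjective component (the policy-maker's loss function), then choosing the policy that minimises expected loss under that uncertainty:

```python
import numpy as np

# Illustrative only: an 'objective' component (uncertain inflation response to an interest-rate move)
# combined with a 'subjective' component (the policy-maker's loss weights). Not the paper's model.
rng = np.random.default_rng(1)
slope = rng.normal(loc=-0.5, scale=0.2, size=5_000)   # uncertain effect of a rate rise on inflation

def expected_loss(rate, inflation_target=2.0, baseline_inflation=4.0, weight_output=0.5):
    inflation = baseline_inflation + slope * rate                             # objective: modelled world
    loss = (inflation - inflation_target) ** 2 + weight_output * rate ** 2    # subjective: preferences
    return loss.mean()

rates = np.linspace(0.0, 5.0, 51)
best = min(rates, key=expected_loss)
print("loss-minimising rate under parameter uncertainty:", best)
```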

5.
I argue that the Oxford school Everett interpretation is internally incoherent, because we cannot claim that in an Everettian universe the kinds of reasoning we have used to arrive at our beliefs about quantum mechanics would lead us to form true beliefs. I show that in an Everettian context, the experimental evidence that we have available could not provide empirical confirmation for quantum mechanics, and moreover that we would not even be able to establish reference to the theoretical entities of quantum mechanics. I then consider a range of existing Everettian approaches to the probability problem and show that they do not succeed in overcoming this incoherence.

6.
String theory has been the dominant research field in theoretical physics over the last few decades. Despite the considerable time that has elapsed, string theorists have derived no new testable predictions, and it is understandable that doubts have been voiced. Some people have argued that it is time to give up since testability is wanting. But the majority has not been convinced and they continue to believe that string theory is the right way to go. This situation is interesting for philosophy of science since it highlights several of our central issues. In this paper we will discuss string theory from a number of different perspectives in general methodology. We will also relate the realism/antirealism debate to the current status of string theory. Our goal is twofold: to examine string theory from philosophical perspectives, and to use string theory as a test case for some philosophical issues.

7.
This paper reviews, extends and applies alternative normative decision models for the assessment of the value of forecast information, concentrating primarily on the factors influencing the tractability of assessment and interpretation in specific decision problems. As an empirical illustration of the models, the paper presents an analysis of the value, to a perfectly competitive farm enterprise, of the published judgmental price forecasts of a veteran analyst of the hog market.
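A minimal sketch of the kind of normative valuation the abstract refers to, under assumed payoffs and priors (the two-state, two-action numbers below are hypothetical, not the hog-market data): the value of a perfect forecast is the gain in expected payoff from choosing the best action after the state is revealed rather than before.

```python
import numpy as np

# Hypothetical two-state, two-action decision problem.
# payoff[action, state]: profit to the enterprise under each market state.
payoff = np.array([[10.0, -4.0],    # action 0 (e.g. sell now)
                   [ 2.0,  6.0]])   # action 1 (e.g. hold)
prior = np.array([0.5, 0.5])        # prior probabilities of the states

# Value without the forecast: commit to the single action maximising prior expected payoff.
v_no_info = (payoff @ prior).max()

# Value with a perfect forecast: learn the state first, then pick the best action in each state.
v_perfect = prior @ payoff.max(axis=0)

print("expected value of perfect forecast information:", v_perfect - v_no_info)
```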

8.
9.
    
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|² from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113], that classical information is quantum information that can proliferate in the environment—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision-theoretic approach to quantum probability.
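A minimal numerical sketch (not drawn from the paper) of the partial-trace construction the abstract invokes: for an assumed two-qubit pure state, the reduced density matrix of system 1 is obtained by tracing out system 2, and its diagonal reproduces the Born-rule probabilities p_k = |ψ_k|².

```python
import numpy as np

# Hypothetical two-qubit pure state |psi> = a|00> + b|11>, amplitudes chosen arbitrarily.
a, b = 0.6, 0.8
psi = np.array([a, 0.0, 0.0, b])          # basis order: |00>, |01>, |10>, |11>
psi = psi / np.linalg.norm(psi)           # normalise

rho = np.outer(psi, psi.conj())           # density matrix of the joint system

# Partial trace over system 2: reshape to (2,2,2,2) and trace out the second subsystem's indices.
rho_1 = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# Born-rule probabilities for measuring system 1 in the computational basis.
born_probs = np.abs(np.array([a, b]) / np.linalg.norm([a, b])) ** 2
print(np.real(np.diag(rho_1)))            # diagonal of the reduced density matrix
print(born_probs)                         # matches |psi_k|^2
```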

10.
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.

11.
Both male and female Orchesella cincta (Collembola) were able to discriminate between spermatophores of different origin. Females chose spermatophores deposited by closely related males, while males preferentially destroyed spermatophores of other males.

12.
While forecasting involves forward/predictive thinking, it depends crucially on prior diagnosis for suggesting a model of the phenomenon, for defining ‘relevant’ variables, and for evaluating forecast accuracy via the model. The nature of diagnostic thinking is examined with respect to these activities. We first consider the difficulties of evaluating forecast accuracy without a causal model of what generates outcomes. We then discuss the development of models by considering how attention is directed to variables via analogy and metaphor as well as by what is unusual or abnormal. The causal relevance of variables is then assessed by reference to probabilistic signs called ‘cues to causality’. These are: temporal order, constant conjunction, contiguity in time and space, number of alternative explanations, similarity, predictive validity, and robustness. The probabilistic nature of the cues is emphasized by discussing the concept of spurious correlation and how causation does not necessarily imply correlation. Implications for improving forecasting are considered with respect to the above issues.
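A small simulated illustration (assumed data, not from the paper) of the spurious correlation the abstract mentions: a common cause Z drives both X and Y, so X and Y correlate strongly even though neither causes the other, which is exactly the situation the ‘cues to causality’ are meant to help diagnose.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative confounding: Z causally drives both X and Y, while X has no effect on Y.
z = rng.normal(size=10_000)
x = z + rng.normal(scale=0.5, size=10_000)
y = z + rng.normal(scale=0.5, size=10_000)

# X and Y are strongly correlated despite no causal link between them: a spurious correlation.
print(round(np.corrcoef(x, y)[0, 1], 2))
```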

13.
The development of evolutionary game theory (EGT) is closely linked with two interdisciplinary exchanges: the import of game theory into biology, and the import of biologists’ version of game theory into economics. This paper traces the history of these two import episodes. In each case the investigation covers what exactly was imported, what the motives for the import were, how the imported elements were put to use, and how they related to existing practices in the respective disciplines. Two conclusions emerged from this study. First, concepts derived from the unity of science discussion or the unification accounts of explanation are too strong and too narrow to be useful for analysing these interdisciplinary exchanges. Secondly, biology and economics—at least in relation to EGT—show significant differences in modelling practices: biologists seek to link EGT models to concrete empirical situations, whereas economists pursue conceptual exploration and possible explanation.

14.
The paper challenges a recent attempt by Jouni-Matti Kuukkanen to show that since Thomas Kuhn’s philosophical standpoint can be incorporated into coherentist epistemology, it does not necessarily lead to: (Thesis 1) an abandonment of rationality and rational inter-paradigm theory comparison, nor to (Thesis 2) an abandonment of convergent realism. Leaving aside the interpretation of Kuhn as a coherentist, we will show that Kuukkanen’s first thesis is not sufficiently explicated, while the second one entirely fails. With regard to Thesis 1, we argue that Kuhn’s view on inter-paradigm theory comparison allows only for (what we shall dub) ‘the weak notion of rationality’, and that Kuukkanen’s argument is thus acceptable only in view of such a notion. With regard to Thesis 2, we show that even if we interpret Kuhn as a coherentist, his philosophical standpoint cannot be seen as compatible with convergent realism, since Kuhn’s argument against it is not ‘ultimately empirical’, as Kuukkanen takes it to be.

15.
In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term ‘psychometrics’. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must serve: 1) as a representation of the measurement interaction; and 2) in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards despite the latter's popularity with health outcomes researchers.
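For readers unfamiliar with the two measurement theories, a minimal sketch using their standard textbook forms (these equations are generic, not taken from the paper): Classical Test Theory decomposes an observed score as X = T + E, while the Rasch model gives the probability of a positive item response as a logistic function of the difference between a person parameter θ and an item difficulty b.

```python
import math

def rasch_probability(theta, b):
    """Rasch model: probability of a positive response to an item with difficulty b
    for a respondent with latent trait level theta (standard one-parameter logistic form)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Classical Test Theory instead decomposes an observed score into true score plus error: X = T + E.
# The Rasch model relates each item response to a person parameter and an item parameter.
for theta in (-1.0, 0.0, 1.0):
    print(theta, round(rasch_probability(theta, b=0.5), 3))
```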

16.
Kuhn argued against both the correspondence theory of truth and convergent realism. Although he likely misunderstood the nature of the correspondence theory, which it seems he wrongly believed to be an epistemic theory, Kuhn had an important epistemic point to make. He maintained that any assessment of correspondence between beliefs and reality is not possible, and therefore, the acceptance of beliefs and the presumption of their truthfulness has to be decided on the basis of other criteria. I will show that via Kuhn’s suggested epistemic values, specifically via problem-solving, his philosophy can be incorporated into a coherentist epistemology. Further, coherentism is, in principle, compatible with convergent realism. However, an argument for increasing likeness to truth requires appropriate historical continuity. Kuhn maintained that the history of science is full of discontinuity, and therefore, the historical condition of convergent realism is not satisfied.

17.
MicroRNAs (miRNAs) are natural, single-stranded, small RNA molecules which subtly control gene expression. Several studies indicate that specific miRNAs can regulate heart function both in development and disease. Despite prevention programs and new therapeutic agents, cardiovascular disease remains the main cause of death in developed countries. The elevated number of heart failure episodes is mostly due to myocardial infarction (MI). An increasing number of studies have been carried out reporting changes in miRNA gene expression and exploring their role in MI and heart failure. In this review, we provide a critical analysis of where the frontier of knowledge now stands in basic and translational research on miRNAs in cardiac ischemia. We first summarize basic information on miRNA biology and regulation, concentrating especially on the feedback loops which control cardiac-enriched miRNAs. A focus on the role of miRNAs in the pathogenesis of myocardial ischemia and in the attenuation of injury is presented. Particular attention is given to cardiomyocyte death (apoptosis and necrosis), fibrosis, neovascularization, and heart failure. Then, we address the potential of miR-diagnosis (miRNAs as disease biomarkers) and miR-drugs (miRNAs as therapeutic targets) for cardiac ischemia and heart failure. Finally, we evaluate the use of miRNAs in the emerging field of regenerative medicine.

18.
The paper illustrates how organic chemists dramatically altered their practices in the middle part of the twentieth century through the adoption of analytical instrumentation — such as ultraviolet and infrared absorption spectroscopy and nuclear magnetic resonance spectroscopy — with which the difficult process of structure determination for small molecules became routine. Changes in practice were manifested in two ways: in the use of these instruments in the development of ‘rule-based’ theories; and in an increased focus on synthesis, at the expense of chemical analysis. These rule-based theories took the form of generalizations relating structure to chemical and physical properties, as measured by instrumentation. This ‘Instrumental Revolution’ in organic chemistry was twofold, encompassing both an embrace of new tools that provided unprecedented access to structures and a new way of thinking about molecules and their reactivity in terms of shape and structure. These practices suggest the possibility of a change in the ontological status of chemical structures, brought about by the regular use of instruments. The career of Robert Burns Woodward (1917–1979) provides the central historical examples for the paper. Woodward was an organic chemist at Harvard from 1937 until the time of his death. In 1965, he won the Nobel Prize in Chemistry.

19.
Information geometry is a foundational and frontier discipline that studies statistical problems on Riemannian manifolds using the methods of modern differential geometry. It has been hailed as a further theoretical transformation following Shannon's founding of modern information theory, and it shows great potential in information science and systems theory. Starting from the intrinsic geometric structure of parametrized families of probability distributions and the geometric properties of information, this paper distills the scientific content of information geometry and points out its theoretical advantages and methodological innovations relative to classical statistics and information theory. It then briefly explains the connection between information geometry and differential geometry, surveys the history of the theory and the state of research over the past twenty years on its applications in neural networks, statistical inference, communication coding, systems theory, physics, and medical imaging, summarizes the basic principles and methods of information geometry reflected in these applications, and adds remarks on its development. In particular, the applications of information geometry in signal processing are reviewed comprehensively, covering recent results in signal detection, parameter estimation, and filtering. Finally, the paper looks ahead to the prospects of information geometry and poses several open problems for its application in signal processing.
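As a concrete illustration of the geometric structures the abstract describes (a standard textbook example, not a result from the paper), the Fisher information metric turns a parametrized family of distributions into a Riemannian manifold; for the univariate Gaussian family N(μ, σ²) in (μ, σ) coordinates the metric is diag(1/σ², 2/σ²).

```python
import numpy as np

def fisher_metric_gaussian(mu, sigma):
    """Fisher information metric on the parameter manifold of N(mu, sigma^2),
    in (mu, sigma) coordinates: g = diag(1/sigma^2, 2/sigma^2) (standard result)."""
    return np.diag([1.0 / sigma**2, 2.0 / sigma**2])

# The metric varies over the manifold, so distances between parameter values are not Euclidean;
# here we simply evaluate the metric tensor at two points.
print(fisher_metric_gaussian(0.0, 1.0))
print(fisher_metric_gaussian(0.0, 2.0))
```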

20.
The theory of agricultural phase-change development (农业易相发展理论) and China's agricultural modernization
In the approaching bio-economy era of the 21st century, the life sciences and biotechnology will have a broad and far-reaching impact on agriculture. On the basis of a critical reflection on modern agriculture, and through comparative study and empirical analysis, this paper proposes the theory of agricultural phase-change development and a conceptual framework for a new agricultural system grounded in the bio-economy era.
