Similar Documents
20 similar documents found (search time: 31 ms)
1.
It is argued that we cannot understand the notion of proper functions of artefacts independently of social notions. Functions of artefacts are related to social facts via the use of artefacts. The arguments in this article can be used to improve existing function theories that look to the causal history of artefacts to determine the function. A view that takes the intentions of designers into account to determine the proper function is both natural and often correct, but it is shown that there are exceptions to this. Taking a social constitutive element into account may amend these backward-looking theories. An improved theory may either have a disjunctive form, with either the history or collective intentions determining the proper function, or, as is suggested in the article, take the form of an encompassing account that views the designers' intentions as social insofar as they are accepted by the users. Designers have authority, which is a social fact. The views argued for here are applied to two existing theories of artefact functions, a causal-historical approach and an action-theoretic approach.

2.
John Norton's The Material Theory of Induction bristles with fresh insights and provocative ideas that provide a much needed stimulus to a stodgy if not moribund field. I use quantum mechanics (QM) as a medium for exploring some of these ideas. First, I note that QM offers more predictability than Newtonian mechanics for the Norton dome and other cases where classical determinism falters. But this ability of QM to partially cure the ills of classical determinism depends on facts about the quantum Hamiltonian operator that vary from case to case, providing an illustration of Norton's theme of the importance of contingent facts for inductive reasoning. Second, I agree with Norton that Bayesianism as developed for classical probability theory does not constitute a universal inference machine, and I use QM to explain the sense in which this is so. But at the same time I defend a brand of quantum Bayesianism as providing an illuminating account of how physicists reason about quantum events. Third, I argue that if the probabilities induced by quantum states are regarded as objective chances then there are strong reasons to think that fair infinite lotteries are impossible in a quantum world.
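For readers unfamiliar with the dome, the classical indeterminism at issue can be stated in one line (this is the standard textbook presentation of Norton's example, not material from the paper itself). On the dome, Newton's second law for the radial distance r of a point mass from the apex reduces to

\[ \ddot{r}(t) = \sqrt{r(t)}, \qquad r(0) = \dot{r}(0) = 0, \]

which admits, besides the trivial solution r(t) = 0, a one-parameter family of spontaneous-motion solutions

\[ r(t) = \begin{cases} 0, & t \le T, \\ \tfrac{1}{144}\,(t - T)^{4}, & t > T, \end{cases} \qquad T \ge 0, \]

so the initial conditions fix no unique trajectory and classical determinism fails.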

3.
I present an account of classical genetics to challenge theory-biased approaches in the philosophy of science. Philosophers typically assume that scientific knowledge is ultimately structured by explanatory reasoning and that research programs in well-established sciences are organized around efforts to fill out a central theory and extend its explanatory range. In the case of classical genetics, philosophers assume that the knowledge was structured by T. H. Morgan’s theory of transmission and that research throughout the later 1920s, 30s, and 40s was organized around efforts to further validate, develop, and extend this theory. I show that classical genetics was structured by an integration of explanatory reasoning (associated with the transmission theory) and investigative strategies (such as the ‘genetic approach’). The investigative strategies, which have been overlooked in historical and philosophical accounts, were as important as the so-called laws of Mendelian genetics. By the later 1920s, geneticists of the Morgan school were no longer organizing research around the goal of explaining inheritance patterns; rather, they were using genetics to investigate a range of biological phenomena that extended well beyond the explanatory domain of transmission theories. Theory-biased approaches in history and philosophy of science fail to reveal the overall structure of scientific knowledge and obscure the way it functions.

4.
In this paper, a response to Ed Levy's discussion of medical quantification, I reflect on the ambitions of my book Trust in Numbers. I explore the idealized method of randomized clinical trials, revealed in his case study, as a social technology, one endowed with a persuasive scientific rationale but shaped also by political and social demands. The scholarly study of quantification requires not a choice between blind admiration and sweeping rejection, but a nuanced understanding. This should take into account not only the cognitive aspects of science, but also its role in relation to institutions and customs, examined with some specificity. While history is narrowed and distorted when it is written to support a position on some present issue, historical and social studies of science should at least provide tools of criticism. For this, the historian of science must look beyond narrow communities of specialists, and seek a wider perspective on science as an administrative tool and a bearer of cultural and political values.

5.
In his account of probable reasoning, Poincaré used the concept, or at least the language, of conventions. In particular, he claimed that the prior probabilities essential for inverse probable reasoning are determined conventionally. This paper investigates, in the light of Poincaré's well known claim about the conventionality of metric geometry, what this could mean, and how it is related to other views about the determination of prior probabilities. Particular attention is paid to the similarities and differences between Poincaré's conventionalism as it applies to probabilities and de Finetti's subjectivism. The aim of the paper is to suggest that in accounts of the development of ideas about probable reasoning, particularly those customarily described as Bayesian, Poincaré's discussion deserves more attention than it has so far received.
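As a gloss on where the convention enters (my illustration, not Poincaré's own notation): inverse probable reasoning computes, for rival hypotheses H_1, ..., H_n and evidence E,

\[ P(H_k \mid E) = \frac{P(E \mid H_k)\, P(H_k)}{\sum_{i=1}^{n} P(E \mid H_i)\, P(H_i)}, \]

and it is the prior probabilities P(H_i), which the likelihoods leave entirely unconstrained, that Poincaré claimed are fixed by convention.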

6.
It is often held by philosophers of science that special, idealized situations are prior to complex cases in several senses: equations for complex cases are derived from those for special cases by “composing” special case equations; behavior in complex cases is explained in terms of behavior in special cases; one learns the true nature of a property in the special case where it is allowed to work in isolation. In this paper, I argue that a strand of non-equilibrium thermodynamics which attempts to go beyond the limitations of classical non-equilibrium thermodynamics adheres to something that is the reverse of this picture. Thus, the legitimacy (or lack thereof) of this picture lies very near to the heart of foundational issues in non-equilibrium thermodynamics.

7.
It is commonly argued that values “fill the logical gap” of underdetermination of theory by evidence; that is, values affect our choice between two or more theories that fit the same evidence. The underdetermination model, however, does not exhaust the roles values play in evidential reasoning. I introduce WAVE, a novel account of the logical relations between values and evidence. WAVE states that values influence evidential reasoning by adjusting evidential weights. I argue that the weight-adjusting role of values is distinct from their underdetermination gap-filling role. Values adjust weights in three ways. First, values affect our trust in the testimony of others. Second, values influence the evidential thresholds required for justified epistemic judgments. Third, values influence the relative weight of a certain type of evidence within a body of multimodal discordant evidence. WAVE explains, from an epistemic rather than a psychological perspective, how smokers, for example, can find the same evidence about the dangers of smoking less persuasive than non-smokers do. WAVE allows for a wider effect of values on our accepted scientific theories and beliefs than the underdetermination model alone allows; therefore, science studies scholars must consider WAVE in their research and analysis of evidential case studies.
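One toy way to picture the weight-adjusting role (my sketch, not the paper's formalism) is to let a values-driven trust parameter w temper the force of the likelihood ratio in Bayesian updating:

\[ \frac{P(H \mid E)}{P(\neg H \mid E)} = \frac{P(H)}{P(\neg H)} \cdot \left( \frac{P(E \mid H)}{P(E \mid \neg H)} \right)^{w}, \qquad 0 \le w \le 1, \]

where w = 1 is full trust in the evidence and smaller w discounts it; two agents with the same priors and the same evidence, but different weights w, then settle on different posteriors without any gap-filling theory choice.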

8.
A pessimistic strain of thought is brewing in the health studies literature regarding the status of medicine. Ioannidis’s (2005) now famous finding that “most published research findings are false” and Stegenga’s (2018) book-length argument for medical nihilism are examples of this. In this paper, we argue that these positions are incorrect insofar as they rest on an untenable account of the nature of facts. Proper attention to fallibilism and the social organization of knowledge, as well as to Bayesian probabilities in medical reasoning, prompts us to ask why the cynics expect the results of quantitative studies to be incontrovertibly true in the first place. While we agree with Ioannidis and others about the flaws they have identified in the medical research enterprise, and encourage their rectification, we conclude that medical nihilism is not the natural outcome of the current state of research.

9.
Modern clinical case reporting takes the form of problem-solution narratives that redescribe symptoms in terms of disease categories. Authored almost always by those who have played a part in the medical assessment of the patient, reports historicise the salient details of an individual's illness as a complex effect of identifiable antecedent causes. Candidate hypotheses linking illness to pathological mechanisms are suggested by the patient’s experience, and by data that emerge from clinical examination and investigation. Observational and interpretive statements from these considerations are fitted into a temporally inflected account of the patient’s medical condition, configured from the vantage point of hindsight. Drawing on established forms of deferred telling, reports invite readers to follow a story that drip-feeds a mixture of contingent and non-incidental information into the account, engendering and frustrating curiosity, creating expectations, and challenging powers of reasoning and pattern recognition. Whereas case reporting once favoured memoir, the sentimental tale and eccentric biography as the means by which its historical narrative was cast, the preferred genres of contemporary case reporting include detective fiction, and puzzle and riddle narratives, formats that conceptualise the medical consultation in narrow problem-solution terms.

10.
11.
Whether asset restructuring can improve firm performance has been debated for decades. Variation in the stock price or in financial ratios, evaluated before and after the restructuring, is typically used as the dependent variable for short- or long-term effectiveness, and the results are mixed, so a predictive approach to this unsettled question is needed. This work pioneers forecasting the effectiveness of asset restructuring with a rebalanced and clustered support vector machine (RCS). The profitability variation one year before and after asset restructuring is used as the dependent variable, and the financial indicators for the year of the restructuring are used as independent variables. Specially treated listed companies are used as research samples, as they frequently adopt asset restructuring. In modeling, the skewed distribution of samples that achieve versus fail to achieve performance improvement through asset restructuring is handled by rebalancing, and the historical restructuring cases most similar to the current one are selected by clustering. With the help of rebalancing and clustering, a support vector machine is constructed for prediction, alongside the forecasting models of multivariate discriminant analysis, logistic regression, probit regression, and case-based reasoning; these models' standalone modes are used as benchmarks. The empirical results demonstrate the applicability of the RCS for forecasting the effectiveness of asset restructuring.
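A minimal sketch of the RCS pipeline as the abstract describes it, in Python (the function name, cluster count, and class-balancing details are my assumptions, not the paper's specification):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.utils import resample

def rcs_predict(X_train, y_train, x_query, n_clusters=5, random_state=0):
    # 1) Clustering: keep only the historical restructurings that fall in
    #    the same KMeans cluster as the query firm ("similar experience").
    km = KMeans(n_clusters=n_clusters, n_init=10,
                random_state=random_state).fit(X_train)
    mask = km.labels_ == km.predict(x_query.reshape(1, -1))[0]
    X_sim, y_sim = X_train[mask], y_train[mask]

    # 2) Rebalancing: oversample the minority class so that "improved" and
    #    "not improved" outcomes are equally represented.
    classes, counts = np.unique(y_sim, return_counts=True)
    if len(classes) == 1:            # degenerate cluster: one outcome only
        return classes[0]
    n_max = counts.max()
    X_parts, y_parts = [], []
    for c in classes:
        X_c = resample(X_sim[y_sim == c], replace=True,
                       n_samples=n_max, random_state=random_state)
        X_parts.append(X_c)
        y_parts.append(np.full(n_max, c))
    X_bal, y_bal = np.vstack(X_parts), np.concatenate(y_parts)

    # 3) An SVM fitted on the filtered, rebalanced sample predicts whether
    #    the current restructuring will improve profitability.
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_bal, y_bal)
    return clf.predict(x_query.reshape(1, -1))[0]

On real data, X_train would hold each firm's financial indicators for the restructuring year (as NumPy arrays) and y_train a binary label for whether profitability improved one year on.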

12.
In cases of animal mimicry, the receiver of the signal learns the truth that he is either dealing with the real thing or with a mimic. Thus, despite being a prototypical example of animal deception, mimicry does not seem to qualify as deception on the traditional definition, since the receiver is not actually misled. We offer a new account of propositional content in sender-receiver games that explains how the receiver is misled (and deceived) by mimicry. We show that previous accounts of deception, and of propositional content, give incorrect results about whether certain signals are deceptive.
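A toy numerical illustration of the puzzle (my construction, not the authors' model): under perfect mimicry the model and the mimic send the same signal, so Bayesian updating on the signal leaves the receiver's posterior at the prior, and on the traditional definition nothing false has been conveyed.

# Toy mimicry game: states 'model' (the real, e.g. noxious, thing) and
# 'mimic' (a harmless copy) both send the identical warning signal.
priors = {"model": 0.7, "mimic": 0.3}
likelihoods = {"model": 1.0, "mimic": 1.0}   # perfect mimicry

evidence = sum(priors[s] * likelihoods[s] for s in priors)
posterior = {s: priors[s] * likelihoods[s] / evidence for s in priors}

print(posterior)   # {'model': 0.7, 'mimic': 0.3}: unchanged from the priors

# The signal supports only the true disjunction "model or mimic"; the task
# the abstract sets is to explain how the mimic nevertheless deceives.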

13.
Two works on hydrostatics, by Simon Stevin in 1586 and by Blaise Pascal in 1654, are analysed and compared. The contrast between the two serves to highlight aspects of the qualitative novelty involved in changes within science in the first half of the seventeenth century. Stevin attempted to derive his theory from unproblematic postulates drawn from common sense but failed to achieve his goal insofar as he needed to incorporate assumptions involved in his engineering practice but not sanctioned by his postulates. Pascal's theory went beyond common sense by introducing a novel concept, pressure. Theoretical reflection on novel experiments was involved in the construction of the new concept and experiment also provided important evidence for the theory that deployed it. The new experimental reasoning was qualitatively different from the Euclidean style of reasoning adopted by Stevin. The fact that a conceptualization of a technical sense of pressure adequate for hydrostatics was far from obvious is evident from the work of those, such as Galileo and Descartes, who did not make significant moves in that direction.

14.
Real-time, reliable social signals are the basis for effective social management, and in particular for feedback-driven, closed-loop innovation in social management. Compared with physical signals, however, the processing and analysis of social signals still await systematic research and development, a need made all the more urgent by the rapid rise and spread of social media and social networks. This article discusses the modeling and management of social signals and social systems, the characterization of social signals, the construction of social sensor networks, computational dialectical reasoning, and the integrated methodology of artificial societies, computational experiments, and parallel execution (ACP). On this basis we hope to advance further research and applications, and ultimately to establish a general framework and methodology for the acquisition, analysis, interpretation, and application of social signals.

15.
Philosophers of science continue to elaborate our understanding of the roles that values play in scientific reasoning, practice, and institutions. This special issue focuses on the environmental sciences, a mosaic of fields ranging from restoration ecology to forestry to climatology, unified by its attention to the relationships between humans and their habitats. It is a field that revolves around ameliorating environmental problems, aiming to support the provision of social goods and provide guidance to policymakers about how to regulate individuals and industries. Values abound in such judgments as setting the boundaries of an ecosystem, integrating the human dimensions of social-ecological systems, and collaborating with stakeholders. Since few in the field are likely to insist that these judgments can be made without reference to social values, environmental science can serve as fertile ground for exploring the ethical, social, and political terrain at the frontier of the science and values discourse.

16.
Philip Kitcher's The Advancement of Science sets out, programmatically, a new naturalistic view of science as a process of building consensus practices. Detailed historical case studies, centrally the Darwinian revolution, are intended to support this view. I argue that Kitcher's expositions in fact support a more conservative view, which I dub ‘Legend Naturalism’. Using four historical examples which increasingly challenge Kitcher's discussions, I show that neither Legend Naturalism, nor the less conservative programmatic view, gives an adequate account of scientific progress. I argue for a naturalism that is more informed by psychology and a normative account that is both more social and less realist than the views articulated in The Advancement of Science.

17.
The application of analytic continuation in quantum field theory (QFT) is juxtaposed to T-duality and mirror symmetry in string theory. Analytic continuation, a mathematical transformation that takes the time variable t to negative imaginary time, t → −it, was initially used as a mathematical technique for solving perturbative Feynman diagrams, and was subsequently the basis for the Euclidean approaches within mainstream QFT (e.g., Wilsonian renormalization group methods, lattice gauge theories) and the Euclidean field theory program for rigorously constructing non-perturbative models of interacting QFTs. A crucial difference between theories related by duality transformations and those related by analytic continuation is that the former are judged to be physically equivalent while the latter are regarded as physically inequivalent. There are other similarities between the two cases that make comparing and contrasting them a useful exercise for clarifying the type of argument that is needed to support the conclusion that dual theories are physically equivalent. In particular, T-duality and analytic continuation in QFT share the criterion for predictive equivalence that two theories agree on the complete set of expectation values and the mass spectra, and the criterion for formal equivalence that there is a “translation manual” between the physically significant algebras of observables and sets of states in the two theories. The analytic continuation case study illustrates how predictive and formal equivalence are compatible with physical inequivalence, but not in the manner of standard underdetermination cases. Arguments for the physical equivalence of dual theories must cite considerations beyond predictive and formal equivalence. The analytic continuation case study is an instance of the strategy of developing a physical theory by extending the formal or mathematical equivalence with another physical theory as far as possible. That this strategy has resulted in developments in pure mathematics as well as theoretical physics is another feature that this case study has in common with dualities in string theory.
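The transformation at issue is the standard Wick rotation; schematically (textbook form, not the paper's notation), substituting t → −iτ turns oscillatory Minkowski weights into damped Euclidean ones,

\[ e^{iS[\phi]} \;\longrightarrow\; e^{-S_E[\phi]}, \]

so that, for a free scalar field, the Minkowski action

\[ S = \int dt\, d^3x\; \tfrac{1}{2}\left[ (\partial_t \phi)^2 - (\nabla \phi)^2 - m^2 \phi^2 \right] \]

continues to the Euclidean action

\[ S_E = \int d\tau\, d^3x\; \tfrac{1}{2}\left[ (\partial_\tau \phi)^2 + (\nabla \phi)^2 + m^2 \phi^2 \right], \]

which underwrites the lattice and renormalization-group methods the abstract mentions.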

18.
The article addresses the topic of the growth of mathematical knowledge, with a special focus on the question: how are mathematical objects introduced to mathematical practice? It takes as its starting point a proposal made in a previous paper, based on a case study of the introduction of Riemann surfaces. The claim is that (i) a new object first refers to previously accepted objects, and (ii) reasoning about the new object is possible via a correspondence to the objects with reference to which it is introduced. In addition, Riemann surfaces are geometrical objects, i.e., they are placed in a geometrical context, which makes new definitions possible. This proposal is tested against a case study of Minkowski's introduction of convex bodies. The conclusion is that the proposal holds for this example as well. In both cases we notice that in a first stage there is a close connection between the new object and the objects with reference to which it is introduced, and that in a later stage the new object is given an independent definition. Even though the two cases display similarity in these respects, we also point to certain differences in how the first stage unfolds. Overall we note the fruitfulness of representing problems in different contexts.

19.
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.

20.
To study climate change, scientists employ computer models, which approximate target systems with various levels of skill. Given the imperfection of climate models, how do scientists use simulations to generate knowledge about the causes of observed climate change? Addressing a similar question in the context of biological modelling, Levins (1966) proposed an account grounded in robustness analysis. Recent philosophical discussions dispute the confirmatory power of robustness, raising the question of how the results of computer modelling studies contribute to the body of evidence supporting hypotheses about climate change. Expanding on Staley’s (2004) distinction between evidential strength and security, and Lloyd’s (2015) argument connecting variety-of-evidence inferences and robustness analysis, I address this question with respect to recent challenges to the epistemology of robustness analysis. Applying this epistemology to case studies of climate change, I argue that, despite imperfections in climate models and epistemic constraints on variety-of-evidence reasoning and robustness analysis, this framework accounts for the strength and security of the evidence supporting climatological inferences, including the finding that global warming is occurring and that its primary causes are anthropogenic.
