Similar Literature
20 similar documents retrieved (search time: 953 ms)
1.
S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.

2.
In 1895 sociologist and philosopher Georg Simmel published a paper: ‘On a connection of selection theory to epistemology’. It was focussed on the question of how behavioural success and the evolution of the cognitive capacities that underlie it are to be related to knowing and truth. Subsequently, Simmel’s ideas were largely lost, but recently (2002) an English translation was published by Coleman in this journal. While Coleman’s contextual remarks are solely concerned with a preceding evolutionary epistemology, it will be argued here that Simmel pursues a more unorthodox, more radically biologically based and pragmatist, approach to epistemology in which the presumption of a wholly interests-independent truth is abandoned, concepts are accepted as species-specific and truth tied intimately to practical success. Moreover, Simmel’s position, shorn of one too-radical commitment, shares its key commitments with the recently developed interactivist–constructivist framework for understanding biological cognition and naturalistic epistemology. There Simmel’s position can be given a natural, integrated, three-fold elaboration in interactivist re-analysis, unified evolutionary epistemology and learnable normativity.

3.
Part of the distinction between artefacts, objects made by humans for particular purposes, and natural objects is that artefacts are subject to normative judgements. A drill, say, can be a good drill or a poor drill, it can function well or correctly or it can malfunction. In this paper I investigate how such judgements fit into the domain of the normative in general and what the grounds for their normativity are. Taking as a starting point a general characterization of normativity proposed by Dancy, I show how statements such as ‘this is a good drill’ or ‘this drill is malfunctioning’ can be seen to express normative facts, or the content of normative statements. What they say is that a user who has a desire to achieve a particular relevant outcome has a reason to use, or not to use, the artefact in question. Next this analysis is extended to show that not just statements that say that an artefact performs its function well or poorly, but all statements that ascribe a function to an artefact can be seen as expressing a normative fact. On this approach the normativity of artefacts is analyzed in terms of reasons on grounds of practical, and to a lesser extent theoretical, rationality. I close by investigating briefly to what extent reasons on moral grounds are, in the analysis adopted here, involved in the normativity of artefacts.

4.
This paper criticizes the traditional philosophical account of the quantization of gauge theories and offers an alternative. On the received view, gauge theories resist quantization because they feature distinct mathematical representatives of the same physical state of affairs. This resistance is overcome by a sequence of ad hoc modifications, justified in part by reference to semiclassical electrodynamics. Among other things, these modifications introduce "ghosts": particles with unphysical properties which do not appear in asymptotic states and which are said to be purely a notational convenience. I argue that this sequence of modifications is unjustified and inadequate, making it a poor basis for the interpretation of ghosts. I then argue that gauge theories can be quantized by the same method as any other theory. On this account, ghosts are not purely notation: they are coordinates on the classical configuration space of the theory—specifically, on its gauge structure. This interpretation does not fall prey to the standard philosophical arguments against the significance of ghosts, due to Weingard. Weingard’s argumentative strategy, properly applied, in fact tells in favor of ghosts’ physical significance.

5.
Psychologists debate whether mental attributes can be quantified or whether they admit only qualitative comparisons of more and less. Their disagreement is not merely terminological, for it bears upon the permissibility of various statistical techniques. This article contributes to the discussion in two stages. First it explains how temperature, which was originally a qualitative concept, came to occupy its position as an unquestionably quantitative concept (§§1–4). Specifically, it lays out the circumstances in which thermometers, which register quantitative (or cardinal) differences, became distinguishable from thermoscopes, which register merely qualitative (or ordinal) differences. I argue that this distinction became possible thanks to the work of Joseph Black, ca. 1760. Second, the article contends that the model implicit in temperature’s quantitative status offers a better way for thinking about the quantitative status of mental attributes than models from measurement theory (§§5–6).

6.
In this paper, I explore Rosen’s (1994) ‘transcendental’ objection to constructive empiricism—the argument that in order to be a constructive empiricist, one must be ontologically committed to just the sort of abstract, mathematical objects constructive empiricism seems committed to denying. In particular, I assess Bueno’s (1999) ‘partial structures’ response to Rosen, and argue that such a strategy cannot succeed, on the grounds that it cannot provide an adequate metalogic for our scientific discourse. I conclude by arguing that this result provides some interesting consequences in general for anti-realist programmes in the philosophy of science.

7.
It is well-known that Newtonian gravity, commonly held to describe a gravitational force, can be recast in a form that incorporates gravity into the geometry of the theory: Newton–Cartan theory. It is less well-known that general relativity, a geometrical theory of gravity, can be reformulated in such a way that it resembles a force theory of gravity; teleparallel gravity does just this. This raises questions. One of these concerns theoretical underdetermination. I argue that these theories do not, in fact, represent cases of worrying underdetermination. On close examination, the alternative formulations are best interpreted as postulating the same spacetime ontology. In accepting this, we see that the ontological commitments of these theories cannot be directly deduced from their mathematical form. The spacetime geometry involved in a gravitational theory is not a straightforward consequence of anything internal to that theory as a theory of gravity. Rather, it essentially relies on the rest of nature (the non-gravitational interactions) conspiring to choose the appropriate set of inertial frames.

8.
It has become increasingly common in historiography of science to understand science and its products as inherently local. However, this orientation is faced with three problems. First, how can one explain the seeming universality of contemporary science? Second, if science is so reflective of its local conditions of production, how can it travel so effortlessly to other localities and even globally? And third, how can scientific knowledge attain validity outside its context of origin? I will argue that the notion of standardization and theories of delocalization manage to explain the ‘globality’ of science, but that localism denies ‘universality’ if it is understood non-spatially. Further, localism limits the validity of scientific knowledge unacceptably inside the laboratory walls or other boundaries of knowledge creation. This is not consistent with scientific practice. I will consider on what grounds extra-local knowledge inferences that transcend the boundaries of locality could be seen as justified.

9.
Psychophysics measures the attributes of perceptual experience. The question of whether some of these attributes should be interpreted as more fundamental, or “real,” than others has been answered differently throughout its history. The operationism of Stevens and Boring answers “no,” reacting to the perceived vacuity of earlier debates about fundamentality. The subsequent rise of multidimensional scaling (MDS) implicitly answers “yes” in its insistence that psychophysical data be represented in spaces of low dimensionality. I argue the return of fundamentality follows from a trend toward increasing epistemic humility. Operationism exhibited a kind of hubris in the constitutive role it assigned to the experimenter's presuppositions that is abandoned by the algorithmic methods of MDS. This broad epistemic trend is illustrated by following the trajectory of research on a particular candidate attribute: tonal volume.

10.
11.
Motivated by the question of what it is that makes quantum mechanics a holistic theory (if indeed it is one), I try to define, for general physical theories, what we mean by 'holism'. For this purpose I propose an epistemological criterion to decide whether or not a physical theory is holistic, namely: a physical theory is holistic if and only if it is impossible in principle to infer the global properties, as assigned in the theory, by local resources available to an agent. I propose that these resources include at least all local operations and classical communication. This approach is contrasted with the well-known approaches to holism in terms of supervenience. The criterion for holism proposed here involves a shift in emphasis from ontology to epistemology. I apply this epistemological criterion to classical physics and Bohmian mechanics as represented on a phase and configuration space respectively, and to quantum mechanics (in the orthodox interpretation) using the formalism of general quantum operations as completely positive trace non-increasing maps. Furthermore, I provide an interesting example from which one can conclude that quantum mechanics is holistic in the above-mentioned sense, although, perhaps surprisingly, no entanglement is needed.

12.
I began this study with Laudan's argument from the pessimistic induction and I promised to show that the caloric theory of heat cannot be used to support the premisses of the meta-induction on past scientific theories. I tried to show that the laws of experimental calorimetry, adiabatic change and Carnot's theory of the motive power of heat were (i) independent of the assumption that heat is a material substance, (ii) approximately true, (iii) deducible and accounted for within thermodynamics. I stressed that results (i) and (ii) were known to most theorists of the caloric theory and that result (iii) was put forward by the founders of the new thermodynamics. In other words, the truth-content of the caloric theory was located, selected carefully, and preserved by the founders of thermodynamics. However, the reader might think that even if I have succeeded in showing that Laudan is wrong about the caloric theory, I have not shown how the strategy followed in this paper can be generalised against the pessimistic meta-induction. I think that the general strategy against Laudan's argument suggested in this paper is this: the empirical success of a mature scientific theory suggests that there are respects and degrees in which this theory is true. The difficulty for, and the real challenge to, philosophers of science is to suggest ways in which this truth-content can be located and shown to be preserved, if at all, in subsequent theories. In particular, the empirical success of a theory does not automatically suggest that all theoretical terms of the theory refer. On the contrary, judgments of referential success depend on which theoretical claims are well supported by the evidence. This is a matter of specific investigation. Generally, one would expect that claims about theoretical entities which are not strongly supported by the evidence, or turn out to be independent of the evidence at hand, are not compelling.
Simply put, if the evidence does not make it likely that our beliefs about putative theoretical entities are approximately correct, a belief in those entities would be ill-founded and unjustified. Theoretical extrapolations in science are indispensable, but they are not arbitrary. If the evidence does not warrant them, I do not see why someone should commit herself to them. In a sense, the problem with empiricist philosophers is not that they demand that theoretical beliefs be warranted by evidence. Rather, it is that they claim that no evidence can warrant theoretical beliefs. A realist philosopher of science would not disagree on the first point, but she has good grounds to deny the second. I argued that claims about theoretical entities which are not strongly supported by the evidence must not be taken as belief-worthy. But can one sustain the more ambitious view that loosely supported parts of a theory tend to be just those that include non-referring terms? There is an obvious risk in such a generalisation. For there are well-known cases in which a theoretical claim was initially weakly supported by the evidence

13.
Standard objections to the notion of a hedged, or ceteris paribus, law of nature usually boil down to the claim that such laws would be either (1) irredeemably vague, (2) untestable, (3) vacuous, (4) false, or a combination thereof. Using epidemiological studies in nutrition science as an example, I show that this is not true of the hedged law-like generalizations derived from data models used to interpret large and varied sets of empirical observations. Although it may be ‘in principle impossible’ to construct models that explicitly identify all potential causal interferers with the relevant generalization, the view that our failure to do so is fatal to the very notion of a cp-law is plausible only if one illicitly infers metaphysical impossibility from epistemic impossibility. I close with the suggestion that a model-theoretic approach to cp-laws poses a problem for recent attempts to formulate a Mill–Ramsey–Lewis theory of cp-laws.

14.
Recently, many historians of science have chosen to present their historical narratives from the ‘actors’-eye view’. Scientific knowledge not available within the actors’ culture is not permitted to do explanatory work. Proponents of the Sociology of Scientific Knowledge (SSK) purport to ground this historiography on epistemological relativism. I argue that they are making an unnecessary mistake: unnecessary because the historiographical genre in question can be defended on aesthetic and didactic grounds; and a mistake because the argument from relativism is in any case incoherent. The argument of the present article is self-contained, but steers clear of metaphysical debates in the philosophy of science. To allay fears of hidden assumptions, the sequel, to be published in the following issue, will consider SSK’s prospects of succour from scientific realism, instrumentalism, and a metaphysical system of Bruno Latour’s own devising.

15.
Philosophers now commonly reject the value-free ideal for science by arguing that non-epistemic values, including personal or social values, are permissible within the core of scientific research. However, little attention has been paid to the normative political consequences of this position. This paper explores these consequences and shows how political theory is fruitful for proceeding in a world without value-neutral science. I draw attention to an oft-overlooked argument employed by proponents of the value-free ideal, which I dub the “political legitimacy argument.” This argument claims that the value-free ideal follows directly from the foundational principles of liberal democracy. If so, then the use of value-laden scientific information within democratic decision making would be illegitimate on purely political grounds. Despite highlighting this unaddressed and important argument, I show how it can be rejected. By appealing to deliberative democratic theory, I demonstrate that scientific information can be value-laden and politically legitimate. The deliberative democratic account I develop is well suited to capturing the intuitions of many opponents of the value-free ideal and points to a new set of questions for those interested in values in science.

16.
Under the Basel II Accord, banks and other authorized deposit-taking institutions (ADIs) have to communicate their daily risk estimates to the monetary authorities at the beginning of the trading day, using a variety of value-at-risk (VaR) models to measure risk. Sometimes the risk estimates communicated using these models are too high, thereby leading to large capital requirements and high capital costs. At other times, the risk estimates are too low, leading to excessive violations, so that realized losses are above the estimated risk. In this paper we analyze the profit-maximizing problem of an ADI subject to capital requirements under the Basel II Accord as ADIs have to choose an optimal VaR reporting strategy that minimizes daily capital charges. Accordingly, we suggest a dynamic communication and forecasting strategy that responds to violations in a discrete and instantaneous manner, while adapting more slowly in periods of no violations. We apply the proposed strategy to Standard & Poor's 500 Index and show there can be substantial savings in daily capital charges, while restricting the number of violations to within the Basel II penalty limits.
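The asymmetric reporting rule described in this abstract, which jumps immediately after a violation but adapts slowly in calm periods, can be sketched in a few lines. This is an illustrative toy, not the authors' calibrated strategy: the function name and the `boost` and `decay` parameters are my own assumptions, chosen only to show the discrete-jump / slow-decay shape of the rule.

```python
def dynamic_var_report(returns, base_var, boost=2.0, decay=0.99):
    """Toy dynamic VaR reporting rule (illustrative parameters).

    VaR is quoted as a negative return threshold. After a violation
    (realized return below the reported VaR) the reported risk jumps
    up immediately; in violation-free periods it decays slowly back
    toward the base model forecast.
    """
    reported = base_var
    history = []
    for r in returns:
        history.append(reported)        # VaR communicated for this day
        if r < reported:                # violation: loss exceeds estimate
            reported = boost * reported     # discrete, instantaneous jump
        else:                           # calm day: slow mean reversion
            reported = decay * reported + (1 - decay) * base_var
    return history
```

For example, with `base_var = -0.02`, a single day with a return of -0.05 triggers a violation and doubles the reported risk to -0.04 for the next day, after which it drifts slowly back toward -0.02.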

17.
We propose an innovative approach to model and predict the outcome of football matches based on the Poisson autoregression with exogenous covariates (PARX) model recently proposed by Agosto, Cavaliere, Kristensen, and Rahbek (Journal of Empirical Finance, 2016, 38(B), 640–663). We show that this methodology is particularly suited to model the goal distribution of a football team and provides a good forecast performance that can be exploited to develop a profitable betting strategy. This paper improves the strand of literature on Poisson-based models, by proposing a specification able to capture the main characteristics of goal distribution. The betting strategy is based on the idea that the odds proposed by the market do not reflect the true probability of the match because they may also incorporate the betting volumes or strategic price settings in order to exploit bettors' biases. The out-of-sample performance of the PARX model is better than the reference approach by Dixon and Coles (Applied Statistics, 1997, 46(2), 265–280). We also evaluate our approach in a simple betting strategy, which is applied to English football Premier League data for the 2013–2014, 2014–2015, and 2015–2016 seasons. The results show that the return from the betting strategy is larger than 30% in most of the cases considered and may even exceed 100% if we consider an alternative strategy based on a predetermined threshold, which makes it possible to exploit the inefficiency of the betting market.
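The core of a Poisson-autoregressive goal model is an intensity recursion, with a threshold rule on top deciding when the model probability diverges enough from the odds-implied probability to bet. The sketch below is a minimal illustration, not the paper's estimated specification: the parameter values, the initialization, and the `value_bet` threshold are my own assumptions, and the exogenous covariates of the full PARX model are omitted.

```python
import math

def parx_intensity(goals, omega=0.4, alpha=0.25, beta=0.55):
    """One-step-ahead goal intensity from a PARX-style recursion
    (illustrative parameters, covariates omitted):
        lambda_t = omega + alpha * goals_{t-1} + beta * lambda_{t-1}
    """
    lam = omega / (1.0 - alpha - beta)  # start at the stationary mean
    for g in goals:
        lam = omega + alpha * g + beta * lam
    return lam

def poisson_pmf(k, lam):
    """Probability that the team scores exactly k goals under Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def value_bet(model_prob, decimal_odds, threshold=0.05):
    """Simple threshold rule (threshold value is an assumption): bet only
    when the model probability exceeds the odds-implied probability
    1/odds by a fixed margin."""
    return model_prob - 1.0 / decimal_odds > threshold
```

With the default parameters the stationary mean intensity is 0.4 / (1 - 0.25 - 0.55) = 2.0 goals; a recent scoreless match pulls the next-match intensity down to 0.4 + 0.55 * 2.0 = 1.5.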

18.
In this paper, I consider Kitcher’s (1993) account of reference for the expressions of past science. Kitcher’s case study is of Joseph Priestley and his expression ‘dephlogisticated air’. There is a strong intuitive case that ‘dephlogisticated air’ referred to oxygen, but the term was underpinned by very mistaken phlogiston theory, so both concluding that dephlogisticated air referred straightforwardly and concluding that it failed to refer have unpalatable consequences. Kitcher argues that the reference of such terms is best considered relative to each token—some tokens refer, and others do not. His account thus relies crucially on how this distinction between tokens can be made good—a puzzle I call the discrimination problem. I argue that the discrimination problem cannot be solved. On any reading of Kitcher’s defence of the distinction, the grounds provided are either insufficient or illegitimate. On the first reading, Kitcher violates the principle of humanity by making Priestley’s referential success a matter of the mental contents of modern speakers. The second reading sidesteps the problem of beliefs by appealing to mind-independent facts, but I argue that these are insufficient to achieve reference because of the indeterminacy introduced by the qua problem. On the third and final reading, Priestley’s success is given by what he would say in counterfactual circumstances. I argue that even if there are facts about what Priestley would say, and there is reason for doubt, there is no motivation to think that such facts determine how Priestley referred in the actual world.

19.
In this second paper, I continue my discussion of the problem of reference for scientific realism. First, I consider a final objection to Kitcher’s account of reference, which I generalise to other accounts of reference. Such accounts make attributions of reference by appeal to our pretheoretical intuitions about how true statements ought to be distributed among the scientific utterances of the past. I argue that in the cases that merit discussion, this strategy fails because our intuitions are unstable. The interesting cases are importantly borderline—it really isn’t clear what we ought to say about how those terms referred. I conclude that in many relevant cases, our grounds for thinking that the theoretical terms of the past referred are matched by our grounds for thinking that they failed to refer, in such a way that deciding on either result is arbitrary and bad news for the realist. In response to this problem, in the second part of the paper I expand upon Field’s (1973) account of partial reference to sketch a new way of thinking about the theoretical terms of the past—that they partially referred and partially failed to refer.

20.
This paper is a critical response to Hylarie Kochiras’ “Gravity and Newton’s substance counting problem,” Studies in History and Philosophy of Science 40 (2009) 267-280. First, the paper argues that Kochiras conflates substances and beings; it proceeds to show that Newton is a substance monist. The paper argues that on methodological grounds Newton has adequate resources to respond to the metaphysical problems diagnosed by Kochiras. Second, the paper argues against the claim that Newton is committed to two speculative doctrines attributed to him by Kochiras and earlier by Andrew Janiak: i) the passivity of matter and ii) the principle of local causation. Third, while arguing that Kochiras’ (and Janiak’s) claims about Newton’s metaphysical commitments are mistaken, the paper nevertheless qualifies the characterization of Newton as an extreme empiricist defended by Howard Stein and Rob DiSalle. In particular, the paper shows that Newton’s empiricism was an intellectual and developmental achievement that built on non-trivial speculative commitments about the nature of matter and space.

