Similar Documents
 20 similar documents found (search time: 31 ms)
1.
This article discusses methods of inductive inference that are methods of visualization, designed in such a way that the “eye” can be employed as a reliable tool for judgment. The term “eye” is used as a stand-in for visual cognition and perceptual processing. In this paper “meaningfulness” has a particular meaning, namely accuracy, which is closeness to truth. Accuracy consists of precision and unbiasedness. Precision is dealt with by statistical methods, but for unbiasedness one needs expert judgment. The common view at the beginning of the twentieth century was to make the most efficient use of this kind of judgment by representing the data in shapes and forms in such a way that the “eye” could function as a reliable judge to reduce bias. The judgment of the “eye” is even more necessary when the background conditions of the observations are heterogeneous. Statistical procedures require a certain minimal level of homogeneity, but the “eye” does not. The “eye” is an adequate tool for assessing topological similarities when, due to heterogeneity of the data, metric assessment is not possible. In fact, graphical assessment precedes measurement, or to put it more forcefully, the graphic method is a necessary prerequisite for measurement.

2.
“Teleosemantic” or “biosemantic” theories form a strong naturalistic programme in the philosophy of mind and language. They seek to explain the nature of mind and language by recourse to a natural history of “proper functions” as selected-for effects of language- and thought-producing mechanisms. However, they remain vague with respect to the nature of the proposed analogy between selected-for effects on the biological level and phenomena that are not strictly biological, such as reproducible linguistic and cultural forms. This essay critically explores various interpretations of this analogy. It suggests that these interpretations can be explicated by contrasting adaptationist with pluralist readings of the evolutionary concept of adaptation. Among the possible interpretations of the relations between biological adaptations and their analogues in language and culture, the two most relevant are a linear, hierarchical, signalling-based model that takes its cues from the evolution of co-operation and joint intentionality and a mutualistic, pluralist model that takes its cues from mimesis and symbolism in the evolution of human communication. Arguing for the merits of the mutualistic model, the present analysis indicates a path towards an evolutionary pluralist version of biosemantics that will align with theories of cognition as being environmentally “scaffolded”. Language and other cultural forms are partly independent reproducible structures that acquire proper functions of their own while being integrated with organism-based cognitive traits in co-evolutionary fashion.

3.
It is widely acknowledged that the patient's perspective should be considered when making decisions about how her care will be managed. Patient participation in the decision making process may play an important role in bringing to light and incorporating her perspective. The GRADE framework is touted as an evidence-based process for determining recommendations for clinical practice; i.e. determining how care ought to be managed. GRADE recommendations are categorized as “strong” or “weak” based on several factors, including the “values and preferences” of a “typical” patient. The strength of the recommendation also provides instruction to the clinician about when and how patients should participate in the clinical encounter, and thus whether an individual patient's values and preferences will be heard in her clinical encounter. That is, a “strong” recommendation encourages “paternalism” and a “weak” recommendation encourages shared decision making. We argue that adoption of the GRADE framework is problematic for patient participation and may result in care that is not respectful of the individual patient's values and preferences. We argue that the root of the problem is the conception of “values and preferences” in GRADE – the framework favours population thinking (e.g. “typical” patient “values and preferences”), despite the fact that “values and preferences” are individual in the sense that they are deeply personal. We also show that tying the strength of a recommendation to a model of decision making (paternalism or shared decision making) constrains patient participation and is not justified (theoretically and/or empirically) in the GRADE literature.

4.
Structural symmetry is observed in the majority of fundamental protein folds, and gene duplication and fusion are postulated to be the evolutionary processes responsible. However, convergent evolution leading to structural symmetry has also been proposed; additionally, there is debate regarding the extent to which exact primary structure symmetry is compatible with efficient protein folding. Issues of symmetry in protein evolution directly impact strategies for de novo protein design, as symmetry can substantially simplify the design process. Additionally, when considering gene duplication and fusion in protein evolution, there are two competing models: “emergent architecture” and “conserved architecture”. Recent experimental work has shed light on both the evolutionary process leading to symmetric protein folds and the ability of symmetric primary structure to fold efficiently. Such studies largely support a “conserved architecture” evolutionary model, suggesting that complex protein architecture was an early evolutionary achievement involving oligomerization of smaller polypeptides.

5.
6.
The way metrologists conceive of measurement has undergone a major shift in the last two decades. This shift can in great part be traced to a change in the statistical methods used to deal with the expression of measurement results, and, more particularly, with the calculation of measurement uncertainties. Indeed, as we show, the incapacity of the frequentist approach to the calculus of uncertainty to deal with systematic errors has prompted the replacement of the customary frequentist methods by fully Bayesian procedures. The epistemological ramifications of the Bayesian approach merge with a deep empiricist mood tantamount to an “epistemic turn”: measurement results are analysed in terms of degrees of belief, and central concepts such as error and accuracy are called into question. We challenge the perspective entailed by this epistemic turn: we insist on the centrality of the concepts of error and accuracy by underlining the intentional character of measurement that is intimately linked to the process of correction of experimental data. We further circumvent the difficulties posed by the classical analysis of measurement by stressing the social rather than the epistemic dimension of measurement activities.

7.
Building on Norton's “material theory of induction,” this paper shows through careful historical analysis that analogy can act as a methodological principle or stratagem, providing experimentalists with a useful framework to assess data and devise novel experiments. Although this particular case study focuses on late eighteenth- and early nineteenth-century experiments on the properties and composition of acids, the results of this investigation may be extended and applied to other research programs. I argue that, at a stage in between what Steinle calls “exploratory experimentation” and robust theory, analogy encouraged researchers to substantiate why the likenesses should outweigh the differences (or vice versa) when evaluating results and designing experiments.

8.
This paper distinguishes between two arguments based on measurement robustness and defends the epistemic value of robustness for the assessment of measurement reliability. I argue that the appeal to measurement robustness in the assessment of measurement is based on a different inferential pattern and is not exposed to the same objections as the no-coincidence argument which is commonly associated with the use of robustness to corroborate individual results. This investigation sheds light on the precise meaning of reliability that emerges from measurement assessment practice. In addition, by arguing that the measurement assessment robustness argument has similar characteristics across the physical, social and behavioural sciences, I defend the idea that there is continuity in the notion of measurement reliability across sciences.

9.
The goal of this paper is to provide an interpretation of Feyerabend's metaphysics of science as found in late works like Conquest of Abundance and Tyranny of Science. Feyerabend's late metaphysics consists of an attempt to criticize and provide a systematic alternative to traditional scientific realism, a package of views he sometimes referred to as “scientific materialism.” Scientific materialism is objectionable not only on metaphysical grounds, or because it provides a poor basis for understanding science, but because it implies problematic claims about the epistemic and cultural authority of science, claims incompatible with situating science properly in democratic societies. I show how Feyerabend's metaphysical view, which I call “the abundant world” or “abundant realism,” constitutes a sophisticated and challenging form of ontological pluralism that makes interesting connections with contemporary philosophy of science and with issues of the political and policy role of science in a democratic society.

10.
This paper analyses the practice of model-building “beyond the Standard Model” in contemporary high-energy physics and argues that its epistemic function can be grasped by regarding models as mediating between the phenomenology of the Standard Model and a number of “theoretical cores” of hybrid character, in which mathematical structures are combined with verbal narratives (“stories”) and analogies referring back to empirical results in other fields (“empirical references”). Borrowing a metaphor from a physics research paper, model-building is likened to the search for a Rosetta stone, whose significance does not lie in its immediate content, but rather in the chance it offers to glimpse at and manipulate the components of hybrid theoretical constructs. I shall argue that the rise of hybrid theoretical constructs was prompted by the increasing use of nonrigorous mathematical heuristics in high-energy physics. Support for my theses will be offered in the form of a historical–philosophical analysis of the emergence and development of the theoretical core centring on the notion that the Higgs boson is a composite particle. I will follow the heterogeneous elements which would eventually come to form this core from their individual emergence in the 1960s and 1970s, through their collective life as a theoretical core from 1979 until the present day.

11.
Ankeny and Leonelli (2016) propose “repertoires” as a new way to understand the stability of certain research programs as well as scientific change in general. By bringing a more complete range of social, material, and epistemic elements into one framework, they position their work as a correction to the Kuhnian impulse in philosophy of science and other areas of science studies. I argue that this “post-Kuhnian” move is not complete, and that repertoires maintain an internalist perspective. Comparison with an alternative framework, the “sociotechnical imaginaries” of Jasanoff and Kim (2015), illustrates precisely which elements of practice are externalized by Ankeny and Leonelli. Specifically, repertoires discount the role of audience, without whom the repertoires of science are unintelligible, and lack an explicit place for ethical and political imagination, which provide meaning for otherwise mechanical promotion of particular research programs. This comparison reveals, I suggest, two distinct modes of scholarship, one internalist and the other critical. While repertoires can be modified to meet the needs of critical STS scholars and to completely reject Kuhn's internalism, whether or not we do so depends on what we want our scholarship to achieve.

12.
In this essay, I examine the curved spacetime formulation of Newtonian gravity known as Newton–Cartan gravity and compare it with flat spacetime formulations. Two versions of Newton–Cartan gravity can be identified in the physics literature—a “weak” version and a “strong” version. The strong version has a constrained Hamiltonian formulation and consequently a well-defined gauge structure, whereas the weak version does not (with some qualifications). Moreover, the strong version is best compared with the structure of what Earman (World enough and spacetime. Cambridge: MIT Press) has dubbed Maxwellian spacetime. This suggests that there are also two versions of Newtonian gravity in flat spacetime—a “weak” version in Maxwellian spacetime, and a “strong” version in Neo-Newtonian spacetime. I conclude by indicating how these alternative formulations of Newtonian gravity impact the notion of empirical indistinguishability and the debate over scientific realism.

13.
What, exactly, is the relation between statements about future contingents and statements concerning the spacelike? This question may be answered by transferring Thomasonian supervaluations for future tense statements to statements about the spacelike past, present and future, endorsing present contingents and past contingents. For this task, a language is described semantically which contains (frame-relative versions of) the usual quantifier-like tense operators, operators for (frame-relative) “somewhere”/“everywhere”, the operators “for every reference frame”/“for some reference frame” and three different “necessity” operators with their “possibility” counterparts. Technically, special attention is paid to interaction laws between the different kinds of operators. The “necessity” operators differ in the area on which alternatives must coincide in order to count as accessible. Supervaluations are discussed for past light-cone coincidence. Metaphysically, this approach points towards a distinction between two kinds of determinateness which were indistinguishable pre-relativistically: deictic determinateness (past light-cone) and narrative determinateness (frame-relative present-plus-past). An indeterministic solution to the problem of the “wings” is proposed which, without accepting a frame-independent spatially extended present, solves the problem of “massive coincidence” by carefully analysing the famous tunnel example as a story of decisions and by distinguishing between “whether” and “that”-clauses.

14.
Coping with recent heritage is troublesome for history of science museums, since modern scientific artefacts often suffer from a lack of esthetic and artistic qualities and expressiveness. The traditional object-oriented approach, in which museums collect and present objects as individual showpieces, is inadequate to bring recent heritage to life. This paper argues that recent artefacts should be regarded as “key pieces.” In this approach the object derives its meaning not from its intrinsic qualities but from its place in an important historical event or development. The “key pieces” approach involves a more organic way of collecting and displaying, focussing less on the individual object and more on the context in which it functioned and its place in the storyline. Finally, I argue that the “key pieces” approach should not be limited to recent heritage. Using this method as a general guiding principle could be a way for history of science museums to appeal to today’s audiences.

15.
In the published version of Hugh Everett III's doctoral dissertation, he inserted what has become a famous footnote, the “note added in proof”. This footnote is often the strongest evidence given for any of various interpretations of Everett (the many worlds, many minds, many histories and many threads interpretations). In this paper I will propose a new interpretation of the footnote: one that is supported by evidence found in letters written to and by Everett, and one that is suggested by a new interpretation of Everett, an interpretation that takes seriously the central position of relative states in Everett's pure wave mechanics: the relative facts interpretation. Of central interest in this paper is how to make sense of Everett's claim in the “note added in proof” that “all elements of a superposition (all ‘branches’) are ‘actual,’ none any more ‘real’ than the rest.”

16.
17.
The importance of the Unruh effect lies in the fact that, together with the related (but distinct) Hawking effect, it serves to link the three main branches of modern physics: thermal/statistical physics, relativity theory/gravitation, and quantum physics. However, different researchers can have in mind different phenomena when they speak of “the Unruh effect” in flat spacetime and its generalization to curved spacetimes. Three different approaches are reviewed here. They are shown to yield results that are sometimes concordant and sometimes discordant. The discordance is disconcerting only if one insists on taking literally the definite article in “the Unruh effect.” It is argued that the role of linking different branches of physics is better served by taking “the Unruh effect” to designate a family of related phenomena. The relation between the Hawking effect and the generalized Unruh effect for curved spacetimes is briefly discussed.

18.
In The Theory of Relativity and A Priori Knowledge (1920b), Reichenbach developed an original account of cognition as coordination of formal structures to empirical ones. One of the most salient features of this account is that it is explicitly not a top-down type of coordination, and in fact it is crucially “directed” by the empirical side. Reichenbach called this feature “the mutuality of coordination” but, in that work, did not elaborate sufficiently on how this is supposed to work. In a paper that he wrote less than two years afterwards (but that he published only in 1932), “The Principle of Causality and the Possibility of its Empirical Confirmation” (1923/1932), he described what seems to be a model for this idea, now within an analysis of causality that results in an account of scientific inference. Recent reassessments of his early proposal do not seem to capture the extent of Reichenbach's original worries. The present paper analyses Reichenbach's early account and suggests a new way to look at his early work. According to it, we perform measurements, individuate parameters, collect and analyse data, by using a “constructive” approach, such as the one with which we formulate and test hypotheses, which paradigmatically requires some simplicity assumptions. Reichenbach's attempt to account for all these aspects in 1923 was obviously limited and naive in many ways, but it shows that, in his view, there were multiple ways in which the idea of “constitution” is embodied in scientific practice.

19.
Different conceptions of scientific theories, such as the state spaces approach of Bas van Fraassen, the phase spaces approach of Frederick Suppe, the set-theoretical approach of Patrick Suppes, and the structuralist view of Joseph Sneed et al. are usually put together into one big family. In addition, the definite article is normally used, and thus we speak of the semantic conception (view or approach) of theories and of its different approaches (variants or versions). However, in The Semantic Conception of Theories and Scientific Realism (Urbana and Chicago: University of Illinois Press, 1989), starting from certain remarks already made in “Theory Structure” (in P. Asquith and H. Kyburg (Eds.), Current Research in Philosophy of Science, East Lansing: Philosophy of Science Association, 1979, pp. 317–338), Frederick Suppe excludes the structuralist view as well as other “European” versions from the semantic conception of theories. In this paper I will critically examine the reasons put forward by Suppe for this decision and, later, I will provide a general characterization of the semantic family and of the structuralist view of theories in such a way as to justify the inclusion of the structuralist view (as well as other “European” versions) as a member of this family.

20.
The problem of measurement is a central issue in the epistemology and methodology of the physical sciences. In recent literature on scientific representation, large emphasis has been put on the “constitutive role” played by measurement procedures as forms of representation. Despite its importance, this issue hardly finds any mention in writings on constitutive principles, viz. in Michael Friedman's account of relativized a priori principles. This issue, instead, was at the heart of Reichenbach's analysis of coordinating principles that has inspired Friedman's interpretation. This paper suggests that these procedures should have a part in an account of constitutive principles of science, and that they could be interpreted following the intuition originally present (but ultimately not fully developed) in Reichenbach's early work.
