Similar Documents
20 similar documents found.
1.
In this paper, I argue for a distinction between two scales of coordination in scientific inquiry, through which I reassess Georg Simon Ohm's work on conductivity and resistance. Firstly, I propose to distinguish between measurement coordination, which refers to the specific problem of how to justify the attribution of values to a quantity by using a certain measurement procedure, and general coordination, which refers to the broader issue of justifying the representation of an empirical regularity by means of abstract mathematical tools. Secondly, I argue that the development of Ohm's measurement practice between the first and the second experimental phase of his work involved a change in the measurement coordination on which he relied to express his empirical results. By showing how Ohm relied on different calibration assumptions and practices across the two phases, I demonstrate that the concurrent change in both Ohm's experimental apparatus and the variable that Ohm measured should be understood in terms of the different forms of measurement coordination involved. Finally, I argue that Ohm's assumption that tension is equally distributed in the circuit is best understood as part of the general coordination between Ohm's law and the empirical regularity that it expresses, rather than as part of the measurement coordination.
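For orientation, the law whose coordination with experimental practice is at issue, stated in modern notation rather than in Ohm's own 1826 variables:

```latex
% Ohm's law in modern form: the current I through a conductor equals
% the applied tension (voltage) V divided by the resistance R.
I = \frac{V}{R}
```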

2.
When considering controversial thermodynamic scenarios such as Maxwell's demon, it is often necessary to consider probabilistic mixtures of macrostates. This raises the question of how, if at all, to assign entropy to them. The information-theoretic entropy is often used in such cases; however, no general proof of the soundness of doing so has been given, and indeed some arguments against doing so have been presented. We offer a general proof of the applicability of the information-theoretic entropy to probabilistic mixtures of macrostates that is based upon a probabilistic generalisation of the Kelvin statement of the second law. We defend the latter and make clear the other assumptions on which our main result depends. We also briefly discuss the interpretation of our result.
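For reference, the quantity whose applicability the paper defends, in its standard form (the notation here is an assumption, not necessarily the paper's): for a mixture that assigns probability p_i to macrostate M_i with thermodynamic entropy S_i,

```latex
% Information-theoretic entropy of a probabilistic mixture of
% macrostates: the expected macrostate entropy plus the Shannon
% entropy of the mixing distribution, in thermodynamic units
% (k is Boltzmann's constant).
S_{\mathrm{mix}} = \sum_i p_i S_i - k \sum_i p_i \ln p_i
```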

3.
Empirical success is a central criterion for scientific decision-making. Yet its understanding in philosophical studies of science deserves renewed attention: Should philosophers think differently about the advancement of science when they deal with the uncertainty of outcome in ongoing research, as compared with historical episodes? This paper argues that normative appeals to empirical success in the evaluation of competing scientific explanations can result in unreliable conclusions, especially when we consider how readily unsettled investigations can change direction. The challenges we encounter arise from the inherent dynamics of disciplinary and experimental objectives in research practice. In this paper we discuss how these dynamics inform the evaluation of empirical success by analyzing three of its requirements: data accommodation, instrumental reliability, and predictive power. We conclude that the assessment of empirical success in developing inquiry is set against the background of a model's interactive success and prospective value in an experimental context. Our argument is exemplified by the analysis of an apparent controversy surrounding the model of a quantum nose in research on olfaction. Notably, the public narrative of this controversy rests on a distorted perspective on measures of empirical success.

4.
Anna Leuschner argues that there is problematic circularity in Helen Longino's approach that postulates the existence of some shared norms as a necessary precondition for well-functioning pluralistic communities. As an alternative, Leuschner proposes to approach the establishing of more pluralistic communities through political means on a case-by-case basis, taking relevant epistemic and political factors into account. In this paper, I argue that there is an alternative understanding of norms that avoids circularity. I do so by drawing on Isabelle Peschard's discussion of shared practice. I go on to show that norms, so understood, are important in the cases where a political decision may not alone be sufficient for establishing a successful community. Specifically, I discuss pluralistic communities that include laypersons in possession of relevant expertise as an example.

5.
The paper presents an identification procedure for a dynamic model of a hydrologic process. The process involves solute transport in streams subject to aquifer interaction and unsteady flows, and the intended use of the model is prediction. Detailed assumptions and results are provided to illustrate the level of comprehensive analysis required to assess model adequacy. The assessment procedure easily generalizes to any dynamic model which is linear in the parameters. As a fundamental tool, instrumental variable algorithms can be adopted which have a number of attractive features. These algorithms make both model-order identification and specification among alternatives a straightforward task. They are known to be consistent estimators in the presence of a wide class of errors. It is seen that they can be made stable and robust in the presence of data outliers. Instrumental variable algorithms can also be used which are asymptotically efficient and provide a covariance matrix of parameter estimates. The paper shows how they aid the quantification of predictive uncertainty and investigates the validity of the underlying assumptions. Further, it illustrates that, when instrumental variable algorithms are used in recursive mode, they can serve not only as an additional tool to assess model inadequacy but also as an aid to model improvement.
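As a minimal sketch of the basic instrumental-variable estimator for a model that is linear in the parameters (the simulated data and variable names below are illustrative assumptions, not the paper's hydrologic model):

```python
import numpy as np

# Model: y = X @ theta + e, where the disturbance e is correlated with
# the regressors X, so ordinary least squares is biased. Z collects
# instruments: correlated with X but uncorrelated with e.
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=(n, 2))                                     # instruments
e = rng.normal(size=n)                                          # disturbance
x = z @ np.array([[1.0, 0.3], [0.2, 1.0]]) + 0.8 * e[:, None]   # regressors
theta_true = np.array([2.0, -1.0])
y = x @ theta_true + e

# Basic IV estimator: theta_hat = (Z'X)^{-1} Z'y  (consistent)
theta_iv = np.linalg.solve(z.T @ x, z.T @ y)
# OLS for comparison: (X'X)^{-1} X'y  (biased by the x-e correlation)
theta_ols = np.linalg.solve(x.T @ x, x.T @ y)

print("true:", theta_true)
print("IV:  ", theta_iv)
print("OLS: ", theta_ols)
```

Run recursively over the data, the same estimator yields evolving parameter estimates of the kind the paper uses to diagnose model inadequacy.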

6.
During the 1860s, the Committee on Electrical Standards convened by the British Association for the Advancement of Science (BAAS) attempted to articulate, refine, and realize a system of absolute electrical measurement. I describe how this context led to the invention of the dimensional formula by James Clerk Maxwell and subsequently shaped its interpretation, in particular through the attempts of William Thomson and Fleeming Jenkin to make absolute electrical measurement intelligible to telegraph engineers. I identify unit conversion as the canonical purpose for dimensional formulae during the remainder of the nineteenth century and go on to explain how an operational interpretation was developed by the French physicist Gabriel Lippmann. The focus on the dimensional formula reveals how various conceptual, theoretical, and material aspects of absolute electrical measurement were taken up or resisted in experimental physics, telegraphic engineering, and electrical practice more broadly, which leads to the conclusion that the integration of electrical theory and telegraphic practice was far harder to achieve and maintain than historians have previously thought. This ultimately left a confusing legacy of dimensional concepts and practices in physics.
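A standard example of the kind of formula in question (modern bracket notation, not Maxwell's original typography): in the electromagnetic system of absolute units, resistance reduces to the dimensions of a velocity, which is why the absolute unit could be realized as a speed:

```latex
% A dimensional formula expresses a derived quantity Q in powers of
% mass M, length L, and time T; in the electromagnetic system of
% absolute units, resistance has the dimensions of a velocity.
[Q] = M^{\alpha} L^{\beta} T^{\gamma},
\qquad
[R]_{\mathrm{em}} = L\,T^{-1}
```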

7.
The authors review and compare the papers in this issue on saturation and logistic growth with special emphasis on the structure of the underlying probability models. Influence diagrams are used to illustrate the dependence and independence assumptions and the way in which uncertainty is reflected in key model assumptions and parameters characterizing the models.
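For reference, the logistic growth model common to the papers under review, in its standard form:

```latex
% Logistic growth of x toward a saturation level K at intrinsic rate r,
% with the closed-form solution for initial value x_0.
\frac{dx}{dt} = r\,x\left(1 - \frac{x}{K}\right),
\qquad
x(t) = \frac{K}{1 + \left(K/x_0 - 1\right)e^{-rt}}
```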

8.
Landauer's Principle asserts that there is an unavoidable cost in thermodynamic entropy creation when data is erased. It is usually derived from incorrect assumptions, most notably, that erasure must compress the phase space of a memory device or that thermodynamic entropy arises from the probabilistic uncertainty of random data. Recent work seeks to prove Landauer's Principle without using these assumptions. I show that the processes assumed in the proof, and in the thermodynamics of computation more generally, can be combined to produce devices that both violate the second law and erase data without entropy cost, indicating an inconsistency in the theoretical system. Worse, the standard repertoire of processes selectively neglects thermal fluctuations. Concrete proposals for how we might measure dissipationlessly and expand single-molecule gases reversibly are shown to be fatally disrupted by fluctuations. Reversible, isothermal processes on molecular scales are shown to be disrupted by fluctuations that can only be overcome by introducing entropy-creating, dissipative processes.
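The quantitative statement usually associated with the principle, given here for orientation (the standard textbook bound, not the paper's own derivation):

```latex
% Landauer bound: erasing one bit of data in an environment at
% temperature T creates at least k ln 2 of thermodynamic entropy,
% equivalently dissipating at least kT ln 2 of heat.
\Delta S_{\mathrm{env}} \ge k \ln 2,
\qquad
Q_{\mathrm{diss}} \ge k T \ln 2
```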

9.
Bogen and Woodward's distinction between data and phenomena raises the need to understand the structure of the data-to-phenomena and theory-to-phenomena inferences. I suggest that one way to study the structure of these inferences is to analyze the role of the assumptions involved: What kind of assumptions are they? How do these assumptions contribute to the practice of identifying phenomena? In this paper, using examples from atmospheric dynamics, I develop an account of the practice of identifying the target in data-to-phenomena and theory-to-phenomena inferences, in which assumptions about spatiotemporal scales play a central role in the identification of the parameters that describe the target system. I also argue that these assumptions are not only empirical but also idealizing and abstracting. I conclude the paper with a reflection on the role of idealizations in modeling.

10.
Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.

11.
The metaphysical commitment to the circle as the essential element in the analysis of celestial motion has long been recognized as the hallmark of classical astronomy. Part I of this paper contains a discussion of how, for Kepler, the circle also functions in geometry to select the basic polygons, in music to select the basic harmonies, and in astrology to select the basic aspects. In Part II, the discussion centres on the question of how the replacement of circular planetary orbits by elliptical orbits in the Astronomia Nova of 1609 affected Kepler's metaphysical commitment to celestial circularity that was made manifest in the derivation of planetary radii in the Mysterium Cosmographicum of 1596. The answer is found in the new and much more accurate derivation of both the planetary radii and their eccentricities in the Harmonice Mundi of 1619. It is the relationship of the diurnal movements of single planets at aphelion and perihelion to specific musical consonances that provides the first step. Then, in the second step, these ratios are ‘tempered’ so that all six planets can provide a heavenly choir. The third and final step employs the ‘mean period’, which is obtained directly from the tempered ratios given by musical theory and diurnal (not annual) motion, in the 3/2 power law to calculate the planetary radii and eccentricities with amazing accuracy. Thus the ellipse is necessary to supply the variation in angular velocities that contain the Creator's archetypal celestial circularity.
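The ‘3/2 power law’ invoked in the third step is Kepler's harmonic law, stated here in modern notation for reference:

```latex
% Kepler's third (harmonic) law: the period T scales as the 3/2 power
% of the mean orbital radius a, so a is recoverable from a mean period
% via the 2/3 power.
T \propto a^{3/2}
\quad\Longleftrightarrow\quad
a \propto T^{2/3}
```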

12.
Typical worlds     
Hugh Everett III presented pure wave mechanics, sometimes referred to as the many-worlds interpretation, as a solution to the quantum measurement problem. While pure wave mechanics is an objectively deterministic physical theory with no probabilities, Everett sought to show how the theory might be understood as making the standard quantum statistical predictions as appearances to observers who were themselves described by the theory. We will consider his argument and how it depends on a particular notion of branch typicality. We will also consider responses to Everett and the relationship between typicality and probability. The suggestion will be that pure wave mechanics requires a number of significant auxiliary assumptions in order to make anything like the standard quantum predictions.

13.
With the goal of scientifically evaluating the effectiveness of crisis information dissemination, this paper analyzes the factors influencing crisis information dissemination and the uncertainty of the dissemination process, drawing on leading domestic and international literature. The analysis holds that the effectiveness of crisis information dissemination is an uncertain concept, a unity of fuzziness and randomness; its evaluation must therefore consider fuzziness and randomness in combination. Applying fuzzy-stochastic theory, the paper constructs a discretized evaluation model for the effectiveness of crisis information dissemination and gives a method for determining the membership functions. This extends the theory and methods of evaluating the effectiveness of crisis information dissemination and offers an approach to its quantitative study.
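A rough sketch of the kind of membership-function-based, discretized evaluation the abstract describes (the triangular membership functions, grade labels, and weights below are hypothetical assumptions, not the paper's model; the stochastic side of the fuzzy-stochastic account, i.e. random variation in the indicator scores, is omitted):

```python
import numpy as np

# Hypothetical effectiveness grades on a 0-1 scale, each described by a
# triangular fuzzy number (a, m, b); shapes and labels are illustrative.
GRADES = {
    "low":    (0.0, 0.0, 0.5),
    "medium": (0.2, 0.5, 0.8),
    "high":   (0.5, 1.0, 1.0),
}

def membership(x, a, m, b):
    """Degree to which a score x belongs to the triangular grade (a, m, b)."""
    if a == m and x <= m:      # left-shoulder grade ("low")
        return 1.0
    if m == b and x >= m:      # right-shoulder grade ("high")
        return 1.0
    if x <= a or x >= b:
        return 0.0
    return (x - a) / (m - a) if x < m else (b - x) / (b - m)

def evaluate(scores, weights):
    """Aggregate weighted indicator scores, then fuzzify the result."""
    s = float(np.average(scores, weights=weights))
    return {g: round(membership(s, *abc), 3) for g, abc in GRADES.items()}

# Example: three dissemination indicators (say coverage, speed, accuracy)
print(evaluate(scores=[0.7, 0.6, 0.8], weights=[0.5, 0.3, 0.2]))
# -> {'low': 0.0, 'medium': 0.367, 'high': 0.38}
```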

14.
Diffusion of new products may be deterred by consumers' uncertainties about how they will perform. This paper introduces a decision-theoretic framework for modeling the diffusion of consumables, in which consumers choose between a current and new product so as to maximize expected utility. Consumers that are sufficiently risk-averse delay adoption, and change their prior uncertainties in a Bayesian fashion using information generated by early adopters. Under certain assumptions about the underlying consumer choice process and the market dynamics, the result is logistic growth in the share of consumers that choose the new product. The model can be generalized by allowing for consumer heterogeneity with respect to performance of the new product. The paper concludes with a discussion of applications for market forecasting, design of market trials and other extensions.
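A toy simulation in the spirit of the mechanism described (a sketch under strong assumptions: normal priors and signals, a certainty-equivalent adoption rule, and word-of-mouth contact proportional to the adoption share; none of these specifics are taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions, not the paper's calibration)
n = 10_000                       # consumers
q_true, q_old = 1.0, 0.6         # true utility of new vs. current product
prior_mean, prior_var = 0.65, 0.5 ** 2
signal_var = 0.3 ** 2
risk = rng.uniform(0.0, 3.0, n)  # heterogeneous risk aversion

mean = np.full(n, prior_mean)
var = np.full(n, prior_var)
adopted = np.zeros(n, dtype=bool)

for t in range(25):
    # Certainty-equivalent choice: adopt once the posterior mean, penalised
    # by risk aversion times the posterior std, beats the current product.
    adopted |= (mean - risk * np.sqrt(var)) > q_old
    share = adopted.mean()
    print(f"t={t:2d}  adoption share = {share:.3f}")

    # Word of mouth: each consumer hears a noisy report of the true quality
    # from early adopters with probability equal to the current share, and
    # updates a normal prior in conjugate (Kalman-gain) fashion.
    hears = rng.random(n) < share
    signal = q_true + rng.normal(0.0, np.sqrt(signal_var), n)
    gain = np.where(hears, var / (var + signal_var), 0.0)
    mean = mean + gain * (signal - mean)
    var = var * (1.0 - gain)
```

Under these assumptions the printed share traces out an S-shaped, logistic-like adoption curve: slow while few adopters generate information, faster as updating spreads, and saturating as residual risk aversion is overcome.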

15.
Two complementary debates from the turn of the nineteenth to the twentieth century are examined here: the debate on the legitimacy of hypotheses in the natural sciences and the debate on intentionality and ‘representations without object’ in philosophy. Both are shown to rest on two core issues: the attitude of the subject and the mode of presentation chosen to display a domain of phenomena. An orientation other than the one which contributed to shape twentieth-century philosophy of science is explored through the analysis of the role given to assumptions in Boltzmann’s research strategy, where assumptions are contrasted with hypotheses, axioms, and principles, and in Meinong’s criticism of the privileged status attributed to representations in mental activities. Boltzmann’s computational style in mathematics and Meinong’s criticism of the confusion between representation and judgment give prominence to an indirect mode of presentation, adopted in a state of suspended belief which is characteristic of assumptions and which enables one to grasp objects that cannot be reached through direct representation or even analogies. The discussion shows how assumptions and the movement to fiction can be essential steps in the quest for objectivity. The conclusion restates the issues of the two debates in a contemporary perspective and shows how recent developments in philosophy of science and philosophy of language and mind can be brought together by arguing for a twofold conception of reference.

16.
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|² from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion that classical information is quantum information that can proliferate in the environment, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401; and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113]—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision theoretic approach to quantum probability.
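For reference, the two formal items on which the circularity charge turns (standard statements, given here for orientation):

```latex
% Born rule: the probability of outcome k for a system in state |psi>,
% expanded in the measured basis {|k>}.
p_k = |\langle k|\psi\rangle|^2 = |\psi_k|^2
% Reduced density matrix of subsystem A, obtained by the partial trace
% over B; using it to predict A's statistics presupposes the Born rule.
\rho_A = \mathrm{Tr}_B\,\rho_{AB}
```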

17.
Difficulties over probability have often been considered fatal to the Everett interpretation of quantum mechanics. Here I argue that the Everettian can have everything she needs from ‘probability’ without recourse to indeterminism, ignorance, primitive identity over time or subjective uncertainty: all she needs is a particular rationality principle. The decision-theoretic approach recently developed by Deutsch and Wallace claims to provide just such a principle. But, according to Wallace, decision theory is itself applicable only if the correct attitude to a future Everettian measurement outcome is subjective uncertainty. I argue that subjective uncertainty is not available to the Everettian, but I offer an alternative: we can justify the Everettian application of decision theory on the basis that an Everettian should care about all her future branches. The probabilities appearing in the decision-theoretic representation theorem can then be interpreted as the degrees to which the rational agent cares about each future branch. This reinterpretation, however, reduces the intuitive plausibility of one of the Deutsch–Wallace axioms (measurement neutrality).

18.
Model organisms are at once scientific models and concrete living things. It is widely assumed by philosophers of science that (1) model organisms function much like other kinds of models, and (2) that insofar as their scientific role is distinctive, it is in virtue of representing a wide range of biological species and providing a basis for generalizations about those targets. This paper uses the case of human embryonic stem cells (hESC) to challenge both assumptions. I first argue that hESC can be considered model organisms, analogous to classic examples such as Escherichia coli and Drosophila melanogaster. I then discuss four contrasts between the epistemic role of hESC in practice, and the assumptions about model organisms noted above. These contrasts motivate an alternative view of model organisms as a network of systems related constructively and developmentally to one another. I conclude by relating this result to other accounts of model organisms in recent philosophy of science.

19.
A question at the intersection of scientific modeling and public choice is how to deal with uncertainty about model predictions. This “high-level” uncertainty is necessarily value-laden, and thus must be treated as irreducibly subjective. Nevertheless, formal methods of uncertainty analysis should still be employed for the purpose of clarifying policy debates. I argue that such debates are best informed by models which integrate objective features (which model the world) with subjective ones (modeling the policy-maker). This integrated subjectivism is illustrated with a case study from the literature on monetary policy. The paper concludes with some morals for the use of models in determining climate policy.

20.
Scientific explanation is a perennial topic in philosophy of science, but the literature has fragmented into specialized discussions in different scientific disciplines. An increasing attention to scientific practice by philosophers is (in part) responsible for this fragmentation and has put pressure on criteria of adequacy for philosophical accounts of explanation, usually demanding some form of pluralism. This commentary examines the arguments offered by Fagan and Woody with respect to explanation and understanding in scientific practice. I begin by scrutinizing Fagan's concept of collaborative explanation, highlighting its distinctive advantages and expressing concern about several of its assumptions. Then I analyze Woody's attempt to reorient discussions of scientific explanation around functional considerations, elaborating on the wider implications of this methodological recommendation. I conclude with reflections on synergies and tensions that emerge when the two papers are juxtaposed and how these draw attention to critical issues that confront ongoing philosophical analyses of scientific explanation.
