Similar Literature
 20 similar documents found (search time: 27 ms)
1.
The term “analogy” stands for a variety of methodological practices all related in one way or another to the idea of proportionality. We claim that in his first substantial contribution to electromagnetism James Clerk Maxwell developed a methodology of analogy which was completely new at the time or, to borrow John North’s expression, Maxwell’s methodology was a “newly contrived analogue”. In his initial response to Michael Faraday’s experimental researches in electromagnetism, Maxwell did not seek an analogy with some physical system in a domain different from electromagnetism as advocated by William Thomson; rather, he constructed an entirely artificial one to suit his needs. Following North, we claim that the modification which Maxwell introduced to the methodology of analogy has not been properly appreciated. In view of our examination of the evidence, we argue that Maxwell gave a new meaning to analogy; in fact, it comes close to modeling in current usage.

2.
The symmetries of a physical theory are often associated with two things: conservation laws (via e.g. Noether’s and Schur’s theorems) and representational redundancies (“gauge symmetry”). But how can a physical theory’s symmetries give rise to interesting (in the sense of non-trivial) conservation laws, if symmetries are transformations that correspond to no genuine physical difference? In this paper, I argue for a disambiguation in the notion of symmetry. The central distinction is between what I call “analytic” and “synthetic” symmetries, so called because of an analogy with analytic and synthetic propositions. “Analytic” symmetries are the turning of idle wheels in a theory’s formalism, and correspond to no physical change; “synthetic” symmetries cover all the rest. I argue that analytic symmetries are distinguished because they act as fixed points or constraints in any interpretation of a theory, and as such are akin to Poincaré’s conventions or Reichenbach’s ‘axioms of co-ordination’, or ‘relativized constitutive a priori principles’.
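The link the abstract draws between symmetries and conservation laws has a standard textbook form (this worked example is illustrative background, not the paper's own derivation): time-translation invariance of a Lagrangian yields energy conservation via Noether's theorem.

```latex
% Standard illustration (not from the paper): for a Lagrangian
% L(q, \dot q) with no explicit time dependence (time-translation
% symmetry), the total time derivative is
\frac{dL}{dt} = \frac{\partial L}{\partial q}\,\dot q
              + \frac{\partial L}{\partial \dot q}\,\ddot q ,
% and substituting the Euler--Lagrange equation
\frac{d}{dt}\frac{\partial L}{\partial \dot q} = \frac{\partial L}{\partial q}
% gives a conserved energy function:
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot q}\,\dot q - L\right) = 0 .
```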

3.
Building on Norton's “material theory of induction,” this paper shows through careful historical analysis that analogy can act as a methodological principle or stratagem, providing experimentalists with a useful framework to assess data and devise novel experiments. Although this particular case study focuses on late eighteenth and early nineteenth-century experiments on the properties and composition of acids, the results of this investigation may be extended and applied to other research programs. Occupying a stage in between what Steinle calls “exploratory experimentation” and robust theory, analogy, I argue, encouraged researchers to substantiate why the likenesses should outweigh the differences (or vice versa) when evaluating results and designing experiments.

4.
Building upon work by Mary Hesse (1974), this paper aims to show that a single method of investigation lies behind Maxwell's use of physical analogies in his major scientific works before the Treatise on Electricity and Magnetism. Key to understanding the operation of this method is to recognize that Maxwell's physical analogies are intended to possess an ‘inductive’ function in addition to an ‘illustrative’ one. That is to say, they not only serve to clarify the equations proposed for an unfamiliar domain with a working interpretation drawn from a more familiar science, but can also be sources of defeasible yet relatively strong arguments from features of the more familiar domain to features of the less. Compared with the reconstructions by Achinstein (1991), Siegel (1991), Harman (1998) and others, which postulate a discontinuity in Maxwell's approach to physical analogy, the account defended in this paper (i) makes sense of the continuity in Maxwell's remarks on scientific methodology, (ii) explains his quest for a “mathematical classification of physical quantities” and (iii) offers a new and more plausible interpretation of the debated episode of the introduction of the displacement current in Maxwell's “On Physical Lines of Force”.

5.
Inferentialists about scientific representation hold that an apparatus's representing a target system consists in the apparatus allowing “surrogative inferences” about the target. I argue that a serious problem for inferentialism arises from the fact that many scientific theories and models contain internal inconsistencies. Inferentialism, left unamended, implies that inconsistent scientific models have unlimited representational power, since an inconsistency permits any conclusion to be inferred. I consider a number of ways that inferentialists can respond to this challenge before suggesting my own solution. I develop an analogy to exploitable glitches in a game. Even though inconsistent representational apparatuses may in some sense allow for contradictions to be generated within them, doing so violates the intended function of the apparatus's parts and hence violates representational “gameplay”.
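The claim that “an inconsistency permits any conclusion to be inferred” is the classical principle of explosion (ex falso quodlibet). A brute-force truth-table check makes it concrete; this sketch is standard propositional logic, not anything from the paper itself.

```python
from itertools import product

def entails(premises, conclusion, n_vars):
    """Classical entailment by truth table: the conclusion holds in every
    valuation that satisfies all premises (vacuously true if none do)."""
    return all(conclusion(*v)
               for v in product([False, True], repeat=n_vars)
               if all(p(*v) for p in premises))

# Ex falso quodlibet: from p and not-p, an arbitrary q follows, because
# the inconsistent premise set is satisfied by no valuation at all.
inconsistent = [lambda p, q: p, lambda p, q: not p]
print(entails(inconsistent, lambda p, q: q, 2))          # → True
# A consistent premise set does not license the same inference.
print(entails([lambda p, q: p], lambda p, q: q, 2))      # → False
```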

6.
This paper analyses the practice of model-building “beyond the Standard Model” in contemporary high-energy physics and argues that its epistemic function can be grasped by regarding models as mediating between the phenomenology of the Standard Model and a number of “theoretical cores” of hybrid character, in which mathematical structures are combined with verbal narratives (“stories”) and analogies referring back to empirical results in other fields (“empirical references”). Borrowing a metaphor from a physics research paper, model-building is likened to the search for a Rosetta stone, whose significance does not lie in its immediate content, but rather in the chance it offers to glimpse at and manipulate the components of hybrid theoretical constructs. I shall argue that the rise of hybrid theoretical constructs was prompted by the increasing use of nonrigorous mathematical heuristics in high-energy physics. Support for my theses will be offered in the form of a historical–philosophical analysis of the emergence and development of the theoretical core centring on the notion that the Higgs boson is a composite particle. I will follow the heterogeneous elements which would eventually come to form this core from their individual emergence in the 1960s and 1970s, through their collective life as a theoretical core from 1979 until the present day.

7.
In 1877 James Clerk Maxwell and his student Donald MacAlister refined Henry Cavendish's 1773 null experiment demonstrating the absence of electricity inside a charged conductor. This null result was a mathematical prediction of the inverse square law of electrostatics, and both Cavendish and Maxwell took the experiment as verifying the law. However, Maxwell had already expressed absolute conviction in the law, based on results of Michael Faraday's. So, what was the value to him of repeating Cavendish's experiment? After assessing whether the law was as secure as he claimed, this paper explores its central importance to the electrical programme that Maxwell was pursuing. It traces the historical and conceptual re-orderings through which Maxwell established the law by constructing a tradition of null tests and asserting the superior accuracy of the method. Maxwell drew on his developing ‘doctrine of method’ to identify Cavendish's experiment as a member of a wider class of null methods. By doing so, he appealed to the null practices of telegraph engineers, diverted attention from the flawed logic of the method, and sought to localise issues around the mapping of numbers onto instrumental indications, on the grounds that ‘no actual measurement … was required’.

8.
During the period 1860–1880, a number of physicists and mathematicians, including Maxwell, Stewart, Cournot and Boussinesq, used theories formulated in terms of physics to argue that the mind, the soul or a vital principle could have an impact on the body. This paper shows that what was primarily at stake for these authors was a concern about the irreducibility of life and the mind to physics, and that their theories can be regarded primarily as reactions to the law of conservation of energy, which Helmholtz and Du Bois-Reymond, among others, used as an argument against the possibility of vital and mental causes in physiology. In light of this development, Maxwell, Stewart, Cournot and Boussinesq showed that it was still possible to argue for the irreducibility of life and the mind to physics, through an appeal to instability or indeterminism in physics: if the body is an unstable or physically indeterministic system, an immaterial principle can act through triggering or directing motions in the body, without violating the laws of physics.

9.
Mathematical invariances, usually referred to as “symmetries”, are today often regarded as providing a privileged heuristic guideline for understanding natural phenomena, especially those of micro-physics. The rise of symmetries in particle physics has often been portrayed by physicists and philosophers as the “application” of mathematical invariances to the ordering of particle phenomena, but no historical studies exist on whether and how mathematical invariances actually played a heuristic role in shaping microphysics. Moreover, speaking of an “application” of invariances conflates the formation of concepts of new intrinsic degrees of freedom of elementary particles with the formulation of models containing invariances with respect to those degrees of freedom. I shall present here a case study from early particle physics (ca. 1930–1954) focussed on the formation of one of the earliest concepts of a new degree of freedom, baryon number, and on the emergence of the invariance today associated with it. The results of the analysis show how concept formation and “application” of mathematical invariances were distinct components of a complex historical constellation in which, besides symmetries, two further elements were essential: the idea of physically conserved quantities and that of selection rules. I shall refer to the collection of different heuristic strategies involving selection rules, invariances and conserved quantities as the “SIC-triangle” and show how different authors made use of them to interpret the wealth of new experimental data. It was only a posteriori that the successes of this hybrid “symmetry heuristics” came to be attributed exclusively to mathematical invariances and group theory, forgetting the role of selection rules and of the notion of physically conserved quantity in the emergence of new degrees of freedom and new invariances. The results of the present investigation clearly indicate that opinions on the role of symmetries in fundamental physics need to be critically reviewed in the spirit of integrated history and philosophy of science.
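The “selection rule” corner of the SIC-triangle has a simple operational form: assign each particle a conserved quantum number and reject any reaction whose total would change. The sketch below uses the standard baryon-number assignments from particle physics (the code itself is illustrative, not from the paper).

```python
# Baryon number as a selection rule: a reaction is forbidden if the summed
# baryon number of the final state differs from that of the initial state.
# (Other conservation laws, e.g. lepton number, are deliberately ignored.)
B = {"p": 1, "n": 1, "pbar": -1,
     "e-": 0, "e+": 0, "nubar": 0, "pi+": 0, "pi0": 0, "gamma": 0}

def conserves_baryon_number(initial, final):
    return sum(B[x] for x in initial) == sum(B[x] for x in final)

# Neutron beta decay: allowed (B = 1 on both sides).
print(conserves_baryon_number(["n"], ["p", "e-", "nubar"]))   # → True
# Proton decay to a positron and a neutral pion: forbidden (B: 1 -> 0).
print(conserves_baryon_number(["p"], ["e+", "pi0"]))          # → False
```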

10.
Understanding complex physical systems through the use of simulations often takes on a narrative character. That is, scientists using simulations seek an understanding of processes occurring in time by generating them from a dynamic model, thereby producing something like a historical narrative. This paper focuses on simulations of the Diels-Alder reaction, which is widely used in organic chemistry. It calls on several well-known works on historical narrative to draw out the ways in which use of these simulations mirrors aspects of narrative understanding: Gallie for “followability” and “contingency”; Mink for “synoptic judgment”; Ricoeur for “temporal dialectic”; and Hawthorn for a related dialectic of the “actual and the possible”. Through these reflections on narrative, the paper aims for a better grasp of the role that temporal development sometimes plays in understanding physical processes and of how considerations of possibility enhance that understanding.

11.
“Teleosemantic” or “biosemantic” theories form a strong naturalistic programme in the philosophy of mind and language. They seek to explain the nature of mind and language by recourse to a natural history of “proper functions” as selected-for effects of language- and thought-producing mechanisms. However, they remain vague with respect to the nature of the proposed analogy between selected-for effects on the biological level and phenomena that are not strictly biological, such as reproducible linguistic and cultural forms. This essay critically explores various interpretations of this analogy. It suggests that these interpretations can be explicated by contrasting adaptationist with pluralist readings of the evolutionary concept of adaptation. Among the possible interpretations of the relations between biological adaptations and their analogues in language and culture, the two most relevant are a linear, hierarchical, signalling-based model that takes its cues from the evolution of co-operation and joint intentionality and a mutualistic, pluralist model that takes its cues from mimesis and symbolism in the evolution of human communication. Arguing for the merits of the mutualistic model, the present analysis indicates a path towards an evolutionary pluralist version of biosemantics that will align with theories of cognition as being environmentally “scaffolded”. Language and other cultural forms are partly independent reproducible structures that acquire proper functions of their own while being integrated with organism-based cognitive traits in co-evolutionary fashion.

12.
We utilize mixed-frequency factor-MIDAS models for the purpose of carrying out backcasting, nowcasting, and forecasting experiments using real-time data. We also introduce a new real-time Korean GDP dataset, which is the focus of our experiments. The methodology that we utilize involves first estimating common latent factors (i.e., diffusion indices) from 190 monthly macroeconomic and financial series using various estimation strategies. These factors are then included, along with standard variables measured at multiple different frequencies, in various factor-MIDAS prediction models. Our key empirical findings are as follows. (i) When using real-time data, factor-MIDAS prediction models outperform various linear benchmark models. Interestingly, the “MSFE-best” MIDAS models contain no autoregressive (AR) lag terms when backcasting and nowcasting. AR terms only begin to play a role in “true” forecasting contexts. (ii) Models that utilize only one or two factors are “MSFE-best” at all forecasting horizons, but not at any backcasting and nowcasting horizons. In these latter contexts, much more heavily parametrized models with many factors are preferred. (iii) Real-time data are crucial for forecasting Korean gross domestic product, and the use of “first available” versus “most recent” data “strongly” affects model selection and performance. (iv) Recursively estimated models are almost always “MSFE-best,” and models estimated using autoregressive interpolation dominate those estimated using other interpolation methods. (v) Factors estimated using recursive principal component estimation methods have more predictive content than those estimated using a variety of other (more sophisticated) approaches. This result is particularly prevalent for our “MSFE-best” factor-MIDAS models, across virtually all forecast horizons, estimation schemes, and data vintages that are analyzed.
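The two-stage procedure the abstract describes can be sketched in a few lines: extract a latent factor from a monthly panel by principal components, then regress a quarterly target on the three monthly factor observations of each quarter. All data here are synthetic, and the unrestricted lag regression is a simplified stand-in for the weighted MIDAS polynomial used in the actual literature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly panel: T months x N series driven by one latent factor.
T, N = 120, 30
f = rng.standard_normal(T).cumsum() * 0.1            # latent monthly factor
X = np.outer(f, rng.standard_normal(N)) + 0.5 * rng.standard_normal((T, N))

# Stage 1: estimate the factor by principal components on the standardized panel.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
fhat = U[:, 0] * s[0] / np.sqrt(T)                   # first-PC factor estimate

# Stage 2: MIDAS-style bridge regression - quarterly target on the three
# monthly factor values within each quarter (unrestricted MIDAS lags).
Q = T // 3
F = fhat[: Q * 3].reshape(Q, 3)                      # months within each quarter
gdp = 0.8 * F.mean(1) + 0.1 * rng.standard_normal(Q) # synthetic quarterly target

A = np.column_stack([np.ones(Q), F])
beta, *_ = np.linalg.lstsq(A, gdp, rcond=None)
resid = gdp - A @ beta
msfe = float(resid @ resid / Q)                      # in-sample MSFE analogue
print(round(msfe, 4))
```

In the paper's real-time experiments this loop would be re-run recursively over data vintages; here a single in-sample fit illustrates the mechanics.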

13.
Projections of future climate change cannot rely on a single model. It has become common to rely on multiple simulations generated by Multi-Model Ensembles (MMEs), especially to quantify the uncertainty about what would constitute an adequate model structure. But, as Parker (2018) points out, one of the remaining philosophically interesting questions is: “How can ensemble studies be designed so that they probe uncertainty in desired ways?” This paper offers two interpretations of what General Circulation Models (GCMs) are and how MMEs made of GCMs should be designed. In the first interpretation, models are combinations of modules and parameterisations; an MME is obtained by “plugging and playing” with interchangeable modules and parameterisations. In the second interpretation, models are aggregations of expert judgements that result from a history of epistemic decisions made by scientists about the choice of representations; an MME is a sampling of expert judgements from modelling teams. We argue that, while the two interpretations involve distinct domains from philosophy of science and social epistemology, they both could be used in a complementary manner in order to explore ways of designing better MMEs.
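The first interpretation, an MME built by “plugging and playing” with interchangeable modules and parameterisations, amounts to enumerating a Cartesian product over component choices. The component names below are invented for illustration and do not come from the paper or any actual GCM.

```python
from itertools import product

# Hypothetical interchangeable components of a GCM (names are illustrative).
components = {
    "atmosphere":       ["spectral", "finite-volume"],
    "ocean":            ["rigid-lid", "free-surface"],
    "convection_param": ["scheme-A", "scheme-B", "scheme-C"],
}

# "Plug and play": each ensemble member is one combination of modules
# and parameterisations drawn from the available alternatives.
ensemble = [dict(zip(components, combo)) for combo in product(*components.values())]
print(len(ensemble))       # 2 * 2 * 3 = 12 candidate model structures
for member in ensemble[:2]:
    print(member)
```

On the second interpretation the sample would instead be drawn from modelling teams' judgement histories, which is not something a product over a fixed component grid can capture.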

14.
The young Hermann Helmholtz, in an 1838 letter home, declared that he always appreciated music much more when he played it for himself. Though Helmholtz was a frequent concert-goer, and is celebrated for his highly influential 1863 work on the physiological basis of music theory, Die Lehre von den Tonempfindungen, it is likely that his enduring engagement with music began with his initial, personal experience of playing music for himself. I develop this idea, shifting the discussion of Helmholtz's work on sound sensation back to its origins, and examine the role of his material interaction with musical instruments and music itself. In his sound sensation studies, Helmholtz understood sound as an external, physical object. But Helmholtz also conceived of sound in musical terms. Further, Helmholtz's particular musical tastes as well as his deeply personal interaction with musical instruments allowed him to reconcile his conception of sound as physical object with his conception of sound as music. Helmholtz's physiological theory of sound sensation was both the product of and constitutive of how he heard and created sound. I argue that Helmholtz himself was the embodied reconciliation of his physiological theory of sound sensation and his belief that musical aesthetics were historically and culturally contingent.

15.
This paper discusses a crisis of accountability that arises when scientific collaborations are massively epistemically distributed. We argue that social models of epistemic collaboration, which are social analogs to what Patrick Suppes called a “model of the experiment,” must play a role in creating accountability in these contexts. We also argue that these social models must accommodate the fact that the various agents in a collaborative project often have ineliminable, messy, and conflicting interests and values; any story about accountability in a massively distributed collaboration must therefore involve models of such interests and values and their methodological and epistemic effects.

16.
In this paper I address Descartes’ use of analogy in physics. First, I introduce Descartes’ hypothetical reasoning, distinguishing between analogy and hypothesis. Second, I examine in detail Descartes’ use of analogy to both discover causes and add plausibility to his hypotheses—even though not always explicitly stated, Descartes’ practice assumes a unified view of the subject matter of physics as the extension of bodies in terms of their size, shape and the motion of their parts. Third, I present Descartes’ unique “philosophy of analogy”, where the absence of analogy serves as a criterion for falsifying proposed explanations in physics. I conclude by defending Descartes’ philosophy of analogy by appeal to the value scientists assign to simplicity in their explanations.

17.
This paper addresses issues surrounding the concept of geometric phase or “anholonomy.” Certain physical phenomena apparently require for their explanation and understanding reference to topological/geometric features of some abstract space of parameters. These issues are related to the question of how gauge structures are to be interpreted and whether or not the debate over their “reality” is really going to be fruitful.
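The geometric phase at issue has a standard expression (Berry's formula, quoted here as background; the paper's own notation may differ): for adiabatic transport of an eigenstate around a closed loop in parameter space,

```latex
% Berry's geometric phase (standard form): adiabatically transporting the
% eigenstate |n(R)> around a closed curve C in the parameter space R yields
\gamma_n(C) = i \oint_C \langle n(\mathbf{R}) \mid \nabla_{\mathbf{R}}\, n(\mathbf{R}) \rangle \cdot d\mathbf{R},
% a phase that depends only on the geometry of C in parameter space,
% not on how quickly the loop is traversed.
```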

18.
Theories are composed of multiple interacting components. I argue that some theories have narratives as essential components, and that narratives function as integrative devices of the mathematical components of theories. Narratives represent complex processes unfolding in time as a sequence of stages, and hold the mathematical elements together as pieces in the investigation of a given process. I present two case studies from population genetics: R. A. Fisher's “mass selection” theory, and Sewall Wright's shifting balance theory. I apply my analysis to an early episode of the “R. A. Fisher – Sewall Wright controversy.”

19.
A conventional wisdom about the progress of physics holds that successive theories wholly encompass the domains of their predecessors through a process that is often called “reduction.” While certain influential accounts of inter-theory reduction in physics take reduction to require a single “global” derivation of one theory’s laws from those of another, I show that global reductions are not available in all cases where the conventional wisdom requires reduction to hold. However, I argue that a weaker “local” form of reduction, which defines reduction between theories in terms of a more fundamental notion of reduction between models of a single fixed system, is available in such cases and moreover suffices to uphold the conventional wisdom. To illustrate the sort of fixed-system, inter-model reduction that grounds inter-theoretic reduction on this picture, I specialize to a particular class of cases in which both models are dynamical systems. I show that reduction in these cases is underwritten by a mathematical relationship that follows a certain liberalized construal of Nagel/Schaffner reduction, and support this claim with several examples. Moreover, I show that this broadly Nagelian analysis of inter-model reduction encompasses several cases that are sometimes cited as instances of the “physicist’s” limit-based notion of reduction.
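The “physicist's” limit-based notion of reduction mentioned at the end has a textbook instance (an illustrative example, not necessarily one the paper itself discusses): relativistic kinetic energy reduces to the Newtonian expression in the limit of small velocities.

```latex
% Limit-based reduction, textbook example: the relativistic kinetic energy
T = (\gamma - 1)\,m c^2, \qquad
\gamma = \left(1 - \frac{v^2}{c^2}\right)^{-1/2},
% expanded in powers of v^2/c^2, gives
T = \frac{1}{2} m v^2 + \frac{3}{8}\,\frac{m v^4}{c^2}
    + O\!\left(\frac{v^6}{c^4}\right),
% recovering the Newtonian term as v^2/c^2 -> 0.
```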

20.
In The Theory of Relativity and A Priori Knowledge (1920b), Reichenbach developed an original account of cognition as coordination of formal structures to empirical ones. One of the most salient features of this account is that it is explicitly not a top-down type of coordination, and in fact it is crucially “directed” by the empirical side. Reichenbach called this feature “the mutuality of coordination” but, in that work, did not elaborate sufficiently on how this is supposed to work. In a paper that he wrote less than two years afterwards (but that he published only in 1932), “The Principle of Causality and the Possibility of its Empirical Confirmation” (1923/1932), he described what seems to be a model for this idea, now within an analysis of causality that results in an account of scientific inference. Recent reassessments of his early proposal do not seem to capture the extent of Reichenbach's original worries. The present paper analyses Reichenbach's early account and suggests a new way to look at his early work. According to it, we perform measurements, individuate parameters, collect and analyse data, by using a “constructive” approach, such as the one with which we formulate and test hypotheses, which paradigmatically requires some simplicity assumptions. Reichenbach's attempt to account for all these aspects in 1923 was obviously limited and naive in many ways, but it shows that, in his view, there were multiple ways in which the idea of “constitution” is embodied in scientific practice.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号