Similar Documents
20 similar documents found (search time: 31 ms)
1.
In this paper, the potentialities of transvariation (Gini, 1959) in measuring the separation between two groups of multivariate observations are explored. With this aim, a modified version of Gini’s notion of multidimensional transvariation is proposed. According to Gini (1959), two groups G1 and G2 are said to transvary on the k-dimensional variable X = (X1,...,Xh,...,Xk) if there exists at least one pair of units, belonging to different groups, such that for h = 1,...,k the sign of the difference between their Xh values is opposite to that of m1h − m2h, where m1h and m2h are the corresponding group mean values of Xh. We introduce a modification that allows us to derive a measure of group separation, which can be profitably used in discriminating between two groups. The performance of the measure is tested through simulation experiments. The results show that the proposed measure is not sensitive to distributional assumptions and highlight its robustness against outliers.
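As a rough numerical illustration of Gini’s definition quoted above, the sketch below counts the fraction of cross-group pairs that transvary on every dimension. It is not the modified separation measure proposed in the paper; the function name, data and parameter choices are hypothetical.

```python
import numpy as np

def transvariation_rate(G1, G2):
    """Fraction of cross-group pairs that transvary on every dimension.

    A pair (x in G1, y in G2) transvaries if, for each coordinate h,
    sign(x[h] - y[h]) is opposite to sign(m1[h] - m2[h]), where m1 and m2
    are the group mean vectors (Gini's multidimensional definition).
    """
    G1, G2 = np.asarray(G1, float), np.asarray(G2, float)
    mean_sign = np.sign(G1.mean(axis=0) - G2.mean(axis=0))    # shape (k,)
    diffs = G1[:, None, :] - G2[None, :, :]                   # all cross-group differences
    transvaries = np.all(np.sign(diffs) == -mean_sign, axis=2)
    return transvaries.mean()

# Toy example with two well-separated Gaussian groups (illustrative only).
rng = np.random.default_rng(0)
G1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))
G2 = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(40, 2))
print(transvariation_rate(G1, G2))  # close to 0 for well-separated groups
```

For well-separated groups this fraction stays near 0 and grows as the groups overlap, which is the intuition behind using transvariation to quantify group separation.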

2.
3.
Investigations with electrometers in the 1770s led Volta to envision mobile charge in electrical conductors as a compressible fluid. A pressure-like condition in this fluid, which Volta described as the fluid’s “effort to push itself out” of its conducting container, was the causal agent that makes the fluid move. In this paper I discuss Volta’s use of analogy and imagery in model building, and compare it with a successful contemporary conceptual approach to introducing ideas about electric potential in instruction. The concept that today is called “electric potential” was defined mathematically by Poisson in 1811. It was understood after about 1850 to predict the same results in conducting matter as Volta’s pressure-like concept—and to predict electrostatic effects in the exterior space where Volta’s concept had nothing to say. Complete quantification in addition to greater generality made the mathematical concept a superior research tool for scientists. However, its spreading use in instruction has marginalized approaches to model building based on the analogy and imagery resources that students bring into the classroom. Data from pre- and post-testing in high schools show greater conceptual and confidence gains using the new conceptual approach than using conventional instruction. This provides evidence for reviving Volta’s compressible fluid model as an intuitive foundation which can then be modified to include electrostatic distant action. Volta tried to modify his compressible fluid model to include distant action, using imagery borrowed from distant heating by a flame. This project remained incomplete, because he did not envision an external field mediating the heating. However, pursuing Volta’s strategy of model modification to completion now enables students taught with the new conceptual approach to add distant action to an initial compressible fluid model. I suggest that a partial correspondence to the evolving model sequence that works for beginning students can help illuminate Volta’s use of intermediate explanatory models.
Melvin S. Steinberg

4.
Suppose y, a d-dimensional (d ≥ 1) vector, is drawn from a mixture of k (k ≥ 2) populations, given by Π1, Π2,…,Πk. We wish to identify the population that is the most likely source of the point y. To solve this classification problem many classification rules have been proposed in the literature. In this study, a new nonparametric classifier based on the transvariation probabilities of data depth is proposed. We compare the performance of the newly proposed nonparametric classifier with classical and maximum depth classifiers using some benchmark and simulated data sets. The authors thank the editor and referees for comments that led to an improvement of this paper. This work is partially supported by the National Science Foundation under Grant No. DMS-0604726.
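For orientation, here is a minimal sketch of the “maximum depth” baseline rule the abstract compares against, with Mahalanobis depth standing in for the depth function; the paper’s transvariation-probability classifier is not reproduced, and all names and data below are illustrative assumptions.

```python
import numpy as np

def mahalanobis_depth(y, sample):
    """Mahalanobis depth of point y with respect to a sample (one common depth function)."""
    mu = sample.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
    d2 = (y - mu) @ cov_inv @ (y - mu)
    return 1.0 / (1.0 + d2)

def max_depth_classify(y, populations):
    """Assign y to the population in which it attains the greatest depth."""
    depths = [mahalanobis_depth(y, np.asarray(p)) for p in populations]
    return int(np.argmax(depths))

# Toy example (illustrative only): two Gaussian populations in the plane.
rng = np.random.default_rng(1)
pop1 = rng.normal([0, 0], 1.0, size=(100, 2))
pop2 = rng.normal([4, 4], 1.0, size=(100, 2))
print(max_depth_classify(np.array([3.5, 4.2]), [pop1, pop2]))  # expected: 1
```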

5.
6.
This short note develops some ideas along the lines of the stimulating paper by Heylighen (Found Sci 15 4(3):345–356, 2010a). It summarizes a theme in several writings with Francis Bailly, downloadable from this author’s web page. The “geometrization” of time and causality is the common ground of the analysis hinted at here and in Heylighen’s paper. Heylighen adds a logical notion, consistency, in order to understand a possible origin of the selective process that may have originated this organization of natural phenomena. We will join our perspectives by hinting at some gnoseological complexes, common to mathematics and physics, which may shed light on the issues raised by Heylighen.

7.
Sciences are often regarded as providing the best, or, ideally, exact, knowledge of the world, especially in providing laws of nature. Ilya Prigogine, who was awarded the Nobel Prize for his theory of non-equilibrium chemical processes—this being also an important attempt to bridge the gap between exact and non-exact sciences [mentioned in the Presentation Speech by Professor Stig Claesson (nobelprize.org, The Nobel Prize in Chemistry 1977)]—has had this ideal in mind when trying to formulate a new kind of science. Philosophers of science distinguish theory and reality, examining relations between these two. Nancy Cartwright’s distinction of fundamental and phenomenological laws, Rein Vihalemm’s conception of the peculiarity of the exact sciences, and Ronald Giere’s account of models in science and science as a set of models are deployed in this article to criticise the common view of science and analyse Ilya Prigogine’s view in particular. We will conclude that on a more abstract, philosophical level, Prigogine’s understanding of science doesn’t differ from the common understanding.
Piret Kuusk

8.
In order to become aware of inconsistencies, one must first construe the world in a way that reflects its consistencies. This paper begins with a tentative model for how a set of discrete memories transforms into an interconnected worldview, wherein relationships between memories are forged by way of abstractions. Inconsistencies prompt the invention of new abstractions. In regions of the conceptual network where inconsistencies abound, a cognitive analog of simulated annealing is in order; there is a willingness to question previous assumptions - to ‘loosen’ conceptual relationships - so as to let new concepts percolate through the worldview and exert the needed revolutionary effect. In so doing there is a risk of assimilating dangerous concepts. Repression arrests the process by which dangerous thoughts infiltrate the conceptual network, and deception blocks thoughts that have already been assimilated. These forms of self-initiated worldview inconsistency may evoke feelings of fragmentation at the level of the individual or the society.
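The “cognitive analog of simulated annealing” refers to the classical optimization heuristic, sketched here purely for reference; the toy code below implements the standard algorithm on a numeric cost function and makes no attempt to model belief revision.

```python
import math, random

def simulated_annealing(cost, x0, steps=10_000, t0=1.0, cooling=0.999):
    """Classic simulated annealing: occasionally accept worse states,
    with decreasing willingness as the 'temperature' falls."""
    x, best = x0, x0
    t = t0
    for _ in range(steps):
        candidate = x + random.gauss(0, 0.5)          # perturb the current state
        delta = cost(candidate) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate                             # accept a (possibly worse) state
        if cost(x) < cost(best):
            best = x
        t *= cooling                                  # cool: fewer risky acceptances over time
    return best

# Toy example: a bumpy cost surface with a global minimum near x = 2.
print(simulated_annealing(lambda x: (x - 2) ** 2 + math.sin(5 * x), x0=10.0))
```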

9.
Putnam in Realism in mathematics and Elsewhere, Cambridge University Press, Cambridge (1975) infers from the success of a scientific theory to its approximate truth and the reference of its key term. Laudan in Philos Sci 49:19–49 (1981) objects that some past theories were successful, and yet their key terms did not refer, so they were not even approximately true. Kitcher in The advancement of science, Oxford University Press, New York (1993) replies that the past theories are approximately true because their working posits are true, although their idle posits are false. In contrast, I argue that successful theories which cohere with each other are approximately true, and that their key terms refer. My position is immune to Laudan’s counterexamples to Putnam’s inference and yields a solution to a problem with Kitcher’s position.

10.
The foundation of statistical mechanics and the explanation of the success of its methods rest on the fact that the theoretical values of physical quantities (phase averages) may be compared with the results of experimental measurements (infinite time averages). In the 1930s, this problem, called the ergodic problem, was dealt with by ergodic theory, which tried to resolve it by making reference above all to considerations of a dynamic nature. In the present paper, this solution will be analyzed first, highlighting the fact that its very general nature does not duly consider the specificities of the systems of statistical mechanics. Second, Khinchin’s approach will be presented, which, starting with more specific assumptions about the nature of the systems, achieves an asymptotic version of the result obtained with ergodic theory. Third, the statistical meaning of Khinchin’s approach will be analyzed and a comparison between it and the point of view of ergodic theory is proposed. It will be demonstrated that the difference consists principally in two different perspectives on the ergodic problem: ergodic theory puts the state of equilibrium at the center, while Khinchin’s approach attempts to generalize the result to non-equilibrium states.
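For reference, the identification at the heart of the ergodic problem is standardly written as the equality of an infinite time average with a phase average (generic notation, not taken from the paper):

```latex
% Ergodic problem: does the infinite time average of an observable f
% along a single trajectory equal its phase (ensemble) average?
\bar{f}(x_0) \;=\; \lim_{T\to\infty}\frac{1}{T}\int_{0}^{T} f\bigl(\phi_t(x_0)\bigr)\,dt
\;\stackrel{?}{=}\;
\langle f\rangle \;=\; \int_{\Gamma} f\,d\mu
```

Here \phi_t denotes the Hamiltonian flow, \Gamma the accessible region of phase space (for instance an energy surface), and \mu the microcanonical measure.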

11.
This commentary addresses the question of the meaning of critique in relation to objectivism or dogmatism. Inspired by Kant’s critical philosophy and Husserl’s phenomenology, it defines the first in terms of conditionality, the second in terms of oppositionality. It works out an application on the basis of Salthe’s (Found Sci 15 4(6):357–367, 2010a) paper on development and evolution, where competition is criticized in oppositional, more than in conditional terms.

12.
All the attempts to find the justification of the privileged evolution of phenomena exclusively in the external world need to refer to the inescapable fact that we are living in such an asymmetric universe. This leads us to look for the origin of the “arrow of time” in the relationship between the subject and the world. The anthropic argument shows that the arrow of time is the condition of the possibility of emergence and maintenance of life in the universe. Moreover, according to Bohr’s, Poincaré’s and Watanabe’s analysis, this agreement between the earlier-later direction of entropy increase and the past-future direction of life is the very condition of the possibility for meaningful action, representation and creation. Beyond this relationship of logical necessity between the meaning process and the arrow of time, the question of their possible physical connection is explored. To answer this question affirmatively, the meaning process is modelled as an evolving tree-like structure, called “Semantic Time”, where thermodynamic irreversibility can be shown. Time is the substance I am made of. Time is a river which sweeps me along, but I am the river; it is a tiger which destroys me, but I am the tiger; it is a fire which consumes me, but I am the fire. – (Jorge Luis Borges)

13.
We discuss the foundations of constructive mathematics, including recursive mathematics and intuitionism, in relation to classical mathematics. There are connections with the foundations of physics, due to the way in which the different branches of mathematics reflect reality. Many different axioms and their interrelationship are discussed. We show that there is a fundamental problem in BISH (Bishop’s school of constructive mathematics) with regard to its current definition of ‘continuous function’. This problem is closely related to the definition in BISH of ‘locally compact’. Possible approaches to this problem are discussed. Topology seems to be a key to understanding many issues. We offer several new simplifying axioms, which can form bridges between the various branches of constructive mathematics and classical mathematics (‘reuniting the antipodes’). We give a simplification of basic intuitionistic theory, especially with regard to so-called ‘bar induction’. We then plead for a limited number of axiomatic systems, which differentiate between the various branches of mathematics. Finally, in the appendix we offer BISH an elegant topological definition of ‘locally compact’, which unlike the current definition is equivalent to the usual classical and/or intuitionistic definition in classical and intuitionistic mathematics, respectively.
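As background for the problem mentioned above, the BISH definition of continuity for real functions is commonly stated as uniform continuity on every compact interval; a generic rendering (not the paper’s own formulation) is:

```latex
% BISH-style continuity (as commonly stated): uniform continuity on compact intervals.
f\ \text{is continuous} \;\iff\;
\forall\, [a,b]\subset\mathbb{R}\;\; \forall\varepsilon>0\;\; \exists\delta>0\;\;
\forall x,y\in[a,b]\; \bigl(\,|x-y|\le\delta \;\Rightarrow\; |f(x)-f(y)|\le\varepsilon \,\bigr)
```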

14.
Understanding Pluralism in Climate Modeling (total citations: 1; self-citations: 0; cited by others: 1)
To study Earth’s climate, scientists now use a variety of computer simulation models. These models disagree in some of their assumptions about the climate system, yet they are used together as complementary resources for investigating future climatic change. This paper examines and defends this use of incompatible models. I argue that climate model pluralism results both from uncertainty concerning how to best represent the climate system and from difficulties faced in evaluating the relative merits of complex models. I describe how incompatible climate models are used together in ‘multi-model ensembles’ and explain why this practice is reasonable, given scientists’ inability to identify a ‘best’ model for predicting future climate. Finally, I characterize climate model pluralism as involving both an ontic competitive pluralism and a pragmatic integrative pluralism.
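As a toy illustration of the “multi-model ensemble” practice described above, the sketch below pools projections from several hypothetical models and reports the ensemble mean and spread; the model names and numbers are invented and carry no scientific content.

```python
import statistics

# Hypothetical end-of-century warming projections (degrees C) from incompatible models.
projections = {
    "model_A": 2.1,
    "model_B": 3.4,
    "model_C": 2.8,
    "model_D": 4.0,
}

values = list(projections.values())
ensemble_mean = statistics.mean(values)
ensemble_spread = statistics.stdev(values)   # disagreement across the models

print(f"ensemble mean: {ensemble_mean:.2f} C, spread: {ensemble_spread:.2f} C")
```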

15.
Taxonomy Based Modeling was applied to describe drivers’ mental models of variable message signs (VMS’s) displayed on expressways. Progress in road telematics has made it possible to introduce variable message signs (VMS’s). Sensors embedded in the carriageway every 500 m record certain variables (speed, flow rate, etc.) that are transformed in real time into “driving times” to a given destination if road conditions do not change. VMS systems are auto-regulative Man-Machine (AMMI) systems which incorporate a model of the user: if the traffic flow is too high, then drivers should choose alternative routes. In so doing, the traffic flow should decrease. The model of the user is based on suppositions such as: people do not like to waste time, they fully understand the displayed messages, they trust the displayed values, they know of alternative routes. However, people also have a model of the way the system functions. And if they do not believe the contents of the message, they will not act as expected. We collected data through interviews with drivers using the critical incidents technique (Flanagan, 1985). Results show that the mental models that drivers have of the way the VMS system works are various but not numerous and that most of them differ from the “ideal expert” mental model. It is clear that users don’t have an adequate model of how the VMS system works and that VMS planners have a model of user behaviour that does not correspond to the behaviour of the drivers we interviewed. Finally, Taxonomy Based Modeling is discussed as a tool for mental model remediation.
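A minimal sketch of how per-segment sensor readings could be turned into a displayed “driving time”, under the abstract’s stated assumption that conditions do not change; the segment speeds below are hypothetical, and the actual VMS computation is not described in the abstract.

```python
# Hypothetical speeds (km/h) reported by sensors on successive 500 m segments.
segment_length_km = 0.5
segment_speeds_kmh = [90, 85, 60, 40, 35, 70, 95]

# Time for each segment, assuming road conditions do not change.
driving_time_h = sum(segment_length_km / v for v in segment_speeds_kmh)
driving_time_min = driving_time_h * 60

print(f"Displayed driving time: {driving_time_min:.1f} minutes")
```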

16.
We put forward the hypothesis that there exist three basic attitudes towards inconsistencies within world views: (1) The inconsistency is tolerated temporarily and is viewed as an expression of a temporary lack of knowledge due to an incomplete or wrong theory. The resolution of the inconsistency is believed to be inherent to the improvement of the theory. This improvement ultimately resolves the contradiction and therefore we call this attitude the ‘regularising’ attitude; (2) The inconsistency is tolerated and both contradicting elements in the theory are retained. This attitude integrates the inconsistency and leads to a paraconsistent calculus; therefore we will call it the paraconsistent attitude. (3) In the third attitude, both elements of inconsistency are considered to be false and the ‘real situation’ is considered something different that cannot be described by the theory constructively. This indicates the incompleteness of the theory, and leads us to a paracomplete calculus; therefore we call it the paracomplete attitude. We illustrate these three attitudes by means of two ‘paradoxical’ situations in quantum mechanics, the wave-particle duality and the situation of non-locality.

17.
Dimensionality reduction techniques are used for representing higher dimensional data by a more parsimonious and meaningful lower dimensional structure. In this paper we will study two such approaches, namely Carroll’s Parametric Mapping (abbreviated PARAMAP) (Shepard and Carroll, 1966) and Tenenbaum’s Isometric Mapping (abbreviated Isomap) (Tenenbaum, de Silva, and Langford, 2000). The former relies on iterative minimization of a cost function while the latter applies classical MDS after a preprocessing step involving the use of a shortest path algorithm to define approximate geodesic distances. We will develop a measure of congruence based on preservation of local structure between the input data and the mapped low dimensional embedding, and compare the different approaches on various sets of data, including points located on the surface of a sphere, some data called the “Swiss Roll data”, and truncated spheres.
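As a small, concrete companion to the Isomap side of the comparison, the sketch below embeds Swiss-roll data with off-the-shelf scikit-learn tools and scores it with a crude k-nearest-neighbour overlap; this stand-in score is not the congruence measure developed in the paper, and the parameter values are arbitrary.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap
from sklearn.neighbors import NearestNeighbors

# Swiss roll: points on a 2-D sheet rolled up in 3-D space.
X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

# Isomap: shortest-path (approximate geodesic) distances, then classical MDS.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

def neighborhood_overlap(A, B, k=10):
    """Average overlap of k-nearest-neighbour sets before and after mapping."""
    nn_a = NearestNeighbors(n_neighbors=k + 1).fit(A).kneighbors(A, return_distance=False)[:, 1:]
    nn_b = NearestNeighbors(n_neighbors=k + 1).fit(B).kneighbors(B, return_distance=False)[:, 1:]
    return np.mean([len(set(a) & set(b)) / k for a, b in zip(nn_a, nn_b)])

print(neighborhood_overlap(X, embedding))  # closer to 1 = better local-structure preservation
```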

18.
In this paper we describe in some detail a formal computer model of inferential discourse based on a belief system. The key issue is that a logical model in a computer, based on rational sets, can usefully model a human situation based on irrational sets. The background of this work is explained elsewhere, as is the issue of rational and irrational sets (Billinge and Addis, in: Magnani and Dossena (eds.), Computing, philosophy and cognition, 2004; Stepney et al., Journey: Non-classical philosophy—socially sensitive computing in journeys non-classical computation: A grand challenge for computing research, 2004). The model is based on the Belief System (Addis and Gooding, Proceedings of the AISB’99 Symposium on Scientific Creativity, 1999) and it provides a mechanism for choosing queries based on a range of belief. We explain how it provides a way to update the belief based on query results, thus modelling others’ experience by inference. We also demonstrate that for the same internal experience, different models can be built for different actors.
Tom Addis

19.
Vidal’s (Found Sci, 2010) and Rottiers’s (Found Sci, 2010) commentaries on my (2010) paper raised a number of important issues about the possible future trajectory of evolution and its implications for humanity. My response emphasizes that despite the inherent uncertainty involved in extrapolating the trajectory of evolution into the far future, the possibilities it reveals nonetheless have significant strategic implications for what we do with our lives here and now, individually and collectively. One important implication is the replacement of postmodern scepticism and relativism with an evolutionary grand narrative that can guide humanity to participate successfully in the future evolution of life in the universe.

20.
Development (and Evolution) of the Universe (total citations: 2; self-citations: 2; cited by others: 0)
I distinguish Nature from the World. I also distinguish development from evolution. Development is progressive change and can be modeled as part of Nature, using a specification hierarchy. I have proposed a ‘canonical developmental trajectory’ of dissipative structures with the stages defined thermodynamically and informationally. I consider some thermodynamic aspects of the Big Bang, leading to a proposal for reviving final cause. This model imposes a ‘hylozoic’ kind of interpretation upon Nature, as all emergent features at higher levels would have been vaguely and episodically present primitively in the lower integrative levels, and were stabilized materially with the developmental emergence of new levels. The specification hierarchy’s form is that of a tree, with its trunk in its lowest level, and so this hierarchy is appropriate for modeling an expanding system like the Universe. It is consistent with this model of differentiation during Big Bang development to view emerging branch tips as having been entrained by multiple finalities because of the top-down integration of the various levels of organization by the higher levels.
