Found 20 similar documents (search time: 31 ms)
1.
Massimiliano Badino 《Foundations of Science》2006,11(4):323-347
The foundation of statistical mechanics and the explanation of the success of its methods rest on the fact that the theoretical
values of physical quantities (phase averages) may be compared with the results of experimental measurements (infinite time
averages). In the 1930s, this problem, called the ergodic problem, was dealt with by ergodic theory that tried to resolve
the problem by making reference above all to considerations of a dynamic nature. In the present paper, this solution will
be analyzed first, highlighting the fact that its very general nature does not duly consider the specificities of the systems
of statistical mechanics. Second, Khinchin’s approach will be presented, that starting with more specific assumptions about
the nature of systems, achieves an asymptotic version of the result obtained with ergodic theory. Third, the statistical meaning
of Khinchin’s approach will be analyzed and a comparison between this and the point of view of ergodic theory is proposed.
It will be demonstrated that the difference consists principally of two different perspectives on the ergodic problem: that
of ergodic theory puts the state of equilibrium at the center, while Khinchin’s attempts to generalize the result to non-equilibrium
states.
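The comparison at the heart of the ergodic problem — theoretical phase averages against infinite time averages — can be illustrated numerically. The sketch below is not from the paper; it assumes a standard toy system, the irrational rotation of the circle, which is ergodic, so the time average of an observable along an orbit approaches its phase average.

```python
import numpy as np

# Toy ergodic system: the rotation x -> x + alpha (mod 1) on the unit
# interval. For irrational alpha the rotation is ergodic, so the infinite
# time average of an observable along (almost) every orbit equals its
# phase average with respect to the invariant (uniform) measure.
alpha = np.sqrt(2) - 1                     # an irrational rotation number

def f(x):
    """Observable whose phase average (integral over [0, 1)) is exactly 1/2."""
    return np.cos(2 * np.pi * x) ** 2

# Finite-time average along one orbit started at x0 = 0.1.
N = 100_000
orbit = (0.1 + alpha * np.arange(N)) % 1.0
time_avg = f(orbit).mean()

# Phase average: the integral of cos^2(2*pi*x) over [0, 1) is 1/2.
phase_avg = 0.5

print(time_avg, phase_avg)   # the finite-time average is close to 1/2
```

For this observable the error of the finite-time average decays quickly because the rotation number is badly approximable; the point is only that a single orbit recovers the phase average, which is what the ergodic theorems discussed in the entry license.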
2.
Brigitte Cambon de Lavalette Charles Tijus Christine Leproux Olivier Bauer 《Foundations of Science》2005,10(1):25-45
Taxonomy-Based Modeling was applied to describe drivers’ mental models of variable message signs (VMSs) displayed on expressways.
Progress in road telematics has made it possible to introduce such signs: sensors embedded in the carriageway
every 500 m record traffic variables (speed, flow rate, etc.) that are transformed in real time into “driving times” to a given
destination, valid as long as road conditions do not change.
VMS systems are auto-regulative Man-Machine (AMMI) systems which incorporate a model of the user: if the traffic flow is too
high, then drivers should choose alternative routes. In so doing, the traffic flow should decrease. The model of the user
is based on suppositions such as: people do not like to waste time, they fully understand the displayed messages, they trust
the displayed values, they know of alternative routes. However, people also have a model of the way the system functions.
And if they do not believe the contents of the message, they will not act as expected.
We collected data through interviews with drivers using the critical incidents technique (Flanagan, 1985). Results show that
the mental models drivers have of the way the VMS system works are varied but not numerous, and that most of them differ
from the “ideal expert” mental model. It is clear that users do not have an adequate model of how the VMS system works and that
VMS planners have a model of user behaviour that does not correspond to the behaviour of the drivers we interviewed. Finally,
Taxonomy-Based Modeling is discussed as a tool for mental model remediation.
3.
Giuseppe Longo 《Foundations of Science》2011,16(4):331-333
This short note develops some ideas along the lines of the stimulating paper by Heylighen (Found Sci 15(4):345–356, 2010a). It summarizes a theme in several writings with Francis Bailly, downloadable from this author’s web page. The “geometrization”
of time and causality is the common ground of the analysis sketched here and in Heylighen’s paper. Heylighen adds a logical
notion, consistency, in order to understand a possible origin of the selective process that may have originated this organization
of natural phenomena. We will join our perspectives by pointing to some gnoseological complexes, common to mathematics and
physics, which may shed light on the issues raised by Heylighen.
4.
5.
In this paper we discuss two approaches to the axiomatization of scientific theories in the context of the so-called semantic
approach, according to which (roughly) a theory can be seen as a class of models. The two approaches are associated respectively
to Suppes’ and to da Costa and Chuaqui’s works. We argue that theories can be developed both in a way more akin to the usual
mathematical practice (Suppes), in an informal set theoretical environment, writing the set theoretical predicate in the language
of set theory itself or, more rigorously (da Costa and Chuaqui), by employing formal languages that help us in writing the
postulates to define a class of structures. Both approaches are called internal, for we work within a mathematical framework, here taken to be first-order ZFC. We contrast these approaches with an external one, here discussed briefly. We argue that each one has its strong and weak points, whose discussion is relevant for the
philosophical foundations of science.
6.
The issue of determining “the right number of clusters” in K-Means has attracted considerable interest, especially in
recent years. Cluster intermix appears to be the factor that most affects clustering results. This paper proposes an experimental
setting for comparing different approaches on data generated from Gaussian clusters, with controlled parameters of
between- and within-cluster spread to model cluster intermix. The setting allows for evaluating centroid recovery on a par
with the conventional evaluation of cluster recovery. The subjects of our interest are two versions of the “intelligent” K-Means method, iK-Means, which find the “right” number of clusters by extracting “anomalous patterns” from the data one by one. We compare them
with seven other methods, including Hartigan’s rule, averaged Silhouette width and the Gap statistic, under different between-
and within-cluster spread-shape conditions. There are several consistent patterns in the results of our experiments, such
as that the right K is reproduced best by Hartigan’s rule, though not the clusters or their centroids. This leads us to propose an adjusted version
of iK-Means, which performs well in the current experimental setting.
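For concreteness, Hartigan’s rule as commonly stated compares successive within-cluster sums of squares: compute H(K) = (n − K − 1)(W_K / W_{K+1} − 1) and keep adding clusters while H(K) exceeds a threshold of 10. The sketch below illustrates that rule on a tiny 1-D dataset; it is not the authors’ experimental code, and the minimal k-means implementation and the toy data are my own stand-ins.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm: returns (centroids, W), where W is the
    total within-cluster sum of squares."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # recompute centroids; keep the old one if a cluster empties out
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    # final assignment and within-cluster sum of squares
    labels = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1).argmin(axis=1)
    W = sum(((X[labels == j] - centroids[j]) ** 2).sum() for j in range(k))
    return centroids, W

def hartigan_k(X, k_max=3, threshold=10.0, restarts=5):
    """Smallest K with H(K) = (n - K - 1) * (W_K / W_{K+1} - 1) <= threshold."""
    n = len(X)
    # best (lowest-W) run over a few random restarts, for K = 1 .. k_max + 1
    W = [min(kmeans(X, k, seed=s)[1] for s in range(restarts))
         for k in range(1, k_max + 2)]
    for k in range(1, k_max + 1):
        if (n - k - 1) * (W[k - 1] / W[k] - 1.0) <= threshold:
            return k
    return k_max

# Two well-separated 1-D clusters; the rule should settle on K = 2.
X = np.array([[-1.0], [0.0], [1.0], [9.0], [10.0], [11.0]])
print(hartigan_k(X))
```

On this data W_1 is large, W_2 drops sharply (H(1) ≫ 10), and splitting further barely helps (H(2) ≤ 10), so the rule stops at two clusters.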
7.
Frank Waaldijk 《Foundations of Science》2005,10(3):249-324
We discuss the foundations of constructive mathematics, including recursive mathematics and intuitionism, in relation to classical
mathematics. There are connections with the foundations of physics, due to the way in which the different branches of mathematics
reflect reality. Many different axioms and their interrelationships are discussed. We show that there is a fundamental problem
in BISH (Bishop’s school of constructive mathematics) with regard to its current definition of ‘continuous function’. This problem
is closely related to the definition in BISH of ‘locally compact’. Possible approaches to this problem are discussed. Topology seems to be a key to understanding many
issues. We offer several new simplifying axioms, which can form bridges between the various branches of constructive mathematics
and classical mathematics (‘reuniting the antipodes’). We give a simplification of basic intuitionistic theory, especially
with regard to so-called ‘bar induction’. We then plead for a limited number of axiomatic systems, which differentiate between
the various branches of mathematics. Finally, in the appendix we offer BISH an elegant topological definition of ‘locally compact’, which unlike the current definition is equivalent to the usual classical
and/or intuitionistic definition in classical and intuitionistic mathematics, respectively.
8.
Sciences are often regarded as providing the best, or, ideally, exact, knowledge of the world, especially in providing laws
of nature. Ilya Prigogine, who was awarded the Nobel Prize for his theory of non-equilibrium chemical processes—this being
also an important attempt to bridge the gap between exact and non-exact sciences [mentioned in the Presentation Speech by
Professor Stig Claesson (nobelprize.org, The Nobel Prize in Chemistry 1977)]—has had this ideal in mind when trying to formulate
a new kind of science. Philosophers of science distinguish theory and reality, examining relations between these two. Nancy
Cartwright’s distinction of fundamental and phenomenological laws, Rein Vihalemm’s conception of the peculiarity of the exact
sciences, and Ronald Giere’s account of models in science and science as a set of models are deployed in this article to criticise
the common view of science and to analyse Ilya Prigogine’s view in particular. We will conclude that, on a more abstract, philosophical
level, Prigogine’s understanding of science does not differ from the common understanding.
Piret Kuusk
9.
Gertrudis Van de Vijver 《Foundations of Science》2012,17(1):5-7
This commentary addresses the question of the meaning of critique in relation to objectivism or dogmatism. Inspired by Kant’s
critical philosophy and Husserl’s phenomenology, it defines the first in terms of conditionality, the second in terms of oppositionality.
It works out an application on the basis of Salthe’s (Found Sci 15(4):357–367, 2010a) paper on development and evolution, where competition is criticized in oppositional, more than in conditional, terms.
10.
Melvin S. Steinberg 《Foundations of Science》2008,13(2):163-175
Investigations with electrometers in the 1770s led Volta to envision mobile charge in electrical conductors as a compressible
fluid. A pressure-like condition in this fluid, which Volta described as the fluid’s “effort to push itself out” of its conducting
container, was the causal agent that makes the fluid move. In this paper I discuss Volta’s use of analogy and imagery in model
building, and compare with a successful contemporary conceptual approach to introducing ideas about electric potential in
instruction. The concept that today is called “electric potential” was defined mathematically by Poisson in 1811. It was understood
after about 1850 to predict the same results in conducting matter as Volta’s pressure-like concept—and to predict electrostatic
effects in the exterior space where Volta’s concept had nothing to say. Complete quantification in addition to greater generality
made the mathematical concept a superior research tool for scientists. However, its spreading use in instruction has marginalized
approaches to model building based on the analogy and imagery resources that students bring into the classroom. Data from
pre- and post-testing in high schools show greater conceptual and confidence gains using the new conceptual approach than using
conventional instruction. This provides evidence for reviving Volta’s compressible fluid model as an intuitive foundation
which can then be modified to include electrostatic distant action. Volta tried to modify his compressible fluid model to
include distant action, using imagery borrowed from distant heating by a flame. This project remained incomplete, because
he did not envision an external field mediating the heating. However, pursuing Volta’s strategy of model modification to completion
now enables students taught with the new conceptual approach to add distant action to an initial compressible fluid model.
I suggest that a partial correspondence to the evolving model sequence that works for beginning students can help illuminate
Volta’s use of intermediate explanatory models.
11.
John J. Sung 《Foundations of Science》2008,13(2):177-193
Scientific anomalies are observations and facts that contradict current scientific theories, and they are instrumental in scientific
theory change. Philosophers of science have approached scientific theory change from different perspectives, as Darden (Theory
change in science: Strategies from Mendelian genetics, 1991) observes: Lakatos (In: Lakatos, Musgrave (eds) Criticism and
the growth of knowledge, 1970) approaches it as progressive “research programmes” consisting of incremental improvements
(“monster barring” in Lakatos, Proofs and refutations: The logic of mathematical discovery, 1976), Kuhn (The structure of
scientific revolutions, 1996) observes that changes in “paradigms” are instigated by a crisis arising from some anomaly, and Hanson
(In: Feigl, Maxwell (eds) Current issues in the philosophy of science, 1961) proposes that discovery does not begin with hypothesis
but with some “problematic phenomena requiring explanation”. Even though anomalies are important in all of these approaches
to scientific theory change, there have been only a few investigations into the specific role anomalies play in scientific theory
change. Furthermore, most of these approaches focus on the theories themselves and not on how scientists and their experiments
bring about scientific change (Gooding, Experiment and the making of meaning: Human agency in scientific observation and experiment,
1990). To address these issues, this paper approaches scientific anomaly resolution from a meaning construction point of view.
Conceptual integration theory (Fauconnier and Turner, Cogn Sci 22:133–187, 1996; The way we think: Conceptual blending and
mind’s hidden complexities, 2002) from cognitive linguistics describes how one constructs meaning from various stimuli, such
as text and diagrams, through conceptual integration or blending. The conceptual integration networks that describe the conceptual
integration process characterize cognition that occurs unconsciously during meaning construction. These same networks are
used to describe some of the cognition while resolving an anomaly in molecular genetics called RNA interference (RNAi) in
a case study. The RNAi case study is a cognitive-historical reconstruction (Nersessian, In: Giere (ed) Cognitive models of
science, 1992) that reconstructs how the RNAi anomaly was resolved. This reconstruction traces four relevant molecular genetics
publications in describing the cognition necessary in accounting for how RNAi was resolved through strategies (Darden 1991),
abductive reasoning (Peirce, In: Hartshorne, Weiss (eds) Collected papers, 1958), and experimental reasoning (Gooding 1990).
The results of the case study show that experiments play a crucial role in formulating an explanation of the RNAi anomaly
and the integration networks describe the experiments’ role. Furthermore, these results suggest that RNAi anomaly resolution
is embodied, in the sense that the cognition described in the cognitive-historical reconstruction is experientially based.
12.
Pierre Uzan 《Foundations of Science》2007,12(2):109-137
All the attempts to find the justification of the privileged evolution of phenomena exclusively in the external world need
to refer to the inescapable fact that we are living in such an asymmetric universe. This leads us to look for the origin of the “arrow of time” in the relationship
between the subject and the world. The anthropic argument shows that the arrow of time is the condition of the possibility
of emergence and maintenance of life in the universe. Moreover, according to Bohr’s, Poincaré’s and Watanabe’s analyses, this
agreement between the earlier–later direction of entropy increase and the past–future direction of life is the very condition
of the possibility of meaningful action, representation and creation. Beyond this relationship of logical necessity between
the meaning process and the arrow of time, the question of their possible physical connection is explored. To answer this
question affirmatively, the meaning process is modelled as an evolving tree-like structure, called “Semantic Time”, in which thermodynamic
irreversibility can be shown.
Time is the substance I am made of. Time is a river which sweeps me along, but I am the river; it is a tiger which destroys
me, but I am the tiger; it is a fire which consumes me, but I am the fire. (Jorge Luis Borges)
13.
Steffen Ducheyne 《Foundations of Science》2006,11(4):419-447
In this paper an analysis of Newton’s argument for universal gravitation is provided. In the past, the complexity of the argument
has not been fully appreciated. Recent authors like George E. Smith and William L. Harper have done a far better job. Nevertheless,
a thorough account of the argument is still lacking. Both authors seem to stress the importance of only one methodological
component. Smith stresses the procedure of approximative deductions backed up by the laws of motion. Harper stresses “systematic
dependencies” between theoretical parameters and phenomena. I will argue that Newton used a variety of different inferential
strategies: causal parsimony considerations, deductions, demonstrative inductions, abductions and thought-experiments. Each
of these strategies is part of Newton’s famous argument.
14.
Seungbae Park 《Foundations of Science》2011,16(1):21-30
Putnam in Realism in mathematics and Elsewhere, Cambridge University Press, Cambridge (1975) infers from the success of a
scientific theory to its approximate truth and the reference of its key term. Laudan in Philos Sci 49:19–49 (1981) objects
that some past theories were successful, and yet their key terms did not refer, so they were not even approximately true.
Kitcher in The advancement of science, Oxford University Press, New York (1993) replies that the past theories are approximately
true because their working posits are true, although their idle posits are false. In contrast, I argue that successful theories
which cohere with each other are approximately true, and that their key terms refer. My position is immune to Laudan’s counterexamples
to Putnam’s inference and yields a solution to a problem with Kitcher’s position.
15.
16.
The Meaning of Life in a Developing Universe (cited 4 times: 4 self-citations, 0 citations by others)
John E. Stewart 《Foundations of Science》2010,15(4):395-409
The evolution of life on Earth has produced an organism that is beginning to model and understand its own evolution and the
possible future evolution of life in the universe. These models and associated evidence show that evolution on Earth has a
trajectory. The scale over which living processes are organized cooperatively has increased progressively, as has its evolvability.
Recent theoretical advances raise the possibility that this trajectory is itself part of a wider developmental process. According
to these theories, the developmental process has been shaped by a yet larger evolutionary dynamic that involves the reproduction
of universes. This evolutionary dynamic has tuned the key parameters of the universe to increase the likelihood that life
will emerge and produce outcomes that are successful in the larger process (e.g. a key outcome may be to produce life and
intelligence that intentionally reproduces the universe and tunes the parameters of ‘offspring’ universes). Theory suggests
that when life emerges on a planet, it moves along this trajectory of its own accord. However, at a particular point evolution
will continue to advance only if organisms emerge that decide to advance the developmental process intentionally. The organisms
must be prepared to make this commitment even though the ultimate nature and destination of the process is uncertain, and
may forever remain unknown. Organisms that complete this transition to intentional evolution will drive the further development
of life and intelligence in the universe. Humanity’s increasing understanding of the evolution of life in the universe is
rapidly bringing it to the threshold of this major evolutionary transition.
17.
Gerard Jagers op Akkerhuis 《Foundations of Science》2011,16(4):327-329
The comments focus on a presumed circularity in the operator hierarchy and on the necessity of understanding life’s origin
for defining life. Below it is shown that the layered structure of the operator hierarchy protects it from circular definitions.
It is argued that the origin of life is an insufficient basis for a definition of life that includes multicellular and neural
network organisms.
18.
John E. Stewart 《Foundations of Science》2012,17(1):47-50
Vidal’s (Found Sci, 2010) and Rottiers’s (Found Sci, 2010) commentaries on my (2010) paper raised a number of important issues about the possible future trajectory of evolution and
its implications for humanity. My response emphasizes that despite the inherent uncertainty involved in extrapolating the
trajectory of evolution into the far future, the possibilities it reveals nonetheless have significant strategic implications
for what we do with our lives here and now, individually and collectively. One important implication is the replacement of
postmodern scepticism and relativism with an evolutionary grand narrative that can guide humanity to participate successfully
in the future evolution of life in the universe.
19.
We put forward the hypothesis that there exist three basic attitudes towards inconsistencies within world views: (1) The inconsistency
is tolerated temporarily and is viewed as an expression of a temporary lack of knowledge due to an incomplete or wrong theory.
The resolution of the inconsistency is believed to be inherent to the improvement of the theory. This improvement ultimately
resolves the contradiction and therefore we call this attitude the ‘regularising’ attitude; (2) The inconsistency is tolerated
and both contradicting elements in the theory are retained. This attitude integrates the inconsistency and leads to a paraconsistent
calculus; therefore we will call it the paraconsistent attitude. (3) In the third attitude, both elements of the inconsistency
are considered to be false, and the ‘real situation’ is considered to be something different that cannot be described by the theory
constructively. This indicates the incompleteness of the theory and leads us to a paracomplete calculus; therefore we call
it the paracomplete attitude. We illustrate these three attitudes by means of two ‘paradoxical’ situations in quantum mechanics:
wave-particle duality and non-locality.
This revised version was published online in July 2006 with corrections to the Cover Date.
20.
Pierre Uzan 《Foundations of Science》2010,15(1):1-28
This paper suggests an epistemic interpretation of Belnap’s branching space-times theory based on Everett’s relative state
formulation of the measurement operation in quantum mechanics. The informational branching models of the universe are evolving
structures defined from a partial ordering relation on the set of memory states of the impersonal observer. The totally ordered
set of their information contents defines a linear “time” scale to which the decoherent alternative histories of the informational
universe can be referred—which is quite necessary for assigning them a probability distribution. The “historical” state of
a physical system is represented in an appropriate extended Hilbert space and an algebra of multi-branch operators is developed.
An age operator computes the informational depth of historical states and its standard deviation can be used to provide a
universal information/energy uncertainty relation. An information operator computes the encoding complexity of historical
states, the rate of change of its average value accounting for the process of correlation destruction inherent to the branching
dynamics. In the informational branching models of the universe, the asymmetry of phenomena in nature appears as a mere consequence
of the subject’s activity of measuring, which defines the flow of time-information.