Similar Literature
20 similar documents retrieved (search time: 906 ms)
1.
One important role of belief systems is to allow us to represent information about a certain domain of inquiry. This paper presents a formal framework to accommodate such information representation. Three cognitive models for representing information are discussed: conceptual spaces (Gärdenfors in Conceptual spaces: the geometry of thought. MIT Press, Cambridge, 2000), state-spaces (van Fraassen in Quantum mechanics: an empiricist view. Clarendon Press, Oxford, 1991), and the problem spaces familiar from artificial intelligence. After indicating their weaknesses in dealing with partial information, it is argued that an alternative, formulated in terms of partial structures (da Costa and French in Science and partial truth. Oxford University Press, New York, 2003), can be provided which not only captures the positive features of these models, but also accommodates the partiality of information ubiquitous in science and mathematics.

2.
Let \( \mathcal{G} = (G,w) \) be a weighted simple finite connected graph, that is, let G be a simple finite connected graph endowed with a function w from the set of the edges of G to the set of real numbers. For any subgraph G′ of G, we define w(G′) to be the sum of the weights of the edges of G′. For any vertices i, j of G, we define \( D_{\{i,j\}}(\mathcal{G}) \) to be the minimum of the weights of the simple paths of G joining i and j. The \( D_{\{i,j\}}(\mathcal{G}) \) are called 2-weights of \( \mathcal{G} \). Weighted graphs and their reconstruction from 2-weights have applications in several disciplines, such as biology and psychology. Let \( \{m_I\}_{I\in \binom{\{1,\dots,n\}}{2}} \) and \( \{M_I\}_{I\in \binom{\{1,\dots,n\}}{2}} \) be two families of positive real numbers parametrized by the 2-subsets of {1, …, n} with \( m_I \le M_I \) for any I; we study when there exist a positive-weighted graph \( \mathcal{G} \) and an n-subset {1, …, n} of the set of its vertices such that \( D_I(\mathcal{G}) \in [m_I, M_I] \) for any \( I\in \binom{\{1,\dots,n\}}{2} \). Then we study the analogous problem for trees, both in the case of positive weights and in the case of general weights.
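Since all weights here are positive, the minimum-weight simple path joining i and j is an ordinary shortest path, so the 2-weights can be computed with Dijkstra's algorithm. A minimal stdlib-only sketch (function names are ours, not from the paper):

```python
import heapq
from itertools import combinations

def two_weights(n, edges):
    """2-weights D_{i,j} of a positive-weighted graph on vertices 0..n-1,
    given as (u, v, w) triples.  With positive weights the minimum-weight
    simple path is a shortest path, so Dijkstra applies."""
    adj = {v: [] for v in range(n)}
    for u, v, w in edges:
        adj[u].append((v, w))
        adj[v].append((u, w))

    def dijkstra(src):
        dist = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale heap entry
            for v, w in adj[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return dist

    return {frozenset((i, j)): dijkstra(i)[j]
            for i, j in combinations(range(n), 2)}

D = two_weights(4, [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0), (0, 3, 5.0)])
# D_{0,3} = 4.0: the path 0-1-2-3 beats the direct edge of weight 5.0
```

The reconstruction problem the paper studies runs in the other direction: given intervals [m_I, M_I] for the 2-weights, decide whether such a graph (or tree) exists.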

3.
In compositional data analysis, an observation is a vector containing nonnegative values, only the relative sizes of which are considered to be of interest. Without loss of generality, a compositional vector can be taken to be a vector of proportions that sum to one. Data of this type arise in many areas including geology, archaeology, biology, economics and political science. In this paper we investigate methods for classification of compositional data. Our approach centers on the idea of using the α-transformation to transform the data and then classifying the transformed data via regularized discriminant analysis and the k-nearest neighbors algorithm. Using the α-transformation generalizes two rival approaches in compositional data analysis, one (when α = 1) that treats the data as though they were Euclidean, ignoring the compositional constraint, and another (when α = 0) that employs Aitchison’s centered log-ratio transformation. A numerical study with several real datasets shows that whether using α = 1 or α = 0 gives better classification performance depends on the dataset, and moreover that using an intermediate value of α can sometimes give better performance than using either 1 or 0.
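As an illustration, here is a minimal sketch of one common parametrization of the α-transformation (the function name is ours, and the paper may use a slightly different affine normalization; the α = 0 branch is Aitchison's clr, and the α = 1 branch is an affine map of the raw proportions):

```python
import numpy as np

def alpha_transform(x, a):
    """One common parametrization of the α-transformation of a
    compositional vector x (nonnegative entries summing to 1):
    for a = 0 it is Aitchison's centered log-ratio (clr); for a != 0
    it is an affine map of the power-transformed composition."""
    x = np.asarray(x, dtype=float)
    D = x.size
    if a == 0:
        logx = np.log(x)
        return logx - logx.mean()      # clr coordinates
    u = x**a / np.sum(x**a)            # closed power transform
    return (D * u - 1.0) / a

x = np.array([0.2, 0.3, 0.5])
z0 = alpha_transform(x, 0)   # log-ratio treatment (a = 0)
z1 = alpha_transform(x, 1)   # 'Euclidean' treatment (a = 1)
```

As a → 0 the general branch converges to the clr branch, which is what lets a single parameter interpolate between the two rival approaches.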

4.
We argue from the Church-Turing thesis (Kleene, Mathematical logic. New York: Wiley, 1967) that a program can be considered as equivalent to a formal language similar to predicate calculus, where predicates can be taken as functions. We can relate such a calculus to Wittgenstein’s first major work, the Tractatus, and use the Tractatus and its theses as a model of the formal classical definition of a computer program. However, Wittgenstein found flaws in his initial great work and explored these flaws in a new thesis described in his second great work, the Philosophical Investigations. The question we address is “can computer science make the same leap?” We propose, because of the flaws identified by Wittgenstein, that computers will never have the possibility of natural communication with people unless they become active participants of human society. The essential difference between formal models used in computing and human communication is that formal models are based upon rational sets, whereas people are not so restricted. We introduce irrational sets as a concept that requires the use of an abductive inference system. However, formal models are still considered central to our means of using hypotheses through deduction to make predictions about the world. These formal models must be continually updated in response to people’s changing ways of seeing the world. We propose that one mechanism used to keep track of these changes is the Peircean abductive loop.

5.
When selecting a hierarchical clustering method, theoretical properties may give some insight into which method is most suitable for a given clustering problem. Herein, we study some basic properties of two hierarchical clustering methods: α-unchaining single linkage, SL(α), and a modified version of it, SL?(α). We compare the results with the properties satisfied by the classical linkage-based hierarchical clustering methods.
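The chaining behavior that motivates "unchaining" variants is easy to see in plain single linkage. A naive stdlib-only sketch (the function name is ours; the paper's SL(α) modification is not reproduced here):

```python
def single_linkage_merge_heights(points, dist):
    """Naive agglomerative single linkage: repeatedly merge the two
    clusters at minimum pointwise distance, recording merge heights."""
    clusters = [[p] for p in points]
    heights = []
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        heights.append(d)
        clusters[i] = clusters[i] + clusters[j]   # merge j into i
        del clusters[j]
    return heights

h = single_linkage_merge_heights([0.0, 0.1, 0.2, 5.0],
                                 lambda a, b: abs(a - b))
# merges at heights 0.1, 0.1, then 4.8
```

On this toy set, single linkage chains the three left-hand points together at height 0.1 before bridging the 4.8 gap; long chains of such small steps can merge visually distinct groups, which is the pathology the α-unchaining variants address.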

6.
Many important concepts of the calculus are difficult to grasp, and they may appear epistemologically unjustified. For example, how does a real function appear in “small” neighborhoods of its points? How does it appear at infinity? Diagrams allow us to overcome the difficulty of constructing representations of mathematical critical situations and objects. For example, they actually reveal the behavior of a real function not “close to” a point (as in the standard limit theory) but “in” the point. In our research we are interested in diagrams which play an optical role (microscopes and “microscopes within microscopes”, telescopes, windows), a mirror role (to externalize rough mental models), and an unveiling role (to help create new and interesting mathematical concepts, theories, and structures). In this paper we describe some examples of optical diagrams as a particular kind of epistemic mediator able to perform the explanatory abductive task of providing a better understanding of the calculus, through a non-standard model of analysis. We also maintain they can be used in many other different epistemological and cognitive situations.

7.
A rapidly emerging hegemonic neuro-culture and a booming neural subjectivity signal the entry point for an inquiry into the status of the signifier neuro as a universal passe-partout. The wager of this paper is that the various (mis)appropriations of the neurosciences in the media and in academia itself point to something essential, if not structural, in connection with both the discipline of the neurosciences and the current socio-cultural and ideological climate. Starting from the case of neuroeducation (the application of neuroscience within education), the genealogy of the neurological turn is linked to the history of psychology and its inextricable bond with processes of psychologisation. If the neurological turn risks not merely neglecting the dimension of critique, but also obviating its possibility, then revivifying a psy-critique (understanding the academified modern subject as grounded in the scientific point of view from nowhere) might be necessary in order to understand today’s neural subjectivity and its place within current biopolitics.

8.
The Mathematical Intelligencer recently published a note by Y. Sergeyev that challenges both mathematics and intelligence. We examine Sergeyev’s claims concerning his purported Infinity computer. We compare his grossone system with the classical Levi-Civita fields and with the hyperreal framework of A. Robinson, and analyze the related algorithmic issues inevitably arising in any genuine computer implementation. We show that Sergeyev’s grossone system is unnecessary and vague, and that whatever consistent subsystem could be salvaged is subsumed entirely within a stronger and clearer system (IST). Lou Kauffman, who published an article on a grossone, places it squarely outside the historical panorama of ideas dealing with infinity and infinitesimals.

9.
Infinity, in various guises, has been invoked recently in order to ‘explain’ a number of important questions regarding observable phenomena in science, and in particular in cosmology. Such explanations are by their nature speculative. Here we introduce the notions of relative infinity, closure, and economy of explanation, and ask: to what extent can explanations involving relative or real constructed infinities be treated as reasonable?

10.
This paper introduces a novel mixture model-based approach to the simultaneous clustering and optimal segmentation of functional data, which are curves presenting regime changes. The proposed model consists of a finite mixture of piecewise polynomial regression models. Each piecewise polynomial regression model is associated with a cluster, and within each cluster, each piecewise polynomial component is associated with a regime (i.e., a segment). We derive two approaches to learning the model parameters: the first is an estimation approach which maximizes the observed-data likelihood via a dedicated expectation-maximization (EM) algorithm, yielding a fuzzy partition of the curves into K clusters, obtained at convergence by maximizing the posterior cluster probabilities. The second is a classification approach which optimizes a specific classification likelihood criterion through a dedicated classification expectation-maximization (CEM) algorithm. The optimal curve segmentation is performed by using dynamic programming. In the classification approach, both the curve clustering and the optimal segmentation are performed simultaneously as the CEM learning proceeds. We show that the classification approach is a probabilistic version generalizing the deterministic K-means-like algorithm proposed in Hébrail, Hugueney, Lechevallier, and Rossi (2010). The proposed approach is evaluated using simulated curves and real-world curves. Comparisons with alternatives including regression mixture models and the K-means-like algorithm for piecewise regression demonstrate the effectiveness of the proposed approach.

11.
Measurements of p variables for n samples are collected into an n×p matrix X, where the samples belong to one of k groups. The group means are separated by Mahalanobis distances. Canonical variate analysis (CVA) optimally represents the group means of X in an r-dimensional space. This can be done by maximizing a ratio criterion (basically one-dimensional) or, more flexibly, by minimizing a rank-constrained least-squares fitting criterion (which is not confined to being one-dimensional but depends on defining an appropriate Mahalanobis metric). In modern n < p problems, where the within-groups scatter matrix W is not of full rank, the ratio criterion is shown not to be coherent, but the fit criterion, with attention to the associated metrics, readily generalizes. In this context we give a unified generalization of CVA, introducing two metrics, one in the range space of W and the other in the null space of W, that have links with Mahalanobis distance. This generalization is computationally efficient, since it requires only the spectral decomposition of an n×n matrix.
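A minimal numerical sketch of classical CVA (function names are ours; the pseudo-inverse shortcut below only loosely handles rank deficiency and does not implement the paper's two-metric n < p treatment): the canonical directions are eigenvectors of W⁺B, where B is the between-groups scatter.

```python
import numpy as np

def cva_scores(X, groups, r=2):
    """Classical CVA sketch: directions maximizing between-group scatter
    relative to within-group scatter are eigenvectors of pinv(W) @ B."""
    X = np.asarray(X, dtype=float)
    groups = np.asarray(groups)
    mu = X.mean(axis=0)
    p = X.shape[1]
    W = np.zeros((p, p))                  # within-groups scatter
    B = np.zeros((p, p))                  # between-groups scatter
    for g in np.unique(groups):
        Xg = X[groups == g]
        mg = Xg.mean(axis=0)
        W += (Xg - mg).T @ (Xg - mg)
        B += len(Xg) * np.outer(mg - mu, mg - mu)
    evals, evecs = np.linalg.eig(np.linalg.pinv(W) @ B)
    order = np.argsort(-evals.real)
    V = evecs[:, order[:r]].real          # top-r canonical directions
    return X @ V

# Two Gaussian groups separated along the first coordinate
rng = np.random.default_rng(0)
Xa = rng.normal(0.0, 0.1, size=(20, 2))
Xb = rng.normal(0.0, 0.1, size=(20, 2)) + np.array([3.0, 0.0])
Z = cva_scores(np.vstack([Xa, Xb]), [0] * 20 + [1] * 20, r=1)
# the two groups are well separated on the first canonical variate
```

When n < p, W above is singular and pinv silently projects onto its range space; the paper's contribution is precisely a principled metric for the discarded null space.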

12.
The Classification Literature Automated Search Service, an annual bibliography based on citation of one or more of a set of around 80 book or journal publications, ran from 1972 to 2012. We analyze here the years 1994 to 2012. The Service, as it was termed, was produced by the Classification Society. In earlier decades it was distributed as a diskette or CD with the Journal of Classification. Among our findings are the following: an enormous increase in scholarly production in this area after approximately 2000, and another large increase in the quantity of publications from approximately 2004. The more than 93,000 bibliographic records used are the basis for determining the research disciplines that we analyze. We make all this data available for download, formatted in text and in XML, with an accompanying Apache Lucene/Solr search interface.

13.
14.
In this article I take the US television series Mad Men (2007–present) as an exemplary ‘approximation’, a term I adopt to signal the way in which certain texts construct a changeable, fluid ‘truth’ resulting from collisions, exchange and dialectical argument. Approximations are layered, their formal layerings mirroring a layered, multifaceted argument. Mad Men integrates and represents real historical events within a fictional setting, an act that suggests that an event or action can never be finished and fixed, never closed to reassessment. Specifically, this article examines ‘The Grown Ups’, Episode 12 of Season 3, which charts the events of 22 November 1963, the day Kennedy was assassinated. Although we might be able to bring to mind the images and conspiracy theories that have been made available since (such as Abraham Zapruder’s 8 mm home movie footage of the assassination), these images were not available at the time. Mad Men as a series strives to represent its historical milieu as authentically as possible, so the characters re-enact 22 November 1963 by watching only what was on television that day (the news bulletin; Walter Cronkite’s announcement that Kennedy is dead). The contemporary backdrop to these events, including the resonances of ‘9/11’ through Mad Men, informs and collides with the authenticity on the screen.

15.
This article deals with the aesthetics of the art documentary of the 1940s and 1950s, which can be considered the Golden Age of the genre. Prior to the breakthrough of television in Europe, which would usurp and standardize the art documentary, cinematic reproductions of artworks resulted in experimental shorts that were highly self-reflexive. These films became visual laboratories to investigate the tensions between movement and stasis, the two- and three-dimensional, and the real and the artificial: a film on art was self-consciously presented as an art film. Focusing on La Leggenda di Sant’Orsola (1948) by Luciano Emmer and Le Monde de Paul Delvaux (1946) by Henri Storck, this article also investigates how the animation of the static image by the film medium relates to Surrealist practices.

16.
This commentary on Edwin Carels’ essay “Revisiting Tom Tom: Performative anamnesis and autonomous vision in Ken Jacobs’ appropriations of Tom Tom the Piper’s Son” broadens the media-archaeological framework in which Carels places his text. Notions such as Huhtamo’s topos and Zielinski’s “deep time” are brought into the discussion in order to point out the difficulty of seeing what there is to see, and to question the position of the viewer in front of experimental films like Tom Tom the Piper’s Son and its remakes.

17.
18.
In his Foundations of a General Theory of Manifolds, Georg Cantor praised Bernard Bolzano as a clear defender of actual infinity who had the courage to work with infinite numbers. At the same time, he sharply criticized the way Bolzano dealt with them. Cantor’s concept was based on the existence of a one-to-one correspondence, while Bolzano insisted on Euclid’s axiom of the whole being greater than a part. Cantor’s set theory eventually prevailed and became a formal basis of contemporary mathematics, while Bolzano’s approach is generally considered a step in the wrong direction. In the present paper, we demonstrate that a fragment of Bolzano’s theory of infinite quantities retaining the part-whole principle can be extended to a consistent mathematical structure. It can be interpreted in several possible ways. We obtain either a linearly ordered ring of finite and infinitely great quantities, or a partially ordered ring containing infinitely small, finite and infinitely great quantities. These structures can be used as a basis of the infinitesimal calculus similarly as in non-standard analysis, whether in its full version employing ultrafilters due to Abraham Robinson, or in the recent “cheap version” avoiding ultrafilters due to Terence Tao.

19.
The question of Heidegger’s reflections on technology is explored in terms of ‘living with’ technology, drawing on the socio-theoretical (Edinburgh) notion of ‘entanglement’, towards a review of Heidegger’s understanding of technology and media, including the entertainment industry and modern digital life. I explore Heidegger’s reflections on Gelassenheit by way of the Japanese aesthetic conception of life and of art as wabi-sabi, understood with respect to Heidegger’s Gelassenheit as the art of Verfallenheit.

20.
I agree with my readers on the necessary alliance of personal agency and collective agency. My point is to prioritize the former. The reasons to prioritize the latter were excellent, and it was undoubtedly a sound decision to start with this scenario: political and institutional improvement of the collectives, enlightened by progressive social sciences. My argument for suggesting a different priority—toward personal microactions and their emergent effects—relies on the opinion that we are stuck in a sustainability crisis due to our current approach. In the question “whose agency now?”, my stress is on “now”.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) | ICP license: 京ICP备09084417号