Similar Documents
20 similar documents found.
1.
Many important concepts of the calculus are difficult to grasp, and they may appear epistemologically unjustified. For example, how does a real function appear in “small” neighborhoods of its points? How does it appear at infinity? Diagrams allow us to overcome the difficulty of constructing representations of mathematical critical situations and objects. For example, they actually reveal the behavior of a real function not “close to” a point (as in the standard limit theory) but “in” the point. In our research we are interested in diagrams that play an optical role (microscopes and “microscopes within microscopes”, telescopes, windows), a mirror role (to externalize rough mental models), and an unveiling role (to help create new and interesting mathematical concepts, theories, and structures). In this paper we describe some examples of optical diagrams as a particular kind of epistemic mediator able to perform the explanatory abductive task of providing a better understanding of the calculus, through a non-standard model of analysis. We also maintain that they can be used in many other epistemological and cognitive situations.
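As a concrete illustration of what an optical “microscope” reveals “in” a point, here is a standard worked identity from non-standard analysis (a textbook example, not taken from the paper): for f differentiable at a and ε an infinitesimal,
\[
  f(a+\varepsilon) = f(a) + f'(a)\,\varepsilon + \delta\,\varepsilon, \qquad \delta \approx 0,
  \qquad\text{so}\qquad
  f'(a) = \operatorname{st}\!\left(\frac{f(a+\varepsilon)-f(a)}{\varepsilon}\right),
\]
where st denotes the standard part. Under infinite magnification the graph of f at a is indistinguishable from the straight line of slope f′(a), which is exactly what the microscope diagram displays.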

2.
In his Foundations of a General Theory of Manifolds, Georg Cantor praised Bernard Bolzano as a clear defender of actual infinity who had the courage to work with infinite numbers. At the same time, he sharply criticized the way Bolzano dealt with them. Cantor’s concept was based on the existence of a one-to-one correspondence, while Bolzano insisted on Euclid’s axiom that the whole is greater than a part. Cantor’s set theory eventually prevailed and became a formal basis of contemporary mathematics, while Bolzano’s approach is generally considered a step in the wrong direction. In the present paper, we demonstrate that a fragment of Bolzano’s theory of infinite quantities retaining the part-whole principle can be extended to a consistent mathematical structure, which can be interpreted in several possible ways. We obtain either a linearly ordered ring of finite and infinitely great quantities, or a partially ordered ring containing infinitely small, finite, and infinitely great quantities. These structures can be used as a basis for the infinitesimal calculus, much as in non-standard analysis, whether in its full version due to Abraham Robinson, which employs ultrafilters, or in the recent “cheap version” due to Terence Tao, which avoids them.
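The tension between the two criteria can be made explicit with a standard illustration (Galileo’s paradox, not an example drawn from the paper): take the set of perfect squares
\[
  S = \{1, 4, 9, 16, \dots\} \subsetneq \mathbb{N}.
\]
By the part-whole principle defended by Bolzano, S is strictly smaller than \(\mathbb{N}\); by Cantor’s criterion, the bijection \(n \mapsto n^2\) makes S and \(\mathbb{N}\) equinumerous. A consistent theory of infinite quantities must restrict or reinterpret one of the two principles, which is what the structures described above do for the part-whole side.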

3.
4.
One important role of belief systems is to allow us to represent information about a certain domain of inquiry. This paper presents a formal framework to accommodate such information representation. Three cognitive models for representing information are discussed: conceptual spaces (Gärdenfors in Conceptual spaces: the geometry of thought. MIT Press, Cambridge, 2000), state-spaces (van Fraassen in Quantum mechanics: an empiricist view. Clarendon Press, Oxford, 1991), and the problem spaces familiar from artificial intelligence. After indicating their weakness in dealing with partial information, it is argued that an alternative, formulated in terms of partial structures (da Costa and French in Science and partial truth. Oxford University Press, New York, 2003), can be provided which not only captures the positive features of these models but also accommodates the partiality of information ubiquitous in science and mathematics.

5.
The Mathematical Intelligencer recently published a note by Y. Sergeyev that challenges both mathematics and intelligence. We examine Sergeyev’s claims concerning his purported Infinity computer. We compare his grossone system with the classical Levi-Civita fields and with the hyperreal framework of A. Robinson, and analyze the related algorithmic issues inevitably arising in any genuine computer implementation. We show that Sergeyev’s grossone system is unnecessary and vague, and that whatever consistent subsystem could be salvaged is subsumed entirely within a stronger and clearer system (IST). Lou Kauffman, who published an article on a grossone, places it squarely outside the historical panorama of ideas dealing with infinity and infinitesimals.

6.
Let \( \mathcal{G} = (G, w) \) be a weighted simple finite connected graph, that is, let G be a simple finite connected graph endowed with a function w from the set of the edges of G to the set of real numbers. For any subgraph G′ of G, we define w(G′) to be the sum of the weights of the edges of G′. For any vertices i, j of G, we define \( D_{\{i,j\}}(\mathcal{G}) \) to be the minimum of the weights of the simple paths of G joining i and j. The \( D_{\{i,j\}}(\mathcal{G}) \) are called the 2-weights of \( \mathcal{G} \). Weighted graphs and their reconstruction from 2-weights have applications in several disciplines, such as biology and psychology. Let \( \{m_I\}_{I \in \binom{\{1,\dots,n\}}{2}} \) and \( \{M_I\}_{I \in \binom{\{1,\dots,n\}}{2}} \) be two families of positive real numbers parametrized by the 2-subsets of {1, …, n}, with \( m_I \le M_I \) for every I; we study when there exist a positive-weighted graph \( \mathcal{G} \) and an n-subset {1, …, n} of the set of its vertices such that \( D_I(\mathcal{G}) \in [m_I, M_I] \) for every \( I \in \binom{\{1,\dots,n\}}{2} \). Then we study the analogous problem for trees, both in the case of positive weights and in the case of general weights.
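To make the definition concrete: for strictly positive weights, the minimum weight of a simple path joining i and j is attained by a shortest path, so the 2-weights can be computed with Dijkstra’s algorithm. The sketch below is illustrative only (the graph, the function names, and the example data are made up; it is not the authors’ code).

import heapq

def two_weights(n, edges):
    # edges: list of (i, j, w) with w > 0; vertices are 0..n-1.
    # Returns a dict {frozenset({i, j}): D_{i,j}} over all 2-subsets.
    adj = {v: [] for v in range(n)}
    for i, j, w in edges:
        adj[i].append((j, w))
        adj[j].append((i, w))

    def dijkstra(src):
        dist = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in adj[u]:
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    D = {}
    for i in range(n):
        di = dijkstra(i)
        for j in range(i + 1, n):
            D[frozenset({i, j})] = di[j]
    return D

# Example: a positive-weighted 4-cycle with one chord.
print(two_weights(4, [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.5), (3, 0, 4.0), (0, 2, 2.5)]))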

7.
In the selection of a hierarchical clustering method, theoretical properties may give some insight into which method is the most suitable for a given clustering problem. Herein, we study some basic properties of two hierarchical clustering methods: α-unchaining single linkage, SL(α), and a modified version of it, SL*(α). We compare the results with the properties satisfied by the classical linkage-based hierarchical clustering methods.
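SL(α) and SL*(α) are the paper’s own constructions; as a point of comparison only, the sketch below runs classical single linkage (SciPy’s standard hierarchy routines, on invented data) in the chaining situation that motivates such modifications.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two well-separated blobs joined by a sparse "bridge" of points:
# the classical chaining scenario for single linkage.
blob_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(30, 2))
blob_b = rng.normal(loc=[5.0, 0.0], scale=0.3, size=(30, 2))
bridge = np.column_stack([np.linspace(0.5, 4.5, 9), np.zeros(9)])
X = np.vstack([blob_a, blob_b, bridge])

Z = linkage(X, method="single")                # classical single linkage
labels = fcluster(Z, t=2, criterion="maxclust")
print(np.bincount(labels)[1:])                 # cluster sizes; chaining tends to merge the two blobs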

8.
In compositional data analysis, an observation is a vector containing nonnegative values, only the relative sizes of which are considered to be of interest. Without loss of generality, a compositional vector can be taken to be a vector of proportions that sum to one. Data of this type arise in many areas including geology, archaeology, biology, economics and political science. In this paper we investigate methods for classification of compositional data. Our approach centers on the idea of using the α-transformation to transform the data and then to classify the transformed data via regularized discriminant analysis and the k-nearest neighbors algorithm. Using the α-transformation generalizes two rival approaches in compositional data analysis, one (when α = 1) that treats the data as though they were Euclidean, ignoring the compositional constraint, and another (when α = 0) that employs Aitchison’s centered log-ratio transformation. A numerical study with several real datasets shows that whether using α = 1 or α = 0 gives better classification performance depends on the dataset, and moreover that using an intermediate value of α can sometimes give better performance than using either 1 or 0.
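To make the role of α concrete, here is a hedged sketch of the power-ratio core of the α-transformation on invented data; it omits the Helmert sub-matrix projection used in the compositional literature, so it is a simplification rather than the authors’ exact transformation. The α = 0 branch is Aitchison’s centered log-ratio, and the α ≠ 0 branch converges to it as α → 0.

import numpy as np

def alpha_transform(x, alpha):
    # x: array of shape (n, D); rows are compositions summing to one.
    x = np.asarray(x, dtype=float)
    D = x.shape[1]
    if alpha == 0.0:
        logx = np.log(x)
        return logx - logx.mean(axis=1, keepdims=True)          # centered log-ratio
    u = x**alpha / (x**alpha).sum(axis=1, keepdims=True)        # power-ratio composition
    return (D * u - 1.0) / alpha

x = np.array([[0.2, 0.3, 0.5],
              [0.1, 0.1, 0.8]])                                 # toy compositions
for a in (1.0, 0.5, 1e-6, 0.0):
    print(a, alpha_transform(x, a)[0].round(4))                 # 1e-6 is already close to the clr row

The transformed data can then be fed to any Euclidean classifier (for instance scikit-learn’s KNeighborsClassifier), with α treated as a tuning parameter.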

9.
In this article I take the US television series Mad Men (2007–present) as an exemplary ‘approximation’, a term I adopt to signal the way in which certain texts construct a changeable, fluid ‘truth’ resulting from collisions, exchange and dialectical argument. Approximations are layered, their formal layerings mirroring a layered, multifaceted argument. Mad Men integrates and represents real historical events within a fictional setting, an act that suggests that an event or action can never be finished, fixed and closed to reassessment. Specifically, this article examines ‘The Grown Ups’, Episode 12 of Season 3, which charts the events of 22 November 1963, the day Kennedy was assassinated. Although we might be able to bring to mind the images and conspiracy theories that have been made available since (such as Abraham Zapruder’s 8 mm home movie footage of the assassination), these images were not available at the time. Mad Men as a series always strives to represent its historical milieu as authentically as possible, so the characters re-enact 22 November 1963 as authentically as possible by watching only what was on television that day (the news bulletin, Walter Cronkite’s announcement that Kennedy is dead). The contemporary backdrop to these events, including the resonances of ‘9/11’ throughout Mad Men, informs and collides with the authenticity on the screen.

10.
The question of Heidegger’s reflections on technology is explored in terms of ‘living with’ technology, drawing on the socio-theoretical (Edinburgh) notion of ‘entanglement’, towards a review of Heidegger’s understanding of technology and media, including the entertainment industry and modern digital life. I explore Heidegger’s reflections on Gelassenheit by way of the Japanese aesthetic conception of life and of art as wabi-sabi, understood with respect to Heidegger’s Gelassenheit as the art of Verfallenheit.

11.
This commentary on Edwin Carels’ essay “Revisiting Tom Tom: Performative anamnesis and autonomous vision in Ken Jacobs’ appropriations of Tom Tom the Piper’s Son” broadens the media-archaeological framework in which Carels places his text. Notions such as Huhtamo’s topos and Zielinski’s “deep time” are brought into the discussion in order to point out the difficulty of seeing what there is to see, and to question the position of the viewer in front of experimental films like Tom Tom the Piper’s Son and its remakes.

12.
Infinity, in various guises, has recently been invoked in order to ‘explain’ a number of important questions regarding observable phenomena in science, and in particular in cosmology. Such explanations are by their nature speculative. Here we introduce the notions of relative infinity, closure, and economy of explanation, and ask to what extent explanations involving relative or real constructed infinities can be treated as reasonable.

13.
Abraham Robinson’s framework for modern infinitesimals was developed half a century ago. It enables a re-evaluation of the procedures of the pioneers of mathematical analysis. Their procedures have often been viewed through the lens of the success of the Weierstrassian foundations. We propose a view that does not pass through that lens, by means of proxies for such procedures in the modern theory of infinitesimals. The real accomplishments of calculus and analysis were based primarily on the elaboration of novel techniques for solving problems rather than on a quest for ultimate foundations. It may be hopeless to interpret historical foundations in terms of a punctiform continuum, but arguably it is possible to interpret historical techniques and procedures in terms of modern ones. Our proposed formalisations do not mean that Fermat, Gregory, Leibniz, Euler, and Cauchy were pre-Robinsonians, but rather indicate that Robinson’s framework is more helpful in understanding their procedures than a Weierstrassian framework.
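A typical proxy of this kind (a standard example from the non-standard literature, not a claim about the paper’s specific reconstructions) renders Cauchy’s infinitesimal definition of continuity as follows: f is continuous at a real point x precisely when an infinitesimal increment produces an infinitesimal change,
\[
  \alpha \approx 0 \;\Longrightarrow\; f(x+\alpha) \approx f(x),
\]
where ≈ denotes infinite closeness in Robinson’s hyperreal framework.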

14.
15.
This article deals with the aesthetics of the art documentary of the 1940s and 1950s, which can be considered the Golden Age of the genre. Prior to the breakthrough of television in Europe, which would usurp and standardize the art documentary, cinematic reproductions of artworks resulted in experimental shorts that were highly self-reflexive. These films became visual laboratories for investigating the tensions between movement and stasis, the two- and the three-dimensional, and the real and the artificial—a film on art was self-consciously presented as an art film. Focusing on La Leggenda di Sant’Orsola (1948) by Luciano Emmer and Le Monde de Paul Delvaux (1946) by Henri Storck, this article also investigates how the animation of the static image by the film medium relates to Surrealist practices.

16.
Why Axiomatize?     
Axiomatization is uncommon outside mathematics, partly because it is often viewed as embalming, partly because the best-known axiomatizations have serious shortcomings, and partly because it has had only one eminent champion, namely David Hilbert (Math Ann 78:405–415, 1918). The aims of this paper are (a) to describe what will be called dual axiomatics, which concerns not just the formalism but also the meaning (reference and sense) of the key concepts; and (b) to suggest that every instance of dual axiomatics presupposes some philosophical view or other. To illustrate these points, a theory of solidarity will be crafted and axiomatized, and certain controversies in both classical and quantum physics, as well as in the philosophy of mind, will be briefly discussed. The upshot of this paper is that dual axiomatics, unlike the purely formal axiomatics favored by the structuralist school, is not a luxury but a tool that helps resolve some scientific controversies.

17.
This paper introduces a novel mixture model-based approach to the simultaneous clustering and optimal segmentation of functional data, that is, curves presenting regime changes. The proposed model consists of a finite mixture of piecewise polynomial regression models. Each piecewise polynomial regression model is associated with a cluster, and within each cluster, each piecewise polynomial component is associated with a regime (i.e., a segment). We derive two approaches to learning the model parameters. The first is an estimation approach that maximizes the observed-data likelihood via a dedicated expectation-maximization (EM) algorithm; a fuzzy partition of the curves into K clusters is then obtained at convergence by maximizing the posterior cluster probabilities. The second is a classification approach that optimizes a specific classification likelihood criterion through a dedicated classification expectation-maximization (CEM) algorithm. The optimal curve segmentation is performed by dynamic programming. In the classification approach, both the curve clustering and the optimal segmentation are performed simultaneously as the CEM learning proceeds. We show that the classification approach is a probabilistic version that generalizes the deterministic K-means-like algorithm proposed in Hébrail, Hugueney, Lechevallier, and Rossi (2010). The proposed approach is evaluated using simulated curves and real-world curves. Comparisons with alternatives, including regression mixture models and the K-means-like algorithm for piecewise regression, demonstrate the effectiveness of the proposed approach.
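The dynamic-programming segmentation step can be illustrated in isolation: under the assumption of a single curve, R contiguous segments, and per-segment polynomial least-squares fits, the recursion below minimizes the total squared error. This is an illustrative sketch on invented data, not the authors’ implementation.

import numpy as np

def segment_cost(t, y, deg):
    # cost[a, b] = squared error of a degree-`deg` polynomial fit on points a..b-1.
    n = len(t)
    cost = np.full((n + 1, n + 1), np.inf)
    for a in range(n):
        for b in range(a + deg + 2, n + 1):          # enough points to fit
            coef = np.polyfit(t[a:b], y[a:b], deg)
            resid = y[a:b] - np.polyval(coef, t[a:b])
            cost[a, b] = float(resid @ resid)
    return cost

def optimal_segmentation(t, y, R, deg=1):
    n = len(t)
    cost = segment_cost(t, y, deg)
    dp = np.full((R + 1, n + 1), np.inf)              # dp[r, b]: best cost of r segments on points 0..b-1
    arg = np.zeros((R + 1, n + 1), dtype=int)
    dp[0, 0] = 0.0
    for r in range(1, R + 1):
        for b in range(1, n + 1):
            cand = np.array([dp[r - 1, a] + cost[a, b] for a in range(b)])
            dp[r, b], arg[r, b] = cand.min(), int(cand.argmin())
    cuts, b = [], n                                    # backtrack the change points
    for r in range(R, 0, -1):
        cuts.append(arg[r, b])
        b = arg[r, b]
    return sorted(cuts[:-1]), float(dp[R, n])

t = np.linspace(0.0, 1.0, 60)
y = np.where(t < 0.5, 1.0, 3.0) + 0.05 * np.random.default_rng(1).normal(size=60)
print(optimal_segmentation(t, y, R=2, deg=0))          # change point near index 30

In the full model, a segmentation of this kind is carried out per cluster within the EM or CEM iterations, rather than on a single curve as above.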

18.
Dendrograms used in data analysis are ultrametric spaces, hence objects of nonarchimedean geometry. It is known that there exist p-adic representations of dendrograms. Completed by a point at infinity, they can be viewed as subtrees of the Bruhat-Tits tree associated with the p-adic projective line. The implications are that certain moduli spaces known in algebraic geometry are in fact p-adic parameter spaces of dendrograms, and that stochastic classification can also be handled within this framework. Finally, we calculate the topology of the hidden part of a dendrogram.
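The opening claim can be checked numerically: the cophenetic distances read off a dendrogram satisfy the strong (ultrametric) triangle inequality d(x, z) ≤ max(d(x, y), d(y, z)). The sketch below uses SciPy’s standard hierarchy tools on invented data; it illustrates the ultrametric property only and does not touch the paper’s p-adic machinery.

import numpy as np
from itertools import permutations
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import squareform

X = np.random.default_rng(0).normal(size=(12, 3))    # toy data
Z = linkage(X, method="average")                     # any agglomerative method works
D = squareform(cophenet(Z))                          # cophenetic (dendrogram) distances

ok = all(D[x, z] <= max(D[x, y], D[y, z]) + 1e-12
         for x, y, z in permutations(range(len(D)), 3))
print("ultrametric inequality holds:", ok)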

19.
I agree with my readers on the necessary alliance of personal agency and collective agency. My point is to prioritize the former. The reasons to prioritize the latter were excellent, and it was undoubtedly a sound decision to start with this scenario: political and institutional improvement of the collectives, enlightened by progressive social sciences. My argument for suggesting a different priority—toward personal microactions and their emergent effects—relies on the opinion that we are stuck in a sustainability crisis due to our current approach. In the question “whose agency now?”, my stress is on “now”.

20.
Love and Realism     
In this reply I try to show that, contrary to Milberry’s apparent assertion, the general intellect of the multitude does not have the explanatory robustness she attributes to it (following both Virno and the Hardt and Negri of the Empire trilogy). Digital network technologies are currently overwhelmingly effective in proletarianizing and disempowering the cognitariat, and only an active technopolitics of deproletarianization could reverse this hegemonic situation. In my response to Verbeek, I attempt to correct his misinterpretation (shared by Milberry) of the Stieglerian approach as being dialectical in nature, and to show that, far from reinstating the humanist dichotomy between human beings and technologies, my analysis assumes their original, albeit fundamentally ambiguous and even ‘uncanny’ [unheimlich], interconnection. I conclude by pointing out some implications of this view for a ‘really realistic’ political theory of technology.
