Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
In the 1930s, Carnap set out to incorporate psychology into the unity of science by showing that all cognitively meaningful sentences of psychology can be translated into the language of physics. I will argue that Carnap, relying on his notion of protocol languages, defends a physicalistic philosophy of psychology that shows due appreciation of ‘introspection’ as a strictly subjective but reliable way to verify sentences about one’s own mind. Second, I will point out that Carnap’s philosophy of psychology not only takes overt behaviour into account but must comprise neurophysiological processes as well. Finally, I will show that Carnap aims to develop a philosophy of psychology that does justice to the ongoing changeability of scientific knowledge.

2.
Online auctions have become increasingly popular in recent years, and a growing body of research addresses them; modeling online auction price curves is one of the most interesting problems in this literature. Most research treats price curves as deterministic functions, ignoring the random effects of external and internal factors. To account for this randomness, a more realistic model based on stochastic differential equations is proposed in this paper. The online auction price is modeled by a stochastic differential equation whose deterministic part is equivalent to the second-order differential equation model proposed in Wang et al. (Journal of the American Statistical Association, 2008, 103, 1100–1118). The model also includes a component representing measurement errors. Explicit expressions for the likelihood function are obtained, from which statistical inference can be conducted. The forecast accuracy of the proposed model is compared with that of the ODE (ordinary differential equation) approach; simulation results show that the proposed model performs better.
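As a rough illustration of the kind of model the abstract describes, the sketch below simulates a latent price path by Euler–Maruyama discretization: the drift follows a generic second-order ODE (a stand-in, since the abstract does not spell out the Wang et al. specification), diffusion noise enters the velocity equation, and i.i.d. measurement error is layered on the observed price. The drift form and all parameter values (a, b, sigma, tau) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_auction_price(T=7.0, n=500, a=0.8, b=0.1,
                           sigma=0.05, tau=0.02):
    """Euler-Maruyama path for a latent price x(t) whose drift follows a
    generic second-order ODE x'' = -a*x' + b (illustrative stand-in for
    the paper's drift), with diffusion noise sigma on the velocity and
    i.i.d. N(0, tau^2) measurement error on the observed price."""
    dt = T / n
    x = np.empty(n + 1)   # latent price
    v = np.empty(n + 1)   # latent price velocity
    x[0], v[0] = 1.0, 0.1
    for k in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))                 # Brownian increment
        v[k + 1] = v[k] + (-a * v[k] + b) * dt + sigma * dW
        x[k + 1] = x[k] + v[k] * dt
    y = x + rng.normal(0.0, tau, n + 1)                   # observed noisy price
    return np.linspace(0.0, T, n + 1), x, y

t, latent, observed = simulate_auction_price()            # one 7-day auction path
```

Fitting such a model would maximize the likelihood of the observed prices over (a, b, sigma, tau); the simulation above is only the forward, data-generating half of that exercise.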

3.
The most recent results in the dynamics of infinite systems of particles have shown that there is no necessary connection between the performance of a deterministic supertask and the non-conservation of energy. It is much more difficult to prove that there is likewise no necessary connection between the performance of an indeterministic supertask and the non-conservation of energy; this paper does just that. The requirement that energy be conserved therefore does not allow us to exclude as “non-physical” either of the two major types of supertask considered to date. The paper also proposes, in this context, a “hidden variables method” for proving general compatibility results in dynamics.

4.
5.
The Copenhagen interpretation of quantum mechanics is the dominant view of the theory among working physicists, if not philosophers. There are, however, several strains of Copenhagenism extant, each largely accepting Born's assessment of the wave function as the most complete possible specification of a system and the notion of collapse as a completely random event. This paper outlines three of these sub-interpretations, typing them by what the author of each names as the trigger of quantum-mechanical collapse. Visions of the theory from von Neumann, Heisenberg, and Wheeler offer different mechanisms to break the continuous, deterministic, superposition-laden quantum chain and yield discrete, probabilistic, classical results in response to von Neumann's catastrophe of infinite regress.

6.
It is generally thought that determinism is incompatible with objective chances for particular events other than 1 and 0. However, there are important scientific theories whose laws are deterministic but which also assign non-trivial probabilities to events. The most important of these is statistical mechanics, whose probabilities are essential to the explanation of thermodynamic phenomena. These probabilities are often construed as ‘ignorance’ probabilities representing our lack of knowledge concerning the microstate. I argue that this construal is incompatible with the role of probability in explanation and laws. This is the ‘paradox of deterministic probabilities’. After surveying the usual list of accounts of objective chance and finding them inadequate, I argue that an account of chance sketched by David Lewis can be modified to solve the paradox of deterministic probabilities and to provide an adequate account of the probabilities in deterministic theories like statistical mechanics.

7.
Can stable regularities be explained without appealing to governing laws or any other modal notion? In this paper, I consider what I will call a ‘Humean system’—a generic dynamical system without guiding laws—and assess whether it could display stable regularities. First, I present what can be interpreted as an account of the rise of stable regularities, following Strevens (2003), which has been applied to explain the patterns of complex systems (such as those from meteorology and statistical mechanics). Second, since this account presupposes that the underlying dynamics displays deterministic chaos, I assess whether it can be adapted to cases where the underlying dynamics is not chaotic but truly random—that is, cases where there is no dynamics guiding the time evolution of the system. If so, the resulting stable, apparently non-accidental regularities are the fruit of what can be called statistical necessity rather than of a primitive physical necessity.
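To make the core idea concrete, the minimal sketch below (my illustrative reconstruction, not an example from the paper) compares the long-run frequency with which a chaotic logistic-map orbit visits [0, 0.5] against a fair-coin benchmark: a deterministic-chaotic system and a genuinely random one yield the same stable regularity. The seed value and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def logistic_frequency(x0=0.2024, n=100_000):
    """Long-run fraction of time a chaotic logistic-map orbit (r = 4)
    spends in [0, 0.5]. The map's invariant density assigns measure 1/2
    to that interval, so the frequency stabilizes near 0.5 for almost
    every seed: a stable regularity from purely deterministic dynamics."""
    x, hits = x0, 0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)   # deterministic chaotic update
        hits += x < 0.5
    return hits / n

print(logistic_frequency())                   # ~0.5 from deterministic chaos
print((rng.random(100_000) < 0.5).mean())     # ~0.5 from genuine randomness
```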

8.
Based on da Costa's and French's notions of partial structures and pragmatic truth, this paper examines two possible characterizations of the concept of empirical adequacy, one depending on the notion of partial isomorphism, the other on the hierarchy of partial models of phenomena, and both compatible with an empiricist view. These formulations can then be employed to illuminate certain aspects of scientific practice.

“An empirical theory must single out a specific part of the world, establish reference to that part, and say—by way of contingent, substantial claim about the world—that its models fit that. Now, how exactly can this be done?” (Bas C. van Fraassen, 1993, p. 9)

9.
The basic notion of an objective probability is that of a probability determined by the physical structure of the world. On this understanding, there are subjective credences that do not correspond to objective probabilities, such as credences concerning rival physical theories. The main question for objective probabilities is how they are determined by the physical structure.

In this paper, I survey three ways of understanding objective probability: stochastic dynamics, Humean chances, and deterministic chances (typicality). The first is the obvious way to understand the probabilities of quantum mechanics via a collapse theory such as GRW; the last is the way to understand the probabilities in the context of a deterministic theory such as Bohmian mechanics. Humean chances provide a more abstract and general account of chance locutions, independent of dynamical considerations.

10.
The purpose of this paper is to investigate simultaneously several important issues that characterize the dynamic and stochastic behavior of beta coefficients for individual stocks and affect the forecasting of stock returns. The issues include randomness, nonstationarity, and shifts in the mean and variance parameters of the beta coefficient, and are addressed within the framework of variable-mean-response (VMR) random coefficients models in which the problem of heteroscedasticity is present. Estimation is done using a four-step generalized least squares method. The hypotheses concerning randomness and nonstationarity of betas are tested. The time paths, sizes, and marginal rates of mean shifts are determined. The issue of variance shift is examined on the basis of five special tests, called T*, B, S', G and W. The impacts of dynamic and stochastic instability on the estimation of betas are then tested by a nonparametric procedure. Finally, the VMR models' ability to forecast stock returns is evaluated against the standard capital asset pricing model. The empirical findings shed new light on the continuing debate as to whether the beta coefficient is random and nonstationary, and have important implications for modeling and forecasting the measurement of performance and the determination of stock returns.
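The abstract does not spell out the four-step GLS estimator, so the sketch below shows only the simplest diagnostic in the same spirit: a rolling-window OLS re-estimate of the CAPM beta on synthetic returns whose true beta shifts mid-sample. The window length, return parameters, and shift point are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def rolling_beta(stock, market, window=60):
    """Re-estimate the CAPM slope (beta) over a rolling window. A flat
    path suggests a stable beta; a drifting path is informal evidence of
    the randomness/nonstationarity a VMR-type model is built to handle."""
    betas = []
    for start in range(len(stock) - window + 1):
        x = market[start:start + window]
        y = stock[start:start + window]
        slope, _intercept = np.polyfit(x, y, 1)   # OLS slope = beta estimate
        betas.append(slope)
    return np.array(betas)

# Synthetic monthly returns whose true beta jumps from 0.8 to 1.4 mid-sample
market = rng.normal(0.01, 0.04, 240)
true_beta = np.where(np.arange(240) < 120, 0.8, 1.4)
stock = true_beta * market + rng.normal(0.0, 0.02, 240)

betas = rolling_beta(stock, market)
print(betas[0], betas[-1])   # near 0.8 early in the sample, near 1.4 late
```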

11.
James McAllister’s 2003 article, ‘Algorithmic randomness in empirical data’, claims that empirical data sets are algorithmically random and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and discuss the matter in the framework of Minimum Message Length (MML) inference.
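The compressibility point can be checked directly: the sketch below compresses a structured "empirical" series (a noisy sine wave quantized at finite measurement precision) and a uniformly random byte string with zlib. The structured data compresses well below its raw size while the random benchmark does not. The data-generating choices (signal shape, noise level, 33 quantization levels) are mine for illustration, not drawn from the paper.

```python
import zlib
import numpy as np

rng = np.random.default_rng(3)

def compressed_ratio(data: bytes) -> float:
    """Compressed size over raw size under zlib; values well below 1
    indicate the byte string is far from algorithmically random."""
    return len(zlib.compress(data, 9)) / len(data)

# Structured "empirical" series: noisy sine wave, quantized at the finite
# precision every real measurement has (33 levels here)
t = np.linspace(0.0, 20.0 * np.pi, 10_000)
signal = np.sin(t) + rng.normal(0.0, 0.02, t.size)
structured = np.clip(np.round(16.0 * (signal + 1.0)), 0, 32).astype(np.uint8).tobytes()

# Benchmark: uniformly random bytes, essentially incompressible
random_bytes = rng.integers(0, 256, 10_000, dtype=np.uint8).tobytes()

print(compressed_ratio(structured))    # well below 1.0: compressible
print(compressed_ratio(random_bytes))  # ~1.0 (zlib overhead may push it above)
```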

12.
One thing about technical artefacts that needs to be explained is how their physical make-up, or structure, enables them to fulfil the behaviour associated with their function, or, more colloquially, how they work. In this paper I develop an account of such explanations based on the familiar notion of mechanistic explanation. To accomplish this, I (1) outline two explanatory strategies that provide two different types of insight into an artefact’s functioning, and (2) show how human action inevitably plays a role in artefact explanation. I then use my own account to criticize other recent work on mechanistic explanation and conclude with some general implications for the philosophy of explanation.

13.
By the middle of the eighteenth century the new science had challenged the intellectual primacy of common experience in favor of recondite, expert and even counter-intuitive knowledge increasingly mediated by specialized instruments. Meanwhile modern philosophy had also problematized the perceptions of common experience — in the case of David Hume this included our perception of causal relations in nature, a fundamental precondition of scientific endeavor.

In this article I argue that, in responding to the ‘problem of induction’ as advanced by Hume, Reid reformulated Aristotelian foundationalism in distinctly modern terms. An educator and mathematician self-consciously working within the framework of the new science, Reid articulated a philosophical foundation for natural knowledge anchored in the human constitution and in processes of adjudication in an emerging modern public sphere of enlightened discourse. Reid thereby transformed one of the bases of Aristotelian science — common experience — into a philosophically and socially justified notion of ‘common sense’. Reid's intellectual concerns had as much to do with the philosophy of science as they did with moral philosophy or epistemology proper, and were bound up with wider social and scientific changes taking place in the early modern period.

14.
This paper takes another look at a case study which has featured prominently in a variety of arguments for rival realist positions. After critically reviewing the previous commentaries on the theory shift that took place in the transition from Fresnel’s ether theory to Maxwell’s electromagnetic theory of optics, it will defend a slightly different reading of this historical case study. Central to this task is the notion of explanatory approximate truth, a concept which must be carefully analysed to begin with. With this notion properly understood, it will finally be argued, the popular Fresnel–Maxwell case study points towards a novel formulation of scientific realism.

15.
The (Strong) Free Will Theorem (FWT) of Conway and Kochen (2009) on the one hand follows from uncontroversial parts of modern physics and elementary mathematical and logical reasoning, but on the other hand seems predicated on an undefined notion of free will (allowing physicists to “freely choose” the settings of their experiments). This makes the theorem philosophically vulnerable, especially if it is construed as a proof of indeterminism or even of libertarian free will (as Conway & Kochen suggest).

However, Cator and Landsman (Foundations of Physics 44, 781–791, 2014) previously gave a reformulation of the FWT that does not presuppose indeterminism, but rather assumes a mathematically specific form of such “free choices” even in a deterministic world (based on a non-probabilistic independence assumption). In the present paper, which is a philosophical sequel to the one just mentioned, I argue that the concept of free will used in the latter version of the FWT is essentially the one proposed by Lewis (1981), also known as ‘local miracle compatibilism’ (of which I give a mathematical interpretation that may be of independent interest beyond its application to the FWT). As such, the (reformulated) FWT in my view challenges compatibilist free will à la Lewis (albeit in a contrived way, via bipartite EPR-type experiments), while falling short of supporting libertarian free will.

16.
According to my interpretation, based on the entirety of Michael Polanyi's epistemological works, his theory of tacit knowing is conceived of as three models tied together by the central feature of Intellectual Passions as integrator. The models are progressively refined forms of his first conception of tacit knowing: ‘we know more than we can tell’. The three models are: the Gestalt-Perception Model, based on the gestalt notion of part-whole relations; the Action-Guiding Model, incorporating the phenomenological-existential notion of intentional action; and the Semiotic Model, an abstract conception of action directed to meaning, showing that tacit knowing has a ‘from-to structure’ (from subsidiary awareness to focal awareness). In the Semiotic Model, integration is named by the logical term ‘inference’. Polanyi's conception of reality and his theory of truth are introduced and linked to the models, to show why his epistemology is not subjectivist and his theory of truth is not relativist.

17.
Chromogranin A (CHGA) is ubiquitously expressed in secretory cells of the endocrine, neuroendocrine, and neuronal tissues. Although this protein has long been known as a marker for neuroendocrine tumors, its role in cardiovascular disease states including essential hypertension (EH) has only recently been recognized. It acts as a prohormone giving rise to bioactive peptides such as vasostatin-I (human CHGA 1–76) and catestatin (human CHGA 352–372) that exhibit several cardiovascular regulatory functions. CHGA is over-expressed but catestatin is diminished in EH. Moreover, genetic variants in the promoter, catestatin, and 3′-untranslated regions of the human CHGA gene alter autonomic activity and blood pressure. Consistent with these findings, targeted ablation of this gene causes severe arterial hypertension and ventricular hypertrophy in mice. Transgenic expression of the human CHGA gene or exogenous administration of catestatin restores blood pressure in these mice. Thus, the accumulated evidence establishes CHGA as a novel susceptibility gene for EH.

18.
The continental drift research programme reigns supreme within the geological community. The programme achieved its regal status only within the last decade. Its ascension to the summit took over fifty years, and required numerous switchbacks. Although its climb may seem haphazard, I argue that there is an overall rationale to its development which is partially elucidated by the account of scientific growth and change as put forth by Imre Lakatos. However, at least two alterations must be made in Lakatos' analysis. One concerns his analysis of ‘novel fact’, and the other is concerned with his thesis that the hard core of a research programme remains the same throughout the programme's lifetime. I consider and reject Elie Zahar's notion of ‘novel fact’, introduce an alternative notion of ‘novel fact’, and argue that Lakatos and his followers must abandon the thesis that a research programme's hard core is immune from change, but that they can do so without endangering Lakatos' overall account of scientific growth and change.

19.
It is argued that we cannot understand the notion of proper functions of artefacts independently of social notions. Functions of artefacts are related to social facts via the use of artefacts. The arguments in this article can be used to improve existing function theories that look to the causal history of artefacts to determine the function. A view that takes the intentions of designers into account to determine the proper function is both natural and often correct, but it is shown that there are exceptions to this. Taking a social constitutive element into account may amend these backward-looking theories. An improved theory may either have a disjunctive form—either the history or collective intentions determine the proper function—or, as is suggested in the article, take the form of an encompassing account that views the designers’ intentions as social, in so far as they are accepted by the users. Designers have authority, which is a social fact. The views argued for here are applied to two existing theories of artefact functions, a causal-historic approach and an action-theoretic approach.

20.
Typical worlds     
Hugh Everett III presented pure wave mechanics, sometimes referred to as the many-worlds interpretation, as a solution to the quantum measurement problem. While pure wave mechanics is an objectively deterministic physical theory with no probabilities, Everett sought to show how the theory might be understood as making the standard quantum statistical predictions as appearances to observers who were themselves described by the theory. We will consider his argument and how it depends on a particular notion of branch typicality. We will also consider responses to Everett and the relationship between typicality and probability. The suggestion will be that pure wave mechanics requires a number of significant auxiliary assumptions in order to make anything like the standard quantum predictions.
