Similar Documents
1.
This essay explores an alternative pathway to Alzheimer’s dementia that focuses on damage to small blood vessels, rather than late-stage toxic amyloid deposits, as the primary pathogenic mechanism leading to irreversible dementia. While the end-stage pathology of Alzheimer’s disease (AD) is well known, the pathogenic processes that lead to disease are often assumed to be due to toxic amyloid peptides that act on neurons, causing neuronal dysfunction and eventually neuronal cell death. Speculations as to what initiates the pathogenic cascade have included toxic Aβ peptide aggregates, oxidative damage, and inflammation, but none explains why neurons die. Recent high-resolution NMR studies of living patients show that lesions in white matter regions of the brain precede the appearance of amyloid deposits and are correlated with damaged small blood vessels. To appreciate the pathogenic potential of damaged small blood vessels in the brain, it is useful to consider the clinical course and pathogenesis of CADASIL, a heritable arteriopathy that leads to damaged small blood vessels and irreversible dementia. CADASIL is strikingly similar to early-onset AD in that it is caused by germ-line mutations in NOTCH3 that generate toxic protein aggregates similar to those attributed to mutant forms of the amyloid precursor protein and presenilin genes. Since NOTCH3 mutants clearly damage small blood vessels in white matter regions of the brain, leading to dementia, we speculate that both forms of dementia may share a similar pathogenesis: ischemic damage caused by blocking blood flow or by impeding the removal of toxic protein aggregates through retrograde vascular clearance mechanisms.

2.
The aim of the paper is threefold. Its first aim is to defend Eric Watkins’s claim that for Kant, a cause is not an event but a causal power: a power that is borne by a substance and that, when active, brings about its effect, i.e. a change of the states of another substance, by generating a continuous flow of intermediate states of that substance. The second aim of the paper is to argue against Watkins that the Kantian concept of causal power is not the pre-critical concept of real ground but the category of causality, and that Kant holds with Hume that causal laws cannot be inferred non-inductively (that he accordingly has no intention to show, in the Second Analogy or elsewhere, that events fall under causal laws). The third aim of the paper is to compare the Kantian position on causality with central tenets of contemporary powers ontology: it argues that, unlike the variants endorsed by contemporary powers theorists, the Kantian variants of these tenets are resistant to the objections that neo-Humeans raise against them.

3.
Every model leaves out or distorts some factors that are causally connected to its target phenomenon—the phenomenon that it seeks to predict or explain. If we want to make predictions, and we want to base decisions on those predictions, what is it safe to omit or to simplify, and what ought a causal model to describe fully and correctly? A schematic answer: the factors that matter are those that make a difference to the target phenomenon. There are several ways to understand difference-making. This paper advances a view as to which is the most relevant to the forecaster and the decision-maker. It turns out that the right notion of difference-making for thinking about idealization in prediction is also the right notion for thinking about idealization in explanation; this suggests a carefully circumscribed version of Hempel’s famous thesis that there is a symmetry between explanation and prediction.

4.
It is frequently said that belief aims at truth, in an explicitly normative sense—that is, that one ought to believe the proposition that p if, and only if, p is true. This truth norm is often invoked to explain why we should seek evidential justification in our beliefs, or why we should try to be rational in our belief formation—it is because we ought to believe the truth that we ought to follow the evidence in belief revision. In this paper, I argue that this view is untenable. The truth norm clashes with plausible evidential norms in a wide range of cases, such as when we have excellent but misleading evidence for a falsehood or no evidence for a truth. I will consider various ways to resolve this conflict and argue that none of them works. However, I will ultimately attempt to vindicate the love of truth by arguing that knowledge is the proper epistemic goal. The upshot is that we should not aim merely to believe the truth; we should aim to know it.

5.
Structuralists typically appeal to some variant of the widely popular ‘mapping’ account of mathematical representation to suggest that mathematics is applied in modern science to represent the world’s physical structure. However, in this paper, I argue that this realist interpretation of the ‘mapping’ account presupposes that physical systems possess an ‘assumed structure’ that is at odds with modern physical theory. Through two detailed case studies concerning the use of the differential and variational calculus in modern dynamics, I show that the formal structure that we need to assume in order to apply the mapping account is inconsistent with the way in which mathematics is applied in modern physics. The problem is that a realist interpretation of the ‘mapping’ account imposes too severe a constraint on the conformity that must exist between mathematics and nature in order for mathematics to represent the structure of a physical system.

6.
The view that the fundamental kind properties are intrinsic properties enjoys reflexive endorsement by most metaphysicians of science. But ontic structural realists deny that there are any fundamental intrinsic properties at all. Given that structuralists distrust intuition as a guide to truth, and given that we currently lack a fundamental physical theory that we could consult instead in order to settle the issue, it might seem as if there is simply nowhere for this debate to go at present. However, I will argue that there exists an as-yet untapped resource for arguing for ontic structuralism – namely, the way that fundamentality is conceptualized in our most fundamental physical frameworks. By arguing that physical objects must be subject to the ‘Goldilocks principle’ if they are to count as fundamental at all, I argue that we can no longer view the majority of properties defining them as intrinsic. As such, ontic structural realism can be regarded as the most promising metaphysics for fundamental physics, and this is so even though we do not yet claim to know precisely what that fundamental physics is.

7.
Dingle contended that Einstein’s special theory of relativity was physically impossible for the simple reason that it required clocks to be simultaneously faster and slower than each other. McCrea refuted Dingle using an operationalist argument. An operational response did not satisfy Popper, who wrote an unpublished essay to counter Dingle’s claim. Popper developed an analysis that avoided operationalism by using a system of coinciding clocks, contending that this system showed that special relativity withstood Dingle’s criticism that it was not a symmetrical and consistent physical theory. However, Popper mistakenly included an asymmetric calculation in his analysis. Once this is corrected, the amended result supports Dingle’s position. Popper went on to argue that to avoid determinism, special relativity had to be reconciled with absolute time; this too supports Dingle. Popper’s failure to refute Dingle calls into question his claim that ‘the observer’ is superfluous to special relativity.

8.
Extensional scientific realism is the view that each believable scientific theory is supported by the unique first-order evidence for it and that if we want to believe that it is true, we should rely on its unique first-order evidence. In contrast, intensional scientific realism is the view that all believable scientific theories have a common feature and that we should rely on it to determine whether a theory is believable or not. Fitzpatrick argues that extensional realism is immune, while intensional realism is not, to the pessimistic induction. I reply that if extensional realism overcomes the pessimistic induction at all, that is because it implicitly relies on the theoretical resource of intensional realism. I also argue that extensional realism, by nature, cannot embed a criterion for distinguishing between believable and unbelievable theories.

9.
This article critically appraises David Bloor’s recent attempts to refute criticisms levelled at the Strong Programme’s social constructionist approach to scientific knowledge. Bloor has tried to argue, contrary to some critics, that the Strong Programme is not idealist in character and that it does not involve a challenge to the credibility of scientific knowledge. I argue that Bloor’s attempt to deflect the charge of idealism, which calls on the self-referential theory of social institutions, is partially successful. However, I suggest that although the Strong Programme should not be accused of ‘strong idealism’, it is still vulnerable to the criticism that it entails a form of ‘weak idealism’. The article moves on to argue that, contrary to Bloor, constructionist approaches do challenge the credibility of the scientific knowledge that they analyse. I conclude by arguing that sociological analyses of scientific knowledge can be conducted without the weak idealism and the credibility-challenging assumptions of the Strong Programme approach.

10.
This article discusses the intersection of science and culture in the marketplace and explores the ways in which quack and medicinal radium products were packaged and labelled in the early twentieth-century US. Although there is an interesting, growing body of literature by art historians on package design, historians of science and medicine have paid little to no attention to the ways in which scientific and medical objects that were turned into commodities were packaged and commercialized. Thinking about packages not as mere containers but as multifunctional tools adds to historical accounts of science as a sociocultural enterprise and reminds us that science has always been part of consumer culture. This paper suggests that, far from being mere receptacles that preserve their contents and facilitate their transportation, the bottles and boxes that contained radium products functioned as commercial and epistemic devices. It was the 1906 Pure Food and Drug Act that enforced such functions. Packages worked as commercial devices in the sense that they were used to boost sales. In addition, 'epistemic' points to the fact that the package is an artefact that ascribes meaning to and shapes its content while at the same time working as a device for distinguishing between patent and orthodox medicines.

11.
There is a long-standing debate in the philosophy of mind and philosophy of science regarding how best to interpret the relationship between neuroscience and psychology. It has traditionally been argued that either the two domains will evolve and change over time until they converge on a single unified account of human behaviour, or else that they will continue to work in isolation given that they identify properties and states that exist autonomously from one another (due to the multiple realizability of psychological states). In this paper, I argue that progress in psychology and neuroscience is contingent on the fact that both of these positions are false. Contra the convergence position, I argue that the theories of psychology and the theories of neuroscience are scientifically valuable as representational tools precisely because they cannot be integrated into a single account. However, contra the autonomy position, I propose that the theories of psychology and neuroscience are deeply dependent on one another for further refinement and improvement. In this respect, there is an irreconcilable codependence between psychology and neuroscience that is necessary for both domains to improve and progress. The two domains are forever linked while simultaneously being unable to integrate.

12.
This discussion note responds to objections by Twardy, Gardner, and Dowe to my earlier claim that empirical data sets are algorithmically incompressible. Twardy, Gardner, and Dowe hold that many empirical data sets are compressible by the Minimum Message Length technique and offer this as evidence that these data sets are algorithmically compressible. I reply that the compression achieved by the Minimum Message Length technique is different from algorithmic compression. I conclude that Twardy, Gardner, and Dowe fail to establish that empirical data sets are algorithmically compressible.
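
For context, here are the two notions of compression the note distinguishes, stated in a standard textbook form (these definitions are not taken from the note itself; the model class H, the code-length function L, and the universal machine U are generic symbols introduced for illustration). Minimum Message Length compression measures a two-part code length relative to a fixed class of models, whereas algorithmic compression concerns Kolmogorov complexity, the length of the shortest program that reproduces the data:

% Illustrative definitions (assumed notation): two-part MML code length of
% data x relative to a model class H with code-length function L, versus
% Kolmogorov complexity K(x) on a fixed universal machine U.
\[
  \mathrm{MML}(x) \;=\; \min_{h \in H} \bigl[\, L(h) + L(x \mid h) \,\bigr],
  \qquad
  K(x) \;=\; \min \{\, \lvert p \rvert : U(p) = x \,\}.
\]

The note’s reply can then be read as the claim that a short code in the first, model-relative sense does not by itself establish a short program in the second, absolute sense.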

13.
I argue that the Oxford school Everett interpretation is internally incoherent, because we cannot claim that in an Everettian universe the kinds of reasoning we have used to arrive at our beliefs about quantum mechanics would lead us to form true beliefs. I show that in an Everettian context, the experimental evidence that we have available could not provide empirical confirmation for quantum mechanics, and moreover that we would not even be able to establish reference to the theoretical entities of quantum mechanics. I then consider a range of existing Everettian approaches to the probability problem and show that they do not succeed in overcoming this incoherence.

14.
Under what circumstances, if any, are we warranted to assert that a theory is true or at least has some truth content? Scientific realists answer that such assertions are warranted only for those theories or theory-parts that enjoy explanatory and predictive success. A number of challenges to this answer have emerged, chief among them those arising from scientific theory change. For example, if, as scientific realists suggest, successive theories are to get increasingly closer to the truth, theory changes must not undermine (i) the accumulation of explanatory and predictive success and (ii) the theoretical content responsible for that success. In this paper we employ frame theory to test the extent to which certain theoretical claims of the outdated caloric theory of heat, claims that, prima facie at least, were used to produce some of that theory’s success, have survived into the theory that superseded it, i.e. the kinetic theory of heat. Our findings lend credence to structural realism, the view that scientific theories at best reveal only structural features of the unobservable world.

15.
According to what has become a standard history of quantum mechanics, in 1932 von Neumann persuaded the physics community that hidden variables are impossible as a matter of principle, after which leading proponents of the Copenhagen interpretation put the situation to good use by arguing that the completeness of quantum mechanics was undeniable. This state of affairs lasted, so the story continues, until Bell in 1966 exposed von Neumann’s proof as obviously wrong. The realization that von Neumann’s proof was fallacious then rehabilitated hidden variables and made serious foundational research possible again. It is often added in recent accounts that von Neumann’s error had been spotted almost immediately by Grete Hermann, but that her discovery was of no effect due to the dominant Copenhagen Zeitgeist. We shall attempt to tell a story that is more historically accurate and less ideologically charged. Most importantly, von Neumann never claimed to have shown the impossibility of hidden variables tout court, but argued that hidden-variable theories must possess a structure that deviates fundamentally from that of quantum mechanics. Both Hermann and Bell appear to have missed this point; moreover, both raised unjustified technical objections to the proof. Von Neumann’s argument was basically that hidden-variables schemes must violate the “quantum principle” that physical quantities are to be represented by operators in a Hilbert space. As a consequence, hidden-variables schemes, though possible in principle, necessarily exhibit a certain kind of contextuality. As we shall illustrate, early reactions to Bohm’s theory are in agreement with this account. Leading physicists pointed out that Bohm’s theory has the strange feature that pre-existing particle properties do not generally reveal themselves in measurements, in accordance with von Neumann’s result. They did not conclude that the “impossible was done” and that von Neumann had been shown wrong.
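
For context, a standard reconstruction of the proof’s contested step (not a detail spelled out in the abstract itself): the technical objections of Hermann and Bell were directed at von Neumann’s assumption that expectation values are additive for all observables, commuting or not,

% Von Neumann's additivity assumption (standard reconstruction): for any
% observables A and B and any ensemble, including hypothetical
% dispersion-free ones, expectation values add.
\[
  \langle A + B \rangle \;=\; \langle A \rangle + \langle B \rangle .
\]

Quantum states satisfy this identity, but hypothetical dispersion-free states need not; on the account defended here, von Neumann used the assumption not to rule out hidden variables outright, but to show that hidden-variable schemes must depart from the Hilbert-space representation of physical quantities.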

16.
According to recent commentators, medieval natural philosophers endorsed immanent teleology, the view that natural agents possess immanent active powers to achieve certain ends. Moreover, some scholars have argued that Robert Boyle, despite his intentions, failed to eliminate immanent teleology from his natural philosophy. I argue in this paper that it is not at all clear that immanent teleology was widely endorsed in the medieval period. Moreover, I argue that a proper understanding of immanent teleology, and why it was rejected by mainstream medieval natural philosophers, reveals that Boyle did not fail to eliminate immanent teleology from his natural philosophy. I conclude that any attempt to describe the break between medieval and early modern natural philosophy in terms of a break with immanent teleology is likely not on target.

17.
There is growing evidence that explanatory considerations influence how people change their degrees of belief in light of new information. Recent studies indicate that this influence is systematic and may result from people’s following a probabilistic update rule. While formally very similar to Bayes’ rule, the rule or rules people appear to follow are different from, and inconsistent with, that better-known update rule. This raises the question of the normative status of those updating procedures. Is the role explanation plays in people’s updating their degrees of belief a bias? Or are people right to update on the basis of explanatory considerations, in that this offers benefits that could not be had otherwise? Various philosophers have argued that any reasoning at variance with Bayesian principles is to be rejected, and so explanatory reasoning, insofar as it deviates from Bayes’ rule, can only be fallacious. We challenge this claim by showing how the kind of explanation-based update rules to which people seem to adhere make it easier to strike the best balance between being fast learners and being accurate learners. Borrowing from the literature on ecological rationality, we argue that what counts as the best balance is intrinsically context-sensitive, and that a main advantage of explanatory update rules is that, unlike Bayes’ rule, they have an adjustable parameter which can be fine-tuned per context. The main methodology to be used is agent-based optimization, which also allows us to take an evolutionary perspective on explanatory reasoning.
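
To illustrate the kind of rule at issue, here is a sketch of one explanationist variant found in the literature (the bonus parameter c and the best-explanation indicator δ are assumptions introduced for illustration; this is not necessarily the exact rule the paper studies). The hypothesis judged to best explain the evidence E receives an additive bonus before renormalization, and setting c = 0 recovers Bayes’ rule:

% Sketch of an explanation-based update rule: the hypothesis that best
% explains E gets an additive bonus c >= 0 before renormalization;
% c = 0 reduces the rule to Bayes' rule.
\[
  P'(H_i) \;=\; \frac{P(H_i)\,P(E \mid H_i) + c\,\delta_i}
                     {\sum_j \bigl[\, P(H_j)\,P(E \mid H_j) + c\,\delta_j \,\bigr]},
  \qquad
  \delta_i =
  \begin{cases}
    1 & \text{if } H_i \text{ best explains } E,\\
    0 & \text{otherwise.}
  \end{cases}
\]

On this reading, the adjustable parameter mentioned in the abstract corresponds to c, and fine-tuning it per context trades off speed of learning against accuracy.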

18.
In this paper we will try to explain how Leibniz justified the idea of an exact arithmetical quadrature. We will do this by comparing Leibniz's exposition with that of John Wallis. In short, we will show that the idea of exactitude in matters of quadratures relies on two fundamental requisites that, according to Leibniz, infinite series must satisfy: regularity and completeness. In the first part of this paper, we will go deeper into three main features of Leibniz's method: it is an infinitesimal method, it looks for an arithmetical quadrature, and it proposes a result that is not approximate but exact. After that, we will deal with the requisite of the regularity of the series, pointing out that, unlike the inductive method proposed by Wallis, Leibniz propounded some sort of intellectual recognition of what is invariant in the series. Finally, we will consider the requisite of completeness of the series. We will see that, although both Wallis and Leibniz introduced the supposition of completeness, the German thinker went beyond the English mathematician, since he recognized that it is not necessary to look for a number for the quadrature of the circle, given that we have a series that is equal to the area of that curvilinear figure.
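
For reference, the result in question is Leibniz's arithmetical quadrature of the circle: the area of the circle is captured exactly by an infinite series of rational terms rather than by any single number,

% Leibniz's arithmetical quadrature of the circle: the area of the unit
% quarter-circle expressed as an exact alternating series of rational terms.
\[
  \frac{\pi}{4} \;=\; 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots
               \;=\; \sum_{k=0}^{\infty} \frac{(-1)^{k}}{2k+1}.
\]

This is the sense in which, as the abstract notes, one need not look for a number for the quadrature: the series itself is the exact answer.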

19.
When it comes to supporting the main ontic structural realist thesis, that we are better off with a metaphysics purged of objects, its proponents have to meet several challenges, three of which are to ensure that objects can be recast in terms of structure alone at both the level of theory and the level of ontology, to justify on physical grounds that structure exists in the world in a way that affects the goings-on in it, and to show that the relation between objects and structure is non-reciprocal, so that structure is ontologically prior to objects but not the converse. Assuming—tacitly or explicitly—that the objects of physics can be thus recast using symmetry group structure, supporters of the thesis have, therefore, to meet the remaining challenges. The present paper discusses and contests two such attempts, which typify arguments in favor of ontic structural realism from high-energy physics.
