Found 20 similar documents; search took 15 milliseconds.
1.
Bert Schroer 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2010,41(4):293-308
The main topics of this second part of a two-part essay are some consequences of vacuum polarization, the most important physical manifestation of modular localization. Besides philosophically unexpected consequences, it has led to a new constructive “outside-inwards approach” in which pointlike fields, and the compactly localized operator algebras they generate, arise only from intersections of much simpler algebras localized in noncompact wedge regions, whose generators exhibit extremely mild, almost-free-field behavior. Another consequence of vacuum polarization presented in this essay is the localization entropy near a causal horizon, which follows a logarithmically modified area law in which a dimensionless area appears (the area divided by the square of d_R, where d_R is the thickness of a light-sheet). There are arguments that this logarithmically modified area law corresponds to the volume law of standard heat-bath thermal behavior. We also explain the symmetry-enhancing effect of holographic projections onto the causal horizon of a region and show that the resulting infinite-dimensional symmetry groups contain the Bondi–Metzner–Sachs group.
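In symbols, the dimensionless area is $\mathcal{A} = A/d_R^{2}$. One schematic way to render the “logarithmically modified area law”, offered here only as a reading of the abstract (the notation $S_{\mathrm{loc}}$ and the placement of the logarithmic factor are assumptions; the precise form and coefficients are given in the paper itself), is

$$S_{\mathrm{loc}} \;\propto\; \frac{A}{d_R^{2}}\,\bigl|\ln d_R\bigr|,$$

so that the entropy diverges as the light-sheet thickness $d_R \to 0$.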
2.
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes along three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architectures. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market extends far beyond the community of quantum theory experts. These claims are substantiated by an investigation of density functional theory (DFT), arguably the pivotal theory in the turn to computational quantum chemistry around 1990.
3.
Amit Hagar 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2007,38(4):906-919
Among the alternatives to non-relativistic quantum mechanics (NRQM) are theories that yield predictions different from those of quantum mechanics in as-yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one's system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism responsible for the ‘apparent’ collapse in open quantum systems. Recent advances in condensed-matter physics may soon lead to experimental setups that allow one to test the two hypotheses, genuine collapse vs. decoherence, and hence to make progress toward a solution to the quantum measurement problem; yet those philosophers and physicists who advocate an information-theoretic approach to the foundations of quantum mechanics remain unwilling to acknowledge the empirical character of the issue at stake. Here I argue that in doing so they are displaying an unwarranted double standard.
4.
Fritz London's seminal idea of “quantum mechanisms of macroscopic scale”, first articulated in 1946, was the unanticipated result of two decades of research, during which London pursued quantum-mechanical explanations of various kinds of systems of particles at different scales. He started at the microphysical scale with the hydrogen molecule, generalized his approach to chemical bonds and intermolecular forces, then turned to macrophysical systems such as superconductors and superfluid helium. Along this path, he formulated a set of concepts (the quantum mechanism of exchange, the rigidity of the wave function, the role of quantum statistics in multi-particle systems, the possibility of order in momentum space) that eventually coalesced into a new conception of systems of identical particles. In particular, it was London's clarification of Bose–Einstein condensation that enabled him to formulate the notion of superfluids and led him to the recognition that quantum mechanics was not, as was commonly assumed, relevant exclusively as a micromechanics.
5.
Tung Ten Yong 《Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics》2010,41(4):318-321
This article presents a discussion of the notion of quantum ontological excess baggage, first proposed by Hardy. It is argued here that this idea does not have the significance suggested in his paper. It is shown that if this concept is properly analyzed, it fails to pose any threat to the ontic approach to quantum theory in general.
6.
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.
7.
I show explicitly how concerns about wave function collapse and ontology can be decoupled from the bulk of technical analysis necessary to recover localized, approximately Newtonian trajectories from quantum theory. In doing so, I demonstrate that the account of classical behavior provided by decoherence theory can be straightforwardly tailored to give accounts of classical behavior on multiple interpretations of quantum theory, including the Everett, de Broglie–Bohm and GRW interpretations. I further show that this interpretation-neutral, decoherence-based account conforms to a general view of inter-theoretic reduction in physics that I have elaborated elsewhere, which differs from the oversimplified picture that treats reduction as a matter of simply taking limits. This interpretation-neutral account rests on a general three-pronged strategy for reduction between quantum and classical theories that combines decoherence, an appropriate form of Ehrenfest's Theorem, and a decoherence-compatible mechanism for collapse. It also incorporates a novel argument as to why branch-relative trajectories should be approximately Newtonian, which is based on a little-discussed extension of Ehrenfest's Theorem to open systems, rather than on the more commonly cited but less germane closed-systems version. In the Conclusion, I briefly suggest how the strategy for quantum-classical reduction described here might be extended to reduction between other classical and quantum theories, including classical and quantum field theory and classical and quantum gravity.
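For orientation, the closed-system Ehrenfest theorem that the abstract contrasts with its open-systems extension can be stated in standard textbook form (this is general background, not a formula quoted from the paper):

$$\frac{d}{dt}\langle\hat{A}\rangle = \frac{1}{i\hbar}\bigl\langle[\hat{A},\hat{H}]\bigr\rangle + \Bigl\langle\frac{\partial\hat{A}}{\partial t}\Bigr\rangle,$$

which for $\hat{H} = \hat{p}^{2}/2m + V(\hat{x})$ gives

$$m\,\frac{d^{2}}{dt^{2}}\langle\hat{x}\rangle = -\bigl\langle V'(\hat{x})\bigr\rangle \approx -V'(\langle\hat{x}\rangle),$$

the last approximation holding for states sharply localized in position; this is the sense in which expectation values trace approximately Newtonian trajectories.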
8.
The Quantum Hall Effects offer a rich variety of theoretical and experimental advances. They provide interesting insights on such topics as gauge invariance, strong interactions in condensed-matter physics, and the emergence of new paradigms. This paper focuses on some related philosophical questions. Various brands of positivism or agnosticism are confronted with the physics of the Quantum Hall Effects. Hacking's views on Scientific Realism and Chalmers' on Non-Figurative Realism are discussed. It is argued that the difficulties with those versions of realism may be resolved within a dialectical materialist approach. The latter is argued to provide a rational approach to the phenomena, theory and ontology of the Quantum Hall Effects.
9.
Over the last three decades, string theory has emerged as one of the leading hopes for a consistent theory of quantum gravity that unifies particle physics with general relativity. Yet despite being a thriving research program for the better part of those decades, it has been subjected to extensive criticism from a number of prominent physicists. The aim of this paper is to obtain a clearer picture of where the conflict lies in competing assessments of string theory, through a close reading of the argumentative strategies employed by protagonists on both sides. Although it has become commonplace to construe this debate as stemming from different attitudes to the absence of testable predictions, we argue that this is an overly simplified view of the controversy, one that ignores the critical role of heuristic appraisal. While string theorists and their defenders see the theoretical achievements of the string theory program as strong indication that it is ‘on the right track’, critics have challenged such claims by calling into question the status of certain ‘solved problems’ and the program's purported ‘explanatory coherence’. The debates over string theory are therefore particularly instructive from a philosophical point of view, not only because they offer important insights into the nature of heuristic appraisal and theoretical progress, but also because they raise deep questions about what constitutes a solved problem and an explanation in fundamental physics.
10.
Operational frameworks are very useful for studying the foundations of quantum mechanics, and are sometimes used to promote antirealist attitudes towards the theory. The aim of this paper is to review three arguments that aim to defend an antirealist reading of quantum physics based on various developments of standard quantum mechanics appealing to notions such as quantum information, non-causal correlations and indefinite causal orders. These arguments are discussed in order to show that they are not convincing. Instead, it is argued that no argument can favour realist or antirealist attitudes towards quantum mechanics based solely on features of some formalism. In particular, both realist and antirealist views can be accommodated within operational formulations of the theory. The reason for this is that the realist/antirealist debate is located at a purely epistemic level, which is not engaged by the formal aspects of theories. As such, operational formulations of quantum mechanics are epistemologically and ontologically neutral. This discussion aims at clarifying the limits of the historical and methodological affinities between scientific antirealism and operational physics while engaging with recent developments in quantum foundations. It also presents various realist strategies to account for those developments.
11.
12.
Ulrich Meyer’s book The Nature of Time uses tense logic to argue for a ‘modal’ view of time, which replaces substantial times (as in Newton’s Absolute Time) with ‘ersatz times’ constructed using conceptually basic tense operators. He also argues against Bertrand Russell’s relationist theory, in which times are classes of events, and against the idea that relativity compels the integration of time and space (called by Meyer the Inseparability Argument). I find fault with each of these negative arguments, as well as with Meyer’s purported reconstruction of empty spacetime from tense operators and substantial spatial points. I suggest that Meyer’s positive project is best conceived as an elimination of time in the mode of Julian Barbour's The End of Time.
13.
In this paper we make an empirical investigation of the relationship between the consistency, coherence and validity of probability judgements in a real-world forecasting context. Our results indicate that these measures of the adequacy of an individual's probability assessments are not as closely related as we had anticipated. Twenty-nine of our thirty-six subjects were better calibrated in point probabilities than in odds, and our subjects were, in general, more coherent using point probabilities than odds forecasts. Contrary to our expectations, we found very little difference in forecasting response and performance between simple and compound holistic forecasts. This result is evidence against the ‘divide-and-conquer’ rationale underlying most applications of normative decision theory. In addition, our recompositions of marginal and conditional assessments into compound forecasts were no better calibrated or resolved than their holistic counterparts. These findings carry two implications for forecasting. First, untrained judgemental forecasters should use point probabilities in preference to odds. Second, judgemental forecasts of complex compound probabilities may be assessed as well holistically as by methods of decomposition and recomposition. Finally, our study provides a paradigm for further studies of the relationship between consistency, coherence and validity in judgemental probability forecasting.
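To make the odds/point-probability and decomposition/recomposition contrasts concrete, here is a minimal sketch with hypothetical numbers (illustrative only; none are data from the study, and the function names are ours):

```python
def to_odds(p):
    """Convert a point probability p to odds p : (1 - p)."""
    return p / (1.0 - p)

def recompose(p_a, p_b_given_a):
    """Recompose a compound forecast P(A and B) from a marginal
    assessment P(A) and a conditional assessment P(B | A),
    using the product rule."""
    return p_a * p_b_given_a

p_rain = 0.7                # hypothetical marginal forecast P(A)
p_wind_given_rain = 0.4     # hypothetical conditional forecast P(B | A)
holistic = 0.25             # hypothetical direct (holistic) compound forecast

recomposed = recompose(p_rain, p_wind_given_rain)      # 0.28

print(f"rain as odds: {to_odds(p_rain):.2f} : 1")      # 2.33 : 1
print(f"recomposed P(rain & wind): {recomposed:.2f}")  # 0.28
print(f"holistic   P(rain & wind): {holistic:.2f}")    # 0.25
```

The study's finding is that recomposed figures of this kind were, in general, no better calibrated or resolved than their holistic counterparts.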
14.
The development of evolutionary game theory (EGT) is closely linked with two interdisciplinary exchanges: the import of game theory into biology, and the import of biologists’ version of game theory into economics. This paper traces the history of these two import episodes. In each case the investigation covers what exactly was imported, what the motives for the import were, how the imported elements were put to use, and how they related to existing practices in the respective disciplines. Two conclusions emerged from this study. First, concepts derived from the unity of science discussion or the unification accounts of explanation are too strong and too narrow to be useful for analysing these interdisciplinary exchanges. Secondly, biology and economics—at least in relation to EGT—show significant differences in modelling practices: biologists seek to link EGT models to concrete empirical situations, whereas economists pursue conceptual exploration and possible explanation.
15.
There is growing evidence that explanatory considerations influence how people change their degrees of belief in light of new information. Recent studies indicate that this influence is systematic and may result from people’s following a probabilistic update rule. While formally very similar to Bayes’ rule, the rule or rules people appear to follow are different from, and inconsistent with, that better-known update rule. This raises the question of the normative status of those updating procedures. Is the role explanation plays in people’s updating their degrees of belief a bias? Or are people right to update on the basis of explanatory considerations, in that this offers benefits that could not be had otherwise? Various philosophers have argued that any reasoning at variance with Bayesian principles is to be rejected, and so explanatory reasoning, insofar as it deviates from Bayes’ rule, can only be fallacious. We challenge this claim by showing how the kind of explanation-based update rules to which people seem to adhere make it easier to strike the best balance between being fast learners and being accurate learners. Borrowing from the literature on ecological rationality, we argue that what counts as the best balance is intrinsically context-sensitive, and that a main advantage of explanatory update rules is that, unlike Bayes’ rule, they have an adjustable parameter which can be fine-tuned per context. The main methodology to be used is agent-based optimization, which also allows us to take an evolutionary perspective on explanatory reasoning.
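As a concrete illustration of the kind of adjustable, explanation-based rule the abstract describes, here is a minimal sketch in the spirit of Douven-style explanationist updating; the exact rule and parameter values investigated in the paper may differ, and `bonus` merely stands in for the abstract's context-sensitive parameter:

```python
def bayes_update(priors, likelihoods):
    """Standard Bayesian conditionalization over a finite hypothesis set."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def explanationist_update(priors, likelihoods, best_idx, bonus):
    """Like Bayes' rule, except the hypothesis judged to best explain
    the evidence (index best_idx) receives an additive bonus before
    normalization; bonus = 0.0 recovers Bayes' rule exactly."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    unnorm[best_idx] += bonus
    total = sum(unnorm)
    return [u / total for u in unnorm]

priors = [0.5, 0.5]
likelihoods = [0.6, 0.4]   # P(evidence | hypothesis_i)

print(bayes_update(priors, likelihoods))                   # [0.6, 0.4]
print(explanationist_update(priors, likelihoods, 0, 0.1))  # ~[0.667, 0.333]
```

Tuning `bonus` per context is what would let such a rule trade learning speed against accuracy, which is the balance the paper's agent-based optimization explores.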
16.
17.
18.
Alloxan diabetes caused a decrease in cyclic AMP phosphodiesterase activity in all affected rat tissues. Cyclic GMP phosphodiesterase activity, however, was decreased in adipose tissue and liver, but increased in heart and uterus.
19.
Robert Kowalenko 《Studies in History and Philosophy of Science》2011,42(3):445-452
Standard objections to the notion of a hedged, or ceteris paribus, law of nature usually boil down to the claim that such laws would be either (1) irredeemably vague, (2) untestable, (3) vacuous, (4) false, or a combination thereof. Using epidemiological studies in nutrition science as an example, I show that this is not true of the hedged law-like generalizations derived from data models used to interpret large and varied sets of empirical observations. Although it may be ‘in principle impossible’ to construct models that explicitly identify all potential causal interferers with the relevant generalization, the view that our failure to do so is fatal to the very notion of a cp-law is plausible only if one illicitly infers metaphysical impossibility from epistemic impossibility. I close with the suggestion that a model-theoretic approach to cp-laws poses a problem for recent attempts to formulate a Mill–Ramsey–Lewis theory of cp-laws.