Similar Documents
20 similar documents found.
1.
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—which we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” (Weyl, 1931a, p. xxi) in terms of a correspondence between the ontological categories of identity and determinateness.

2.
In this paper, we investigate the performance of a class of M-estimators for both symmetric and asymmetric conditional heteroscedastic models in the prediction of value-at-risk (VaR). The class of estimators includes the least absolute deviation (LAD), Huber's, Cauchy and B-estimators, as well as the well-known quasi-maximum likelihood estimator (QMLE). We use a wide range of summary statistics to compare both the in-sample and out-of-sample VaR estimates for three well-known stock indices. Our empirical study suggests that, in general, the Cauchy, Huber and B-estimators predict one-step-ahead VaR better than the commonly used QMLE. Copyright © 2011 John Wiley & Sons, Ltd.
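As a rough, hypothetical illustration of the M-estimation idea described above (not the paper's estimators, data, or tuning), the sketch below fits a GARCH(1,1) model by minimizing a Huber-type objective and turns the fitted one-step-ahead variance into a VaR figure. It assumes numpy and scipy; the Huber constant 1.345, the Nelder-Mead search, and the normal 95% quantile are all illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

def huber(u, k=1.345):
    """Huber rho function: quadratic near zero, linear in the tails."""
    a = np.abs(u)
    return np.where(a <= k, 0.5 * u**2, k * a - 0.5 * k**2)

def garch_variance(params, r):
    """Conditional variance recursion of a GARCH(1,1) model."""
    omega, alpha, beta = params
    h = np.empty_like(r)
    h[0] = np.var(r)
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1]**2 + beta * h[t - 1]
    return h

def m_objective(params, r, rho):
    """M-estimation criterion; rho = 0.5*u**2 would recover the QMLE."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf  # enforce positivity and covariance stationarity
    h = garch_variance(params, r)
    return np.sum(rho(r / np.sqrt(h)) + 0.5 * np.log(h))

# Toy data: simulate a GARCH(1,1) return series.
rng = np.random.default_rng(0)
omega0, alpha0, beta0, n = 0.05, 0.10, 0.85, 2000
r, h = np.empty(n), omega0 / (1 - alpha0 - beta0)
for t in range(n):
    r[t] = np.sqrt(h) * rng.standard_normal()
    h = omega0 + alpha0 * r[t]**2 + beta0 * h

fit = minimize(m_objective, x0=np.array([0.1, 0.05, 0.8]),
               args=(r, huber), method="Nelder-Mead")
omega, alpha, beta = fit.x
h_next = omega + alpha * r[-1]**2 + beta * garch_variance(fit.x, r)[-1]
var_95 = 1.645 * np.sqrt(h_next)  # one-step-ahead 95% VaR, as a positive loss
print("estimates:", fit.x, "VaR:", var_95)
```

Swapping `huber` for another rho function (e.g., absolute value for LAD) gives other members of the class the abstract compares.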

3.
We propose a new class of limited-information estimators built upon an explicit trade-off between data fitting and a priori model specification. The estimators offer the researcher a continuum that ranges from an extreme emphasis on data fitting and robust reduced-form estimation to the other extreme of exact model specification and efficient estimation. The approach used to generate the estimators illustrates why ULS often outperforms 2SLS-PRRF even in the context of a correctly specified model, provides a new interpretation of 2SLS, and integrates Wonnacott and Wonnacott's (1970) least weighted variance estimators with other techniques. We apply the new class of estimators to Klein's Model I and generate forecasts. We find for this example that an emphasis on specification (as opposed to data fitting) produces better out-of-sample predictions. Copyright © 1999 John Wiley & Sons, Ltd.
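For readers unfamiliar with the specification-emphasis endpoint of this continuum, here is a minimal sketch of textbook 2SLS in Python; it is a toy under stated assumptions, not the paper's estimator class, and the helper `two_stage_least_squares` and the simulated data are invented for illustration.

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Textbook 2SLS: y outcome, X (possibly endogenous) regressors, Z instruments."""
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]  # stage 1: project X onto Z
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]   # stage 2: regress y on fitted X

rng = np.random.default_rng(1)
n = 500
z = rng.standard_normal((n, 2))                                    # instruments
u = rng.standard_normal(n)                                         # structural error
x = z @ np.array([1.0, -0.5]) + 0.8 * u + rng.standard_normal(n)   # endogenous regressor
y = 2.0 * x + u

beta = two_stage_least_squares(y, x[:, None], z)
print(beta)  # close to the true 2.0, whereas plain OLS would be biased upward
```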

4.
Quine's “naturalized epistemology” presents a challenge to Carnapian explication: why try to rationally reconstruct probabilistic concepts instead of just doing psychology? This paper tracks the historical development of Richard C. Jeffrey, who, on the one hand, voiced worries similar to Quine's about Carnapian explication but, on the other hand, claims that his own work in formal epistemology—what he calls “radical probabilism”—is somehow continuous with both Carnap's method of explication and logical empiricism. By examining how Jeffrey's claim could be accurate, the paper suggests that Jeffrey's radical probabilism can be seen as a sort of alternative explication project to Carnap's own inductive logic. In so doing, it deflates Quine's worries about Carnapian explication and, by extension, similar worries about formal epistemology.

5.
Google Trends data are increasingly employed in statistical investigations. However, care should be taken in handling this tool, especially when it is applied for quantitative prediction purposes. Being by design dependent on Internet users, estimators based on Google Trends data embody many sources of uncertainty and instability. These relate, for example, to technical factors (e.g., cross-regional disparities in computer literacy, the time dependency of Internet users), psychological factors (e.g., emotionally driven spikes and other forms of data perturbation), and linguistic factors (e.g., noise generated by double-meaning words). Despite the stimulating literature available today on how to use Google Trends data as a forecasting tool, surprisingly, to the best of the author's knowledge, no articles specifically devoted to the prediction of these data have been published to date. In this paper, a novel forecasting method is presented, based on a wavelet-type denoiser employed in conjunction with a forecasting model of the SARIMA (seasonal autoregressive integrated moving average) class. The wavelet filter is iteratively calibrated by a bounded search algorithm until a minimum of a suitable loss function is reached. Finally, empirical evidence is presented to support the validity of the proposed method.
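A minimal sketch of this kind of pipeline, under stated assumptions: it uses the PyWavelets soft-thresholding denoiser and a statsmodels SARIMAX model, with a simple holdout grid search standing in for the paper's bounded search and unspecified loss function. The weekly seasonal period 52 and the model orders are illustrative guesses, not the paper's settings.

```python
import numpy as np
import pywt
from statsmodels.tsa.statespace.sarimax import SARIMAX

def wavelet_denoise(y, wavelet="db4", level=2, thresh=None):
    """Soft-threshold the detail coefficients of a discrete wavelet transform."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    if thresh is None:
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise scale estimate
        thresh = sigma * np.sqrt(2.0 * np.log(len(y)))  # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(y)]

def denoise_then_forecast(y, steps, thresh=None):
    """Denoise the series, then fit a SARIMA model and forecast ahead."""
    smooth = wavelet_denoise(y, thresh=thresh)
    model = SARIMAX(smooth, order=(1, 1, 1), seasonal_order=(1, 0, 1, 52))
    return model.fit(disp=False).forecast(steps)

def calibrate_threshold(y, grid, holdout=8):
    """Stand-in for the paper's bounded search: pick the threshold with the
    smallest holdout mean squared error."""
    train, test = y[:-holdout], y[-holdout:]
    mse = lambda t: np.mean((denoise_then_forecast(train, holdout, t) - test) ** 2)
    return min(grid, key=mse)
```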

6.
This re-examination of the earliest version of Maxwell's most important argument for the electromagnetic theory of light—the equality between the speed of wave propagation in the electromagnetic ether and the ratio of electrostatic to electromagnetic measures of electrical quantity—establishes unforeseen connections between Maxwell's theoretical electrical metrology and his mechanical theory of the electromagnetic field. Electrical metrology was not neutral with respect to field-theoretic versus action-at-a-distance conceptions of electromagnetic interaction. Mutual accommodation between these conceptions was reached by Maxwell on the British Association for the Advancement of Science (BAAS) Committee on Electrical Standards by exploiting the measurement of the medium parameters—electric inductive capacity and magnetic permeability—on an arbitrary scale. While he always worked within this constraint in developing the ‘ratio-of-units’ argument mathematically, I maintain that Maxwell came to conceive of the ratio ‘as a velocity’ by treating the medium parameters as physical quantities that could be measured absolutely, which was only possible via the correspondences between electrical and mechanical quantities established in the mechanical theory. I thereby correct two closely related misconceptions of the ratio-of-units argument—the counterintuitive but widespread notion that the ratio is naturally a speed, and the supposition that Maxwell either inferred or proved this from its dimensional formula.

7.
In this paper I take a sceptical view of the standard cosmological model and its variants, mainly on the following grounds: (i) The method of mathematical modelling that characterises modern natural philosophy—as opposed to Aristotle's—goes well with the analytic, piecemeal approach to physical phenomena adopted by Galileo, Newton and their followers, but it is hardly suited for application to the whole world. (ii) Einstein's first cosmological model (1917) was not prompted by the intimations of experience but by a desire to satisfy Mach's Principle. (iii) The standard cosmological model—a Friedmann–Lemaître–Robertson–Walker spacetime expanding with or without end from an initial singularity—is supported by the phenomena of redshifted light from distant sources and very nearly isotropic thermal background radiation provided that two mutually inconsistent physical theories are jointly brought to bear on these phenomena, viz. the quantum theory of elementary particles and Einstein's theory of gravity. (iv) While the former is certainly corroborated by high-energy experiments conducted under conditions allegedly similar to those prevailing in the early world, precise tests of the latter involve applications of the Schwarzschild solution or the PPN formalism, for which there is no room in a Friedmann–Lemaître–Robertson–Walker spacetime.

8.
We propose a new portfolio optimization method combining the merits of shrinkage estimation, a vine copula structure, and the Black–Litterman model. It allows investors to satisfy simultaneously three investment objectives: estimation sensitivity, appreciation of asymmetric risks, and portfolio stability. A typical investor with such objectives is a sovereign wealth fund (SWF). We use China's SWF as an example to empirically test the method on a 15-asset strategic asset allocation problem. Robustness tests using subsamples not only show the method's overall effectiveness but also demonstrate that each component functions as expected.
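The following toy illustrates only the shrinkage-estimation component of the combination (the vine copula and Black–Litterman stages are omitted): it computes minimum-variance weights from a Ledoit–Wolf shrunk covariance matrix on simulated 15-asset returns, assuming scikit-learn is available.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(3)
returns = rng.standard_normal((500, 15)) * 0.01  # 15 assets, toy daily returns

cov = LedoitWolf().fit(returns).covariance_      # shrunk covariance estimate
ones = np.ones(cov.shape[0])
w = np.linalg.solve(cov, ones)                   # unnormalized min-variance weights
w /= w.sum()                                     # fully invested portfolio
print(np.round(w, 3))
```

Shrinkage stabilizes the inverse-covariance step, which is exactly where sample-covariance portfolios tend to become erratic; this speaks to the "portfolio stability" objective named in the abstract.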

9.
In this paper, I compare Pierre-Simon Laplace's celebrated formulation of the principle of determinism in his 1814 Essai philosophique sur les probabilités with the formulation of the same principle offered by Roger Joseph Boscovich in his Theoria philosophiae naturalis, published 56 years earlier. This comparison discloses a striking general similarity between the two formulations of determinism as well as certain important differences. Regarding their similarities, both Boscovich's and Laplace's conceptions of determinism involve two mutually interdependent components—ontological and epistemic—and they are both intimately linked with the principles of causality and continuity. Regarding their differences, however, Boscovich's formulation of the principle of determinism turns out not only to be temporally prior to Laplace's but also—being founded on fewer metaphysical principles and more rooted in and elaborated by physical assumptions—to be more precise, complete and comprehensive than Laplace's somewhat parenthetical statement of the doctrine. A detailed analysis of these similarities and differences, so far missing in the literature on the history and philosophy of the concept of determinism, is the main goal of the present paper.

10.
By now, the story of T. D. Lysenko's phantasmagoric career in the Soviet life sciences is widely familiar. While Lysenko's attempts to identify I. V. Michurin, the horticulturist, as the source of his own inductionist ideas about heredity are recognized as a gambit calculated to enhance his legitimacy, the real roots of those ideas are still shrouded in mystery. This paper suggests those roots may be found in a tradition in Russian biology that stretches back to the 1840s—a tradition inspired by the doctrines of Jean-Baptiste Lamarck and Étienne and Isidore Geoffroy Saint-Hilaire. The enthusiastic reception of those doctrines in Russia and of their practical application—acclimatization of exotic life forms—gave rise to the durable scientific preoccupation with transforming nature which now seems implicated in creating the context for Lysenko's successful bid to become an arbiter of the biological sciences.

11.
A univariate structural time series model based on the traditional decomposition into trend, seasonal and irregular components is defined. A number of methods of computing maximum likelihood estimators are then considered. These include direct maximization of various time-domain likelihood functions. The asymptotic properties of the estimators are given and a comparison between the various methods in terms of computational efficiency and accuracy is made. The methods are then extended to models with explanatory variables.
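A minimal sketch of such a model, assuming statsmodels: the trend/seasonal/irregular decomposition corresponds to an unobserved-components specification estimated by direct maximization of the time-domain (Kalman filter) likelihood. The local linear trend and quarterly seasonal below are illustrative choices, not the paper's specification.

```python
import numpy as np
import statsmodels.api as sm

# Toy series: stochastic trend + fixed quarterly pattern + irregular noise.
rng = np.random.default_rng(1)
n = 120
trend = np.cumsum(0.05 + 0.1 * rng.standard_normal(n))
seasonal = np.tile([2.0, -1.0, -3.0, 2.0], n // 4)
y = trend + seasonal + rng.standard_normal(n)

# Structural model: local linear trend + stochastic seasonal + irregular,
# with parameters estimated by direct time-domain maximum likelihood.
model = sm.tsa.UnobservedComponents(y, level="local linear trend",
                                    seasonal=4, stochastic_seasonal=True)
res = model.fit(disp=False)
print(res.summary())
```

Explanatory variables, as in the paper's extension, could be passed through the model's `exog` argument.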

12.
In this paper I examine the notion and role of metaphors and illustrations in Maxwell's works in exact science as a pathway into a broader and richer philosophical conception of a scientist and scientific practice. While some of these notions and methods are still at work in current scientific research—from economics and biology to quantum computation and quantum field theory—here I have chosen to attest to their entrenchment and complexity in actual science by attempting to make some conceptual sense of Maxwell's own usage; this endeavour includes situating Maxwell's conceptions and applications in his own culture of Victorian science and philosophy. I trace Maxwell's notions to the formulation of the problem of understanding, or interpreting, abstract representations such as potential functions and Lagrangian equations. I articulate the solution in terms of abstract-concrete relations, where the concrete, in tune with Victorian British psychology and engineering, includes the muscular as well as the pictorial. This sets the basis for a conception of understanding in terms of unification and concrete modelling, or representation. I examine the relation of illustration to analogies and metaphors on which this account rests. Lastly, I stress and explain the importance of context-dependence, its consequences for realism-instrumentalism debates, and Maxwell's own emphasis on method.

13.
This paper proposes a new approach to forecasting intermittent demand that considers the effects of external factors. We classify intermittent demand data into two parts—zero values and nonzero values—and fit the nonzero values with a mixed zero-truncated Poisson model. All the parameters in this model are obtained by an EM algorithm, which treats the external factors as independent variables in a logistic regression model and a log-linear regression model. We then calculate the probability that a zero value occurs in each period and predict demand occurrence by comparing this probability with a critical value. When demand occurs, we use the weighted average of the mixed zero-truncated Poisson components as the predicted nonzero demand, which is combined with the predicted demand occurrences to form the final forecast series. Two performance measures are developed to assess the forecasting methods. In a case study of electric power material from the State Grid Shanghai Electric Power Company in China, we show that our approach forecasts more accurately than the Poisson model, the hurdle shifted Poisson model, the hurdle Poisson model, and Croston's method.
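A simplified, hypothetical sketch of the two-part structure described above, assuming numpy, scipy, and scikit-learn: occurrence is modelled by logistic regression on external factors, and nonzero sizes by a zero-truncated Poisson with a log-linear mean. The paper's mixture components and full EM algorithm are collapsed here into a single zero-truncated Poisson component fitted directly by maximum likelihood, so this is a sketch of the idea rather than the proposed method.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln
from sklearn.linear_model import LogisticRegression

def ztp_negloglik(beta, X, y):
    """Negative log-likelihood of a zero-truncated Poisson with log-linear mean."""
    mu = np.exp(X @ beta)
    ll = y * np.log(mu) - mu - gammaln(y + 1) - np.log1p(-np.exp(-mu))
    return -ll.sum()

def fit_two_part(X, demand):
    """Logistic model for demand occurrence + ZTP model for nonzero size."""
    occurred = demand > 0
    occ_model = LogisticRegression().fit(X, occurred)
    Xn = np.column_stack([np.ones(occurred.sum()), X[occurred]])  # add intercept
    size = minimize(ztp_negloglik, np.zeros(Xn.shape[1]),
                    args=(Xn, demand[occurred]), method="BFGS")
    return occ_model, size.x

def forecast(occ_model, beta, x_new, cutoff=0.5):
    """Predict occurrence by comparing P(demand) with a critical value,
    then attach the ZTP mean mu / (1 - exp(-mu)) as the size forecast."""
    p = occ_model.predict_proba(x_new)[:, 1]
    mu = np.exp(np.column_stack([np.ones(len(x_new)), x_new]) @ beta)
    size = mu / (1 - np.exp(-mu))
    return np.where(p > cutoff, size, 0.0)

# Toy usage with two external factors driving occurrence and size.
rng = np.random.default_rng(4)
X = rng.standard_normal((400, 2))
occurs = rng.random(400) < 1 / (1 + np.exp(-X[:, 0]))
size = rng.poisson(np.exp(0.5 + 0.7 * X[:, 1]))
size[size == 0] = 1  # crude zero-truncation for the toy data
demand = np.where(occurs, size, 0)

occ_model, beta = fit_two_part(X, demand)
print(forecast(occ_model, beta, X[:5]))
```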

14.
Successful prototype marine chronometers, developed by Harrison and others in the eighteenth century, stimulated a sector of the British watchmaking industry to meet Admiralty and commercial demand for this instrument. Chronometers, like other British-made timepieces, were constructed by an elaborate pre-industrial method of production. The instrument's static technology and extreme durability meant that replacement demand was minimal, and new demand was low relative to existing stock and the industry's capacity. The First World War created a final surge of demand that left supplies far in excess of peacetime needs, and a new technology—radio transmissions of time signals—offered an alternative method of determining Greenwich time, and thus longitude, at sea.

15.
Forecasting methods are often evaluated by means of simulation studies. For intermittent demand items there are often very few non-zero observations, so it is hard to check any assumptions: the statistical information is often too weak to determine, for example, the distribution of a variable. It therefore seems important to verify forecasting methods on real data. The main aim of this article is an empirical verification of several forecasting methods applicable to intermittent demand. Some items are sold only in specific subperiods (in a given month of each year, for example), but most forecasting methods (such as Croston's method) give non-zero forecasts for all periods. Summer work clothes, for instance, should have non-zero forecasts only for the summer months, yet many methods will provide non-zero forecasts for all months under consideration. This motivated the proposal and testing of a new forecasting technique applicable to seasonal items. Six methods were applied to construct separate forecasting systems: Croston's, SBA (Syntetos-Boylan approximation), TSB (Teunter, Syntetos, Babai), MA (moving average), SES (simple exponential smoothing) and SESAP (simple exponential smoothing for analogous subperiods). The last method (SESAP) is the author's own proposal, intended for companies facing the problem of seasonal items. Analogous subperiods are the same subperiods in each year, for example, the same months in each year. A data set from a real company was used to apply all the above forecasting procedures; it contained monthly time series for about nine thousand products. Forecast accuracy was tested by means of both parametric and non-parametric measures. The scaled mean error and the scaled root mean squared error were used to check bias and efficiency, and the mean absolute scaled error and the shares of best forecasts were also estimated. The general conclusion is that in the analyzed company a forecasting system should be based on two methods: TSB and SESAP, with the latter applied only to seasonal items (products sold only in specific subperiods). It also turned out that Croston's and SBA methods work worse than much simpler methods such as SES or MA. The presented analysis may be helpful for enterprises facing the problem of forecasting intermittent (and seasonal intermittent) items.
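Minimal sketches of three of the six methods, assuming numpy. Croston's method and SBA are standard; the `sesap` function encodes only a reading of the abstract's description (SES run separately over analogous subperiods), since the full specification is in the paper.

```python
import numpy as np

def croston(y, alpha=0.1):
    """Croston's method: separate exponential smoothing of nonzero demand
    sizes and of the intervals between them; yields a flat per-period forecast."""
    z = p = None  # smoothed demand size, smoothed demand interval
    q = 1         # periods since last nonzero demand
    for d in y:
        if d > 0:
            z = d if z is None else z + alpha * (d - z)
            p = q if p is None else p + alpha * (q - p)
            q = 1
        else:
            q += 1
    return 0.0 if z is None else z / p

def sba(y, alpha=0.1):
    """Syntetos-Boylan approximation: bias-corrected Croston forecast."""
    return (1 - alpha / 2) * croston(y, alpha)

def sesap(y, period=12, alpha=0.3):
    """Reading of SESAP from the abstract: simple exponential smoothing run
    separately over analogous subperiods (e.g., the same month in each year),
    so items sold only in some months keep zero forecasts elsewhere."""
    y = np.asarray(y, dtype=float)
    f = np.zeros(period)
    for m in range(period):
        sub = y[m::period]
        s = sub[0]
        for v in sub[1:]:
            s += alpha * (v - s)
        f[m] = s
    return f  # one forecast per subperiod of the following year

demand = [0, 0, 3, 0, 0, 0, 5, 0, 4, 0, 0, 6]
print(croston(demand), sba(demand))
```

Note how `croston` and `sba` return a single non-zero rate for every future period, which is exactly the behaviour the abstract criticizes for seasonal items, while `sesap` can return zeros for out-of-season subperiods.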

16.
This paper presents a methodology for modelling and forecasting multivariate time series with linear restrictions using a constrained structural state-space framework. The model has natural applications to forecasting time series of macroeconomic/financial identities and accounts. The explicit modelling of the constraints ensures that the model parameters dynamically satisfy the restrictions among items of the series, leading to more accurate and internally consistent forecasts. It is shown that the constrained model offers superior forecasting efficiency. A testable identification condition for state space models is also obtained and applied to establish the identifiability of the constrained model. The proposed methods are illustrated on Germany's quarterly monetary accounts data. Results show a significant improvement in the predictive efficiency of forecast estimators for the monetary account, with an overall efficiency gain of 25% over unconstrained modelling. Copyright © 2002 John Wiley & Sons, Ltd.
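As a simplified stand-in for the idea of restrictions holding among items (not the paper's constrained state-space estimator), the snippet below projects unconstrained forecasts onto a linear restriction Rf = c, so that an accounting identity holds exactly among the forecast items.

```python
import numpy as np

def constrain(f, R, c):
    """Least-squares projection of forecasts f onto {f* : R f* = c}:
    f* = f - R'(RR')^{-1}(Rf - c)."""
    RT = R.T
    return f - RT @ np.linalg.solve(R @ RT, R @ f - c)

f = np.array([4.8, 3.1, 8.2])        # forecasts: item 1, item 2, their total
R = np.array([[1.0, 1.0, -1.0]])     # identity: item1 + item2 - total = 0
print(constrain(f, R, np.zeros(1)))  # [4.9, 3.2, 8.1] -- identity now exact
```

The paper builds the constraints into the state-space model itself, so the parameters satisfy them dynamically; this post-hoc projection only illustrates why constrained forecasts are internally consistent.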

17.
This paper compares the axiomatic method of David Hilbert and his school with Rudolf Carnap's general axiomatics, which was developed in the late 1920s and influenced his understanding of the logic of science throughout the 1930s, when his logical pluralism developed. The distinct perspectives become most clearly visible in how Richard Baldus, along the lines of Hilbert, and Carnap and Friedrich Bachmann analyzed the axiom system of Hilbert's Foundations of Geometry—the paradigmatic example of the axiomatization of science. Whereas Hilbert's axiomatic method started from a local analysis of individual axiom systems, in which the foundations of mathematics as a whole entered only when establishing a system's consistency, Carnap and his Vienna Circle colleague Hans Hahn instead advocated a global analysis of axiom systems in general. A primary goal was to evade, or formalize ex post, mathematicians' ‘material’ talk about axiom systems, for such talk was held to be error-prone and susceptible to metaphysics.

18.
This paper concerns long-term forecasts for cointegrated processes. First, it considers the case where the parameters of the model are known. The paper shows analytically that neither the cointegration nor the integration constraint matters in long-term forecasts, an alternative implication of long-term forecasts for cointegrated processes that extends the results of previous influential studies. An appropriate Monte Carlo experiment supports this analytical result. Secondly, and more importantly, the paper considers the case where the parameters of the model are estimated. It shows that the accuracy of the estimation of the drift term is crucial in long-term forecasts: the relative accuracy of various long-term forecasts depends upon the relative magnitude of the variances of the estimators of the drift term. It further shows experimentally that in finite samples the univariate ARIMA forecast, whose drift term is estimated by the simple time average of the differenced data, is better than the cointegrated system forecast, whose parameters are estimated by Johansen's well-known ML method. Based upon finite-sample experiments, the paper recommends the univariate ARIMA forecast rather than the conventional cointegrated system forecast in finite samples, for its practical usefulness and robustness against model misspecification. Copyright © 2011 John Wiley & Sons, Ltd.
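The recommended univariate forecast is simple enough to state in a few lines. The sketch below (assuming numpy; the toy data are invented) implements a random-walk-with-drift forecast whose drift is the simple time average of the differenced data, as described in the abstract.

```python
import numpy as np

def arima_drift_forecast(y, horizon):
    """Long-term forecast from the last observation plus an estimated drift,
    where the drift is the time average of the first differences of y."""
    drift = np.mean(np.diff(y))
    return y[-1] + drift * np.arange(1, horizon + 1)

# Toy I(1) series with drift 0.2.
y = np.cumsum(0.2 + np.random.default_rng(2).standard_normal(200))
print(arima_drift_forecast(y, 5))
```

The paper's point is that at long horizons the quality of this drift estimate, not the cointegration structure, dominates forecast accuracy.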

19.
There is a substantial literature on Feyerabend's relativism—including a few papers in this collection—but far fewer specific studies of the ways that his writings and ideas have been taken up by the non-academic public. This is odd, given his obvious interest in the lives and concerns of persons who were not ‘intellectuals’—a term that, for him, had a pejorative ring. It is also odd given the abundance of evidence that Feyerabend's relativism played a role in a specific national and cultural context, namely contemporary Italian debates about relativism. This paper offers a study of how Feyerabend's ideas have been deployed by Italian intellectuals and cultural commentators—including the current Pope—and critically assesses these deployments.

20.
I reappraise in detail Hertz's cathode ray experiments. I show that, contrary to Buchwald's (1995) evaluation, the core experiment establishing the electrostatic properties of the rays was successfully replicated by Perrin (probably) and Thomson (certainly). Buchwald's discussion of ‘current purification’ is shown to be a red herring. My investigation of the origin of Buchwald's misinterpretation of this episode reveals that he was led astray by a focus on what Hertz ‘could do’—his experimental resources. I argue that one should focus instead on what Hertz wanted to achieve—his experimental goals. Focusing on these goals, I find that his explicit and implicit requirements for a successful investigation of the rays’ properties are met by Perrin and Thomson. Thus, even by Hertz's standards, they did indeed replicate his experiment.
