Similar Articles
 20 similar articles found (search time: 15 ms)
1.
Summary: Based on the graphical calculation of the confidence interval for the ratio of two standard deviations, and of the standard error of the sum or difference of two independent random variables, it is shown that normal probability paper can be used effectively for statistical calculations other than the traditional estimation of the mean and standard deviation or the checking of normality.
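The second quantity mentioned has a simple closed form: for independent variables, variances add, so the standard error of either the sum or the difference is the root sum of squares. A minimal stdlib sketch (the function name is ours, not the paper's):

```python
import math

def se_sum_or_diff(se1: float, se2: float) -> float:
    """Standard error of X + Y or X - Y for independent X and Y.

    Variances of independent variables add, so the standard error of
    both the sum and the difference is sqrt(se1**2 + se2**2).
    """
    return math.sqrt(se1 ** 2 + se2 ** 2)

# Example: standard errors 3 and 4 combine to 5, for X + Y and X - Y alike.
combined = se_sum_or_diff(3.0, 4.0)  # → 5.0
```

The same combination rule is what makes the graphical construction on probability paper possible: the combined spread depends only on the two component spreads.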

2.
The study of blow-off limits is of significant academic and engineering value for the design of flame-stabilizing cavities and fuel-injection schemes in scramjet combustors. The flame-stabilization process with fuel injected upstream of a cavity was investigated. The lean and rich blow-off mechanisms of cavity-stabilized combustion in supersonic flow were analyzed theoretically. Starting from the shear-layer stabilization mode, and further accounting for the three-dimensional structure of the flow field in combination with a transverse-jet penetration and mixing model, the existing model was improved in the calculation of the effective equivalence ratio, the cavity entrainment process, and the mass exchange between the fuel jet and the cavity shear layer/recirculation zone. The Damköhler number and the effective equivalence ratio closely related to the blow-off process were redefined, and a mathematical model describing the rich and lean blow-off limits was established with the relation between the two as its criterion. The model was then validated against experimental data.

3.
Transfer Characteristics of Differential Foundation Settlement through High-Speed Railway Embankments and Its Control Limits
To study how differential foundation settlement propagates and diffuses through high-speed railway embankments, the finite element method was used to analyze the mapping relationship between differential foundation settlement and uneven deformation of the subgrade surface under small-deformation conditions. After calibration against geotechnical centrifuge model tests, the effects of the differential-settlement pattern, embankment height, and other factors on uneven subgrade-surface deformation were examined. The results show that the degree to which differential foundation settlement diffuses through the embankment decreases as the ratio of the settlement transition length to the embankment height increases; once this ratio exceeds 3-5, the differential foundation settlement and the uneven subgrade-surface deformation agree closely. The uneven subgrade-surface deformation is positively correlated with the differential foundation settlement and decreases with increasing embankment height and settlement transition length. A control standard for differential foundation settlement based on the limit on uneven subgrade-surface deformation is proposed, enriching the system of technical indices for controlling subgrade settlement and deformation on high-speed railways.

4.
Based on a nucleotide classification of RNA sequences, a new 2D graphical representation of RNA secondary structure is proposed. The representation maps each RNA secondary structure uniquely and is non-degenerate, i.e., free of overlaps and crossings. It was applied to the secondary structures of nine viral RNAs; numerical descriptors were computed for each sequence and pairwise similarities between sequences were evaluated, and the experiments show that the representation is feasible.

5.
The construction of forecasts using interactive data analysis systems is greatly aided by the availability of graphical procedures. Data exploration, model identification and estimation, and interpretation of final forecasts are made considerably easier by the visual relay of information. This article discusses some recent developments in time series graphics designed to assist in the forecasting process. A discussion of requirements for effective use of graphics in interactive forecasting is included, illustrated through an application of the Box-Jenkins methodology. Illustrations are included from the STATGRAPHICS system, a prototype implementation in APL.

6.
Forecast regions are a common way to summarize forecast accuracy. They usually consist of an interval symmetric about the forecast mean. However, symmetric intervals may not be appropriate forecast regions when the forecast density is not symmetric and unimodal. With many modern time series models, such as those which are non-linear or have non-normal errors, the forecast densities are often asymmetric or multimodal. The problem of obtaining forecast regions in such cases is considered and it is proposed that highest-density forecast regions be used.  A graphical method for presenting the results is discussed.
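The construction the abstract describes can be sketched for a density discretised on a grid: keep the densest cells until the desired coverage is reached. A minimal stdlib illustration (the grid, the toy bimodal density, and the 90% level are our choices for the example, not the paper's):

```python
import math

def highest_density_region(xs, density, coverage=0.9):
    """Highest-density region of a density given on an equally spaced grid.

    Sort grid cells by density and keep the densest cells until their
    probability content reaches `coverage`. For a multimodal density the
    result can be a union of disjoint intervals, which is exactly the
    case where a symmetric interval is misleading.
    """
    dx = xs[1] - xs[0]
    order = sorted(range(len(xs)), key=lambda i: density[i], reverse=True)
    kept, prob = [], 0.0
    for i in order:
        if prob >= coverage:
            break
        kept.append(i)
        prob += density[i] * dx
    return sorted(xs[i] for i in kept)

# Toy bimodal forecast density: equal-weight normal modes at -3 and +3.
xs = [i * 0.1 for i in range(-60, 61)]
dens = [0.5 * math.exp(-(x + 3) ** 2 / 0.5) / math.sqrt(0.5 * math.pi)
        + 0.5 * math.exp(-(x - 3) ** 2 / 0.5) / math.sqrt(0.5 * math.pi)
        for x in xs]
region = highest_density_region(xs, dens, 0.9)
# The mean (x = 0) lies in the low-density valley and is excluded.
```

A symmetric 90% interval centred on the mean would instead cover the valley between the modes, where the forecast density is essentially zero.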

7.
Three-dimensional visualization is an important research direction in geophysics. With 3D visualization techniques, raw seismic data can be displayed graphically from multiple perspectives, providing a solid basis for subsequent data interpretation and analysis. Using the QT and OpenGL graphics development environment, this paper designs and implements 3D display of velocity models and seismic data, giving a more intuitive view of the true distribution of the subsurface strata that the velocity model and seismic data reflect.

8.
The field of glycobiology is concerned with the study of the structure, properties, and biological functions of the family of biomolecules called carbohydrates. Bioinformatics for glycobiology is a particularly challenging field, because carbohydrates exhibit a high structural diversity and their chains are often branched. Significant improvements in experimental analytical methods over recent years have led to a tremendous increase in the amount of carbohydrate structure data generated. Consequently, the availability of databases and tools to store, retrieve and analyze these data in an efficient way is of fundamental importance to progress in glycobiology. In this review, the various graphical representations and sequence formats of carbohydrates are introduced, and an overview of newly developed databases, the latest developments in sequence alignment and data mining, and tools to support experimental glycan analysis are presented. Finally, the field of structural glycoinformatics and molecular modeling of carbohydrates, glycoproteins, and protein–carbohydrate interaction are reviewed.

9.
In this paper an investigation is made of the properties and use of two aggregate measures of forecast bias and accuracy. These are metrics used in business to calculate aggregate forecasting performance for a family (group) of products. We find that the aggregate measures are not particularly informative if some of the one‐step‐ahead forecasts are biased. This is likely to be the case in practice if frequently employed forecasting methods are used to generate a large number of individual forecasts. In the paper, examples are constructed to illustrate some potential problems in the use of the metrics. We propose a simple graphical display of forecast bias and accuracy to supplement the information yielded by the accuracy measures. This support includes relevant boxplots of measures of individual forecasting success. This tool is simple but helpful as the graphic display has the potential to indicate forecast deterioration that can be masked by one or both of the aggregate metrics. The procedures are illustrated with data representing sales of food items. Copyright © 2005 John Wiley & Sons, Ltd.
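The masking effect described here is easy to reproduce: items with large but offsetting biases can pool to a near-zero aggregate bias. A small hypothetical illustration (the error values are invented for the example, not taken from the paper's food-sales data):

```python
def aggregate_bias(errors):
    """Mean one-step-ahead forecast error pooled over a product family."""
    return sum(errors) / len(errors)

# Two items with large, opposite individual biases...
item_a = [10.0, 12.0, 11.0]      # consistently over-forecast
item_b = [-11.0, -10.0, -12.0]   # consistently under-forecast

pooled = aggregate_bias(item_a + item_b)  # → 0.0
# ...pool to zero aggregate bias, hiding problems that per-item
# boxplots of the individual errors would reveal immediately.
```

This is why the paper supplements the aggregate metrics with boxplots of individual forecasting performance rather than relying on the pooled numbers alone.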

10.
This paper reports the results of studies concerning the accuracy and efficiency of time-series extrapolation decisions made with the assistance of an interactive graphical tool called GRAFFECT. The tool facilitates the decomposition of the extrapolation task by permitting the serial decomposition of the cue data as the task proceeds. GRAFFECT uses an interactive graphical interface controlled substantially with the use of a mouse. The extrapolation task is divided into the following: (1) trend modelling and extrapolation, (2) seasonal pattern modelling, and (3) extrapolation from the noise residual series. As each component is modelled its effect is stored and the information is washed out of the cue series. The ultimate forecast is produced by automatic recomposition of the judgementally determined components. The results show a significant improvement in forecast accuracy over unaided judgment, resulting in a subjective extrapolation that betters deseasonalized single exponential smoothing.

11.
Spherical geometry studies the sphere not simply as a solid object in itself, but chiefly as the spatial context of the elements which interact on it in a complex three-dimensional arrangement. This compels us to establish graphical conventions appropriate for rendering on the same plane—the plane of the diagram itself—the spatial arrangement of the objects under consideration. We will investigate such “graphical choices” made in Theodosius’ Spherics from antiquity to the Renaissance. Rather than undertaking a minute analysis of every particular element or single variant, we will try to uncover the more general message each author attempted to convey through his particular graphical choices. From this analysis, it emerges that the different kinds of representation are not the result of merely formal requirements but mirror substantial geometrical requirements expressing different ways of interpreting the sphere and testify to different ways of reasoning about the elements that interact on it.

12.
Whitlock and Queen (1998) developed a dynamic graphical model for forecasting traffic flows at a number of sites at a busy traffic junction in Kent, UK. Some of the data collection sites at this junction have been faulty over the data collection period and so there are missing series in the multivariate problem. Here we adapt the model developed in Whitlock and Queen (1998) to accommodate these missing data. Markov chain Monte Carlo methods are used to provide forecasts of the missing series, which in turn are used to produce forecasts for some of the other series. The methods are used on part of the network and shown to be very promising. Copyright © 2000 John Wiley & Sons, Ltd.

13.
This article discusses methods of inductive inference that are methods of visualization, designed in such a way that the “eye” can be employed as a reliable tool for judgment. The term “eye” is used as a stand-in for visual cognition and perceptual processing. In this paper “meaningfulness” has a particular meaning, namely accuracy, which is closeness to truth. Accuracy consists of precision and unbiasedness. Precision is dealt with by statistical methods, but for unbiasedness one needs expert judgment. The common view at the beginning of the twentieth century was to make the most efficient use of this kind of judgment by representing the data in shapes and forms in such a way that the “eye” can function as a reliable judge to reduce bias. The need for judgment of the “eye” is even more necessary when the background conditions of the observations are heterogeneous. Statistical procedures require a certain minimal level of homogeneity, but the “eye” does not. The “eye” is an adequate tool for assessing topological similarities when, due to heterogeneity of the data, metric assessment is not possible. In fact, graphical assessment precedes measurement, or to put it more forcefully, the graphic method is a necessary prerequisite for measurement.

14.
A combination of VAR estimation and state space model reduction techniques is examined by Monte Carlo methods in order to find good, simple to use, procedures for determining models which have reasonable prediction properties. The presentation is largely graphical. This helps focus attention on the aspects of the model determination problem which are relatively important for forecasting. One surprising result is that, for prediction purposes, knowledge of the true structure of the model generating the data is not particularly useful unless parameter values are also known. This is because the difficulty in estimating parameters of the true model causes more prediction error than results from a more parsimonious approximate model.

15.
We present a methodology for estimation, prediction, and model assessment of vector autoregressive moving-average (VARMA) models in the Bayesian framework using Markov chain Monte Carlo algorithms. The sampling-based Bayesian framework for inference allows for the incorporation of parameter restrictions, such as stationarity restrictions or zero constraints, through appropriate prior specifications. It also facilitates extensive posterior and predictive analyses through the use of numerical summary statistics and graphical displays, such as box plots and density plots for estimated parameters. We present a method for computationally feasible evaluation of the joint posterior density of the model parameters using the exact likelihood function, and discuss the use of backcasting to approximate the exact likelihood function in certain cases. We also show how to incorporate indicator variables as additional parameters for use in coefficient selection. The sampling is facilitated through a Metropolis–Hastings algorithm. Graphical techniques based on predictive distributions are used for informal model assessment. The methods are illustrated using two data sets from business and economics. The first example consists of quarterly fixed investment, disposable income, and consumption rates for West Germany, which are known to have correlation and feedback relationships between series. The second example consists of monthly revenue data from seven different geographic areas of IBM. The revenue data exhibit seasonality, strong inter-regional dependence, and feedback relationships between certain regions. © 1997 John Wiley & Sons, Ltd.

16.
Operational frameworks are very useful for studying the foundations of quantum mechanics, and are sometimes used to promote antirealist attitudes towards the theory. The aim of this paper is to review three arguments aimed at defending an antirealist reading of quantum physics based on various developments of standard quantum mechanics appealing to notions such as quantum information, non-causal correlations and indefinite causal orders. Those arguments will be discussed in order to show that they are not convincing. Instead, it is argued that there is conceptually no argument that could favour realist or antirealist attitudes towards quantum mechanics based solely on some features of some formalism. In particular, both realist and antirealist views are well accommodated within operational formulations of the theory. The reason for this is that the realist/antirealist debate is located at a purely epistemic level, which is not engaged by formal aspects of theories. As such, operational formulations of quantum mechanics are epistemologically and ontologically neutral. This discussion aims at clarifying the limits of the historical and methodological affinities between scientific antirealism and operational physics while engaging with recent discoveries in quantum foundations. It also aims at presenting various realist strategies to account for those developments.

17.
Challenges Facing Machine Learning
This paper discusses several challenges currently facing machine learning, including high-dimensional feature spaces and data volume, the computational difficulty of very large data sets, the difficulty of finding optimal solutions, and poor interpretability. It then analyzes several important issues of wide current interest, such as big data, deep learning, and probabilistic graphical models, with the aim of provoking deeper thought.

18.
Community science—scientific investigation conducted partly or entirely by non-professional scientists—has many advantages. For example, community science mobilizes large numbers of volunteers who can, at low cost, collect more data than traditional teams of professional scientists. Participation in research can also increase volunteers’ knowledge about and appreciation of science. At the same time, there are worries about the quality of data that community science projects produce. Can the work of non-professionals really deliver trustworthy results? Attempts to answer this question generally compare data collected by volunteers to data collected by professional scientists. When volunteer data is more variable or less accurate than professionally collected data, then the community science project is judged to be inferior to traditional science. I argue that this is not the right standard to use when evaluating community science, because it relies on a false assumption about the aims of science. I show that if we adopt the view that science has diverse aims which are often in tension with one another, then we cannot justify holding community science data to an expert accuracy standard. Instead, we should evaluate the quality of community science data based on its adequacy-for-purpose.

19.
A similarity‐based classification model is proposed whereby densities of positive and negative returns in a delay‐embedded input space are estimated from a graphical representation of the data using an eigenvector centrality measure, and subsequently combined under Bayes' theorem to predict the probability of upward/downward movements. Application to directional forecasting of the daily close price of the Dow Jones Industrial Average over a 20‐year out‐of‐sample period yields performance superior to random walk and logistic regression models, and on a par with that of multilayer perceptrons. A feature of the classifier is that it is parameter free, parameters entering the model only via the measure used to determine pairwise similarity between data points. This allows intuitions about the nature of time series to be elegantly integrated into the model. The recursive nature of eigenvector centrality makes it better able to deal with sparsely populated input spaces than conventional approaches based on density estimation. Copyright © 2013 John Wiley & Sons, Ltd.
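The eigenvector centrality step itself can be computed by power iteration on a non-negative pairwise-similarity matrix. The sketch below is a stdlib illustration of that step only, on a toy similarity matrix of our own; it is not the paper's delay-embedded classifier. The identity shift is added so the iteration also converges on bipartite similarity graphs, where plain power iteration oscillates:

```python
def eigenvector_centrality(sim, iters=200):
    """Eigenvector centrality of a non-negative similarity matrix.

    Power iteration on (A + I): the identity shift leaves the principal
    eigenvector unchanged but prevents oscillation when the similarity
    graph is bipartite.
    """
    n = len(sim)
    v = [1.0 / n] * n
    for _ in range(n and iters):
        w = [v[i] + sum(sim[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(w)
        v = [x / norm for x in w]
    return v

# Toy similarity matrix (hypothetical): point 0 is similar to both others,
# which are dissimilar to each other, i.e. a small star graph.
sim = [[0, 1, 1],
       [1, 0, 0],
       [1, 0, 0]]
c = eigenvector_centrality(sim)
# The hub (point 0) receives the largest centrality score, so it would be
# read as sitting in the densest part of the input space.
```

High-centrality points are those similar to many other (themselves central) points, which is how the classifier turns centrality into a density-like score without explicit density estimation.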

20.
Curie’s Principle says that any symmetry property of a cause must be found in its effect. In this article, I consider Curie’s Principle from the point of view of graphical causal models, and demonstrate that, under one definition of a symmetry transformation, the causal modeling framework does not require anything like Curie’s Principle to be true. On another definition of a symmetry transformation, the graphical causal modeling formalism does imply a version of Curie’s Principle. These results yield a better understanding of the logical landscape with respect to the relationship between Curie’s Principle and graphical causal modeling.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号