Similar Documents
20 similar documents found (search time: 40 ms)
1.
E. C. Jazwinska, K. Adam. Experientia, 1985, 41(12): 1533-1535
Sleep deprivation was associated with decreased stature and it blunted the normal 24-h rhythm in young and in middle-aged men. Loss in stature was regained during the first recovery night of sleep. The 24-h rhythm in height is not an endogenous circadian rhythm but depends upon the periods of recumbency over the sleep/wake cycle.

2.
Sleep deprivation was associated with decreased stature and it blunted the normal 24-h rhythm in young and in middle-aged men. Loss in stature was regained during the first recovery night of sleep. The 24-h rhythm in height is not an endogenous circadian rhythm but depends upon the periods of recumbency over the sleep/wake cycle. Acknowledgments: The first author was a Medical Research Council scholar; the second was supported by the Scottish Hospital Endowments Research Trust.

3.
A review of the development of nanoimprint lithography (cited by: 5; self-citations: 0; citations by others: 5)
Nanotechnology is expected to revolutionize many aspects of human life in the 21st century. It did not appear overnight; it has grown out of technologies developed over many years, the same foundations that gave us microchips and other micrometre-scale products. Any nanotechnology depends on nanofabrication techniques to machine objects down to the nanometre scale, and many techniques capable of fabricating features below 100 nm have been developed. Nanoimprint lithography is one of the most promising of these, offering low cost, high throughput, and high resolution. This paper reviews the development of nanoimprint lithography: it describes the basic principles, surveys recent advances, and places particular emphasis on the industrialization of the technique. We hope this review will draw the attention of domestic industry and academia and encourage the development of nanoimprint technology in China.

4.
Energy expenditure was investigated in 15 patients with liver cirrhosis and 20 healthy controls by three methods: indirect calorimetry, anthropometry using the Harris-Benedict equation, and bioelectrical impedance analysis. Energy expenditure was expressed in kcal/day, kcal/kg BW/day (BW = body weight), kcal/kg LBM/day (LBM = lean body mass, derived by bioelectrical impedance analysis), or kcal/m2/day. We found no statistical differences between the resting energy expenditure of patients with cirrhosis of the liver and that of healthy controls, whichever method we used, nor between the values obtained by indirect calorimetry, anthropometry, and bioelectrical impedance analysis. There was a significant correlation between indirect calorimetry and anthropometry in both groups, whereas the correlation between indirect calorimetry and bioelectrical impedance analysis was significant in the control group only. We conclude that (1) the resting energy expenditure of patients with cirrhosis of the liver is not changed compared with healthy controls, and (2) bioelectrical impedance is a useful method for calculating the body composition from which energy expenditure is derived; however, it gives an appropriate result only in healthy people, and only approximate values in patients with cirrhosis.

5.
In service composition scenarios, selecting an optimal composition under SLA constraints is a key foundation of service quality management. Service composition selection is both a combinatorial optimization problem and a multi-criteria decision problem: it requires efficient algorithms that can handle large sets of candidate composition plans, and an effective evaluation model that clearly defines how candidates compare, so as to support the decision maker. Existing selection approaches mostly define their optimization objective with one of two models: a linear utility function or Pareto optimality. The former requires quantified weights to define the utility function, but specifying precise weights is a very difficult task for users, especially when many QoS dimensions are involved. The latter needs no weight configuration and returns the Pareto-optimal skyline set to the user, but the size of the skyline is uncontrolled and grows significantly with the size of the problem; an overly large result set clearly offers little effective guidance to the decision maker. To address these shortcomings of the traditional methods, this paper introduces the PROMETHEE method into service composition selection. Combining the Pareto and PROMETHEE evaluation models, we take the top-k PROMETHEE-optimal plans within the skyline set as the optimization objective, and propose and implement an efficient genetic algorithm, P-MOEA. For large-scale problems, the algorithm efficiently returns the set of top-k PROMETHEE-optimal composition plans, providing an effective basis for further decision making. Experiments verify the efficiency and effectiveness of the algorithm.
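The two evaluation models combined in this abstract, Pareto skyline filtering followed by PROMETHEE II ranking, can be sketched as follows. This is only an illustration of the selection criterion, not the paper's method: P-MOEA is a genetic algorithm for large candidate spaces, whereas this sketch simply enumerates a tiny, hypothetical set of QoS vectors; the weights and the "usual" preference function are also illustrative assumptions.

```python
# Illustrative sketch: Pareto skyline filtering, then PROMETHEE II ranking
# of the surviving plans. QoS vectors are (response time, cost), both
# to be minimized; all data and weights are hypothetical.

def dominates(a, b):
    """Pareto dominance for minimization: a is at least as good on every
    criterion and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(candidates):
    """Keep only the candidates not dominated by any other candidate."""
    return [c for i, c in enumerate(candidates)
            if not any(dominates(o, c) for j, o in enumerate(candidates) if j != i)]

def promethee_net_flows(candidates, weights):
    """PROMETHEE II net flows with the 'usual' preference function:
    full preference on a criterion whenever one plan strictly beats the other."""
    n = len(candidates)
    flows = []
    for i, a in enumerate(candidates):
        plus = minus = 0.0
        for j, b in enumerate(candidates):
            if i == j:
                continue
            plus += sum(w for w, x, y in zip(weights, a, b) if x < y)
            minus += sum(w for w, x, y in zip(weights, a, b) if y < x)
        flows.append((plus - minus) / (n - 1))
    return flows

def top_k(candidates, weights, k):
    """Top-k PROMETHEE-optimal plans within the Pareto skyline."""
    sky = skyline(candidates)
    ranked = sorted(zip(promethee_net_flows(sky, weights), sky),
                    key=lambda t: -t[0])
    return [plan for _, plan in ranked[:k]]

# Hypothetical candidate compositions: (response time, cost).
plans = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
print(top_k(plans, weights=[0.7, 0.3], k=2))  # → [(1.0, 5.0), (2.0, 3.0)]
```

Note that the skyline here contains three mutually non-dominated plans that a Pareto model alone cannot rank; even rough, unequal weights let PROMETHEE induce a controlled top-k ordering over that otherwise unranked set, which is the combination the paper exploits.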

6.
Although brain tumours have been documented and recorded since the nineteenth century, 2016 marked 90 years since Percival Bailey and Harvey Cushing coined the term “glioblastoma multiforme”. Since that time, although extensive developments in diagnosis and treatment have been made, relatively little improvement in prognosis has been achieved. The resilience of GBM thus makes treating this tumour one of the biggest challenges currently faced by neuro-oncology. Aggressive and robust growth, coupled with the difficulties of complete resection, drug delivery and therapeutic resistance to treatment, are some of the main issues that this nemesis presents today. Current treatments are far from satisfactory, with poor prognosis, and focus on palliative management rather than curative intervention. However, therapeutic research leading to developments in novel treatment stratagems shows promise in combating this disease. Here we present a review of GBM, looking at the history and advances which have shaped neurosurgery over the last century and which culminate in the present-day management of GBM, while also exploring future perspectives on treatment options that could lead to new treatments on the road to a cure.

7.
This paper proposes a new approach to forecasting intermittent demand by considering the effects of external factors. We classify intermittent demand data into two parts, zero values and nonzero values, and fit the nonzero values with a mixed zero-truncated Poisson model. All the parameters in this model are obtained by an EM algorithm, which regards the external factors as independent variables of a logistic regression model and a log-linear regression model. We then calculate the probability of a zero value occurring at each period and predict demand occurrence by comparing it with a critical value. When demand occurs, we use the weighted average of the mixed zero-truncated Poisson model as the predicted nonzero demand, which is combined with the predicted demand occurrences to form the final forecast demand series. Two performance measures are developed to assess the forecasting methods. Through a case study of electric power material from the State Grid Shanghai Electric Power Company in China, we show that our approach provides greater forecasting accuracy than the Poisson model, the hurdle shifted Poisson model, the hurdle Poisson model, and Croston's method.
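The two-part forecast described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's fitted model: the logistic and log-linear coefficients below are hypothetical placeholders for what the EM algorithm would estimate, and a single-component zero-truncated Poisson stands in for the mixture.

```python
import math

def occurrence_prob(x, beta):
    """Logistic regression on external factors x: P(demand occurs)."""
    z = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

def zero_truncated_poisson_mean(lam):
    """E[Y | Y > 0] for Y ~ Poisson(lam): lam / (1 - exp(-lam))."""
    return lam / (1.0 - math.exp(-lam))

def forecast(x, beta, gamma, critical=0.5):
    """Forecast zero unless the occurrence probability exceeds the
    critical value; otherwise return the expected nonzero demand."""
    if occurrence_prob(x, beta) <= critical:
        return 0.0
    # Log-linear model for the Poisson rate: log(lam) = gamma . (1, x).
    lam = math.exp(gamma[0] + sum(g * xi for g, xi in zip(gamma[1:], x)))
    return zero_truncated_poisson_mean(lam)

# Hypothetical external factor and coefficients.
print(forecast([1.0], beta=[0.0, 1.0], gamma=[0.0, 0.0]))   # ≈ 1.582
print(forecast([-1.0], beta=[0.0, 1.0], gamma=[0.0, 0.0]))  # 0.0
```

The zero-truncation matters: conditioning on demand being positive shifts the expected size above the raw Poisson mean (here 1.582 versus 1.0), which is exactly why nonzero periods are modelled separately from occurrence.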

8.
Quantum correlation is an important nonclassical property of quantum systems; the widely studied quantum entanglement is one important quantum correlation in quantum information processing. As research has deepened, many nonclassical phenomena that do not require entanglement have recently been found to play key roles in quantum information. This article introduces quantum discord and related nonclassical quantum correlations, and discusses the physical interpretations and applications of the various quantum correlations in quantum information models.

9.
Pierre Duhem's (1861-1916) lifelong opposition to 19th century atomic theories of matter has been traditionally attributed to his conventionalist and/or positivist philosophy of science. Relatively recently, this traditional position has been challenged by the claim that Duhem's opposition to atomism was due to the precarious state of atomic theories during the beginning of the 20th century. In this paper I present some of the difficulties with both the traditional and the new interpretation of Duhem's opposition to atomism and provide a new framework in which to understand his rejection of atomic hypotheses. I argue that although not positivist, instrumentalist, or conventionalist, Duhem's philosophy of physics was not compatible with belief in unobservable atoms and molecules. The key to understanding Duhem's resistance to atomism during the final phase of his career is the historicist arguments he presented in support of his ideal of physics.

10.
This paper analyses the revival of Pliny's Naturalis historia within the scientific culture of the late eighteenth and early nineteenth centuries, focusing on a French effort to produce an edition with annotations by scientists and scholars. Between the Renaissance and the early eighteenth century, the Naturalis historia had declined in scientific importance. Increasingly, it was relegated to the humanities, as we demonstrate with a review of editions. For a variety of reasons, however, scientific interest in the Naturalis historia grew in the second half of the eighteenth century. Epitomizing this interest was a plan for a scientifically annotated, Latin-French edition of the Naturalis historia. Initially coordinated by the French governmental minister Malesherbes in the 1750s, the edition was imperfectly realized by Poinsinet a few decades later. It was intended to rival two of the period's other distinguished multi-volume books of knowledge, Diderot and D'Alembert's Encyclopédie and Buffon's Histoire naturelle, to which we compare it. Besides narrating the scientific revival of the Naturalis historia during this period, we examine its causes and the factors contributing to its end in the first half of the nineteenth century.

11.
Early observations of the southern celestial sky were reported in many sixteenth-century books and compilations of voyages of discovery. Here we analyse these accounts in order to find out what was really seen and reported by the first navigators. Our analysis has resulted in new interpretations of the phenomena reported by Amerigo Vespucci and Andreas Corsali, allowing a reassessment of the discovery of the Coalsack Nebula, the Magellanic Clouds, and the Southern Cross. From a comparative review of the observations of the latter constellation as published between 1500 and 1600, we demonstrate that only questionable records found their way into contemporary compilations of voyages of discovery, and that as a result public knowledge about this constellation at the end of the sixteenth century was entirely unreliable. Another problem we discuss is that although the stars of the Southern Cross were the first to be discovered, and were observed again and again by many navigators, it was not until 1678 that their proper positions appeared in stellar atlases and star catalogues accessible to astronomers. We explain how negligence of, and subsequently confidence in, Ptolemy's astronomy by the early navigators and cartographers, respectively, was at the root of this amazingly long-lasting gap in knowledge of the southern celestial sky.

12.
In the case of the US national accounts, the data are revised for the first few years and every decade, which implies that we never really have the final data. In this paper we aim to predict the final data, using the preliminary data and/or the revised data. The following predictors are introduced, derived in the context of the non-linear filtering or smoothing problem: (1) prediction of the final data of time t given the preliminary data up to time t-1; (2) prediction of the final data of time t given the preliminary data up to time t; (3) prediction of the final data of time t given the preliminary data up to time T; (4) prediction of the final data of time t given the revised data up to time t-1 and the preliminary data up to time t-1; and (5) prediction of the final data of time t given the revised data up to time t-1 and the preliminary data up to time t. It is shown that (5) is the best predictor but not too different from (3). The prediction problem is illustrated using US per capita consumption data.

13.
Physics first became established in Australia and Japan in the same period, during the final quarter of the nineteenth century and the first years of the twentieth. A comparison of the processes by which this happened in these two developing countries on the Pacific rim shows that, despite the great cultural differences that existed, and that might have been expected to produce major differences in national receptiveness to the new science, there were in fact many parallels between the patterns of development in the two cases. Identifying these enables us to draw attention to a number of significant features of the physics discipline more generally in this period. Such differences as do emerge in the early history of physics in the two countries seem to have arisen more from the different political situations that prevailed than from anything else; in particular, they reflect the fact that Australia was part of the British Empire while Japan was an independent political power.

14.
In this paper we examine the contributions of the Italian geometrical school to the foundations of projective geometry. Starting from De Paolis' work, we discuss some papers by Segre, Peano, Veronese, Fano and Pieri. In particular, we try to show how a totally abstract and general point of view was clearly adopted by the Italian scholars many years before the publication of Hilbert's Grundlagen. We are particularly interested in the interrelations between the Italian and German schools (mainly the influence of Staudt's and Klein's works). We also try to understand the reasons for the steady decline of the Italian school during the twentieth century. (Received February 25, 2000)

15.
The large-gene problem in vertebrate gene annotation (cited by: 2; self-citations: 0; citations by others: 2)
To identify protein-coding genes, annotation pipelines combine two approaches: ab initio gene prediction and similarity comparison against known genes. Ab initio prediction has a high false-positive rate but a very low false-negative rate; by contrast, incorporating similarity-based evidence lowers the false-positive rate but greatly raises the false-negative rate. We found that the most important factor associated with prediction accuracy is gene size (introns included): large genes are especially prone to prediction errors.

16.
We all know that, nowadays, physics and philosophy are housed in separate departments on university campuses. They are distinct disciplines with their own journals and conferences, and in general they are practiced by different people, using different tools and methods. We also know that this was not always the case: up until the early 17th century (at least), physics was a part of philosophy. So what happened? And what philosophical lessons should we take away? We argue that the split took place long after Newton's Principia (rather than before, as many standard accounts would have it), and offer a new account of the philosophical reasons that drove the separation. We argue that one particular problem, dating back to Descartes and persisting long into the 18th century, played a pivotal role. The failure to solve it, despite repeated efforts, precipitates a profound change in the relationship between physics and philosophy. The culprit is the problem of collisions. Innocuous though it may seem, this problem becomes the bellwether of deeper issues concerning the nature and properties of bodies in general. The failure to successfully address the problem led to a reconceptualization of the goals and subject-matter of physics, a change in the relationship between physics and mechanics, and a shift in who had authority over the most fundamental issues in physics.

17.
Conclusion 79. This study of the interaction between mechanics and differential geometry does not pretend to be exhaustive. In particular, there is probably more to be said about the mathematical side of the history, from Darboux to Ricci and Levi-Civita and beyond. Statistical mechanics may also be of interest, and there is definitely more to be said about Hertz (I plan to continue in this direction) and about Poincaré's geometric and topological reasoning, for example about the three-body problem [Poincaré 1890] (cf. also [Poincaré 1993], [Andersson 1994] and [Barrow-Green 1994]). Moreover, it would be interesting to find out how the 19th-century ideas discussed here influenced developments in the 20th century; Einstein himself is a hotly debated case. Yet, despite these shortcomings, I hope that this paper has shown that the interaction between mechanics and differential geometry is not a 20th-century invention. Klein's view (see my Introduction) that Riemannian geometry grew out of mechanics, more specifically the principle of least action, cannot be maintained. On the other hand, when Riemannian geometry became known around 1870 it was immediately used in mechanics by Lipschitz. He began a continued tradition in this field, which had several elements in common with the new view of mechanics conceived by the physicists and explicitly carried out by Hertz. Before 1870 we found only scattered interactions between differential geometry and mechanics, and direct ones only for systems of two or three degrees of freedom. For more degrees of freedom the geometrical ideas were in some interesting cases taken over by analogy, but these analogies did not lead to the formal introduction of geometries of more than three dimensions.

18.
By the middle of the nineteenth century, the opinion of science, as well as of philosophy and even religion, was, at least in Britain, firmly in the camp of the plurality of worlds, the view that intelligent life exists on other celestial bodies. William Whewell, considered an expert on science, philosophy and religion (among other areas), would have been expected to support this position. Yet he surprised everyone in 1853 by publishing a work arguing strongly against the plurality view. This was even stranger given that he had endorsed pluralism twenty years earlier in his contribution to the Bridgewater Treatises. In this paper I show that the shift in Whewell’s view was motivated by three factors: the influence of Richard Owen’s theory of archetypes on Whewell’s view of the argument from design, and Whewell’s perception of the need to strengthen such arguments in light of evolutionary accounts of human origins; important developments in his view of philosophy and his role as a scientific expert; and new findings in astronomy. An examination of the development of Whewell’s position provides a lens through which we can view the interplay of theology, philosophy and science in the plurality of worlds debate.

19.
This paper uses Markov switching models to capture volatility dynamics in exchange rates and to evaluate their forecasting ability. We identify that increased volatilities in four euro-based exchange rates are due to underlying structural changes. We also find that the currencies are closely related to each other, especially in high-volatility periods, when cross-correlations increase significantly. Using a Markov switching Monte Carlo approach we provide evidence in favour of Markov switching models, rejecting the random walk hypothesis. Testing in-sample and out-of-sample Markov trading rules based on Dueker and Neely (Journal of Banking and Finance, 2007), we find that the econometric methodology is able to forecast exchange rate movements accurately. When applied to euro/US dollar and euro/British pound daily returns, the model provides exceptional out-of-sample returns; when applied to the euro/Brazilian real and the euro/Mexican peso, however, the model loses power. The higher volatility of the Latin American currencies seems to be a critical factor in this failure.

20.
Most economic forecast evaluations dating back 20 years show that professional forecasters add little to the forecasts generated by the simplest of models. Using various types of forecast error criteria, these evaluations usually conclude that the professional forecasts are little better than no-change or ARIMA-type forecasts. It is our contention that this conclusion is mistaken, because the conventional error criteria may not capture why forecasts are made or how they are used. Using forecast directional accuracy, the criterion which has been found to be highly correlated with profits in an interest rate setting, we find that professional GNP forecasts dominate the cheaper alternatives. Moreover, there appears to be no systematic relationship between this preferred criterion and the error measures used in previous studies.
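The directional-accuracy criterion discussed in this abstract can be computed as in the following sketch. The convention used here, measuring both the actual and the forecast change from the last observed value and skipping no-change periods, is one common choice and not necessarily the authors' exact definition; the data are hypothetical.

```python
def directional_accuracy(actual, forecast):
    """Fraction of periods in which the forecast change from the last
    observed value has the same sign as the actual change."""
    hits = total = 0
    for t in range(1, len(actual)):
        actual_change = actual[t] - actual[t - 1]
        forecast_change = forecast[t] - actual[t - 1]
        if actual_change == 0 or forecast_change == 0:
            continue  # skip no-change periods; conventions for these vary
        total += 1
        if (actual_change > 0) == (forecast_change > 0):
            hits += 1
    return hits / total if total else float("nan")

# Hypothetical series: the forecast gets 2 of 4 directions right.
actual = [100, 102, 101, 104, 103]
predicted = [100, 101, 103, 103, 105]
print(directional_accuracy(actual, predicted))  # → 0.5
```

Unlike squared- or absolute-error measures, this criterion rewards only the sign of the predicted movement, which is why a forecast with larger numerical errors can still dominate a no-change benchmark when direction is what drives profits.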


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)