Listing 1 - 10 of 396 | << page >> |
Traffic flow. --- Traffic estimation --- Computer simulation.
Flooding has increased significantly worldwide in recent years because of climate change, and the economic losses due to flooding have risen even more sharply over the last few decades. Within flood risk management, it is necessary to estimate flood losses and to adopt best practices for the collection, storage, and analysis of flood damage data in order to develop risk mitigation strategies for severe flood events. This study presents one such best practice for collecting and estimating flood damage data for residential buildings through field surveys. The study was divided into two phases: (1) a pilot study to understand real field conditions, identify strengths and weaknesses in the survey questionnaire, and improve the field strategy; and (2) a detailed study that built on the experience of the pilot and conducted large-scale field surveys with the improved field strategy, using a well-structured paper-based questionnaire. Through the field surveys, data were collected on the socio-demographic characteristics of the population and on damage information, including building features, hazard variables, building damage cost, building damage extent, financial compensation, precautionary measures, and warning systems. The collected flood damage data were encoded in Moodle, and a Python script that compared timestamps was used to detect discrepancies between the encoding and verification phases; most graphs were generated with readily available Python scripts. The graphs were analysed and interpreted to develop relationships and dependencies between the different variables and building features, and conclusions were drawn at the end of the study.
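The timestamp-based consistency check between the encoding and verification phases described above can be sketched as follows. This is a minimal illustration, not the study's actual script: the record structure, field names, and sample values are all assumptions.

```python
from datetime import datetime

# Hypothetical encoded and verified records keyed by a building ID.
# Each record: (timestamp, damage_cost) -- both fields are assumed for illustration.
encoded = {
    "B001": ("2021-06-01 10:00", 12000),
    "B002": ("2021-06-01 10:05", 8500),
}
verified = {
    "B001": ("2021-06-02 09:00", 12000),
    "B002": ("2021-06-01 09:00", 9000),  # verified before it was encoded: suspicious
}

def find_discrepancies(encoded, verified):
    """Flag records whose verification predates encoding, or whose values differ."""
    fmt = "%Y-%m-%d %H:%M"
    issues = []
    for key, (t_enc, v_enc) in encoded.items():
        t_ver, v_ver = verified[key]
        if datetime.strptime(t_ver, fmt) < datetime.strptime(t_enc, fmt):
            issues.append((key, "verification precedes encoding"))
        elif v_enc != v_ver:
            issues.append((key, "value mismatch"))
    return issues

print(find_discrepancies(encoded, verified))
```

A real pipeline would read both phases from the survey database export rather than in-memory dictionaries, but the timestamp comparison is the core of the check.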
Knowledge of the inverter-induced, switching-frequency current ripple of the machine currents is of great advantage for many control schemes of modern, highly utilized electrical machines. But how can the current ripple be measured reliably during operation? Is the very short duration of each switching state even sufficient for this? And how can the current ripple be described mathematically and interpreted physically? This book answers these questions.
This book addresses contemporary statistical inference issues when no or minimal assumptions are imposed on the nature of the studied phenomenon. Information theory methods play an important role in such scenarios. The approaches discussed include various high-dimensional regression problems, time series, and dependence analyses.
high-dimensional time series --- nonstationarity --- network estimation --- change points --- kernel estimation --- high-dimensional regression --- loss function --- random predictors --- misspecification --- consistent selection --- subgaussianity --- generalized information criterion --- robustness --- statistical learning theory --- information theory --- entropy --- parameter estimation --- learning systems --- privacy --- prediction methods --- misclassification risk --- model misspecification --- penalized estimation --- supervised classification --- variable selection consistency --- archimedean copula --- consistency --- estimation --- extreme-value copula --- tail dependency --- multivariate analysis --- conditional mutual information --- CMI --- information measures --- nonparametric variable selection criteria --- gaussian mixture --- conditional infomax feature extraction --- CIFE --- joint mutual information criterion --- JMI --- generative tree model --- Markov blanket --- minimum distance estimation --- maximum likelihood estimation --- influence functions --- adaptive splines --- B-splines --- right-censored data --- semiparametric regression --- synthetic data transformation --- time series
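As a minimal illustration of the information-theoretic quantities central to these methods, the sketch below computes the plug-in (maximum likelihood) estimate of Shannon entropy from a sample. This is illustrative only, not an excerpt from the book.

```python
import math
from collections import Counter

def empirical_entropy(samples):
    """Plug-in (maximum likelihood) estimate of Shannon entropy, in bits."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A balanced two-outcome sample has entropy 1 bit.
print(empirical_entropy(["H", "T", "H", "T"]))  # 1.0
```

The plug-in estimator is biased downward for small samples; the information-criterion and variable-selection methods listed above rely on more careful estimates of such quantities in high dimensions.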
Yi-Tang Lin presents the historical process by which statistics became the language of global health for local and international health organizations. Drawing on archival material from three continents, this study investigates efforts by public health schools, philanthropic foundations, and international organizations to turn numbers into an international language for public health. Lin shows how these initiatives produced an international network of public health experts who, across various socioeconomic and political contexts, opted for different strategies when it came to setting global standards and translating local realities into numbers. Focusing on China and Taiwan between 1917 and 1960, Lin examines the reception, adaptation, and appropriation of international health statistics. She presents the dynamic interplay between numbers, experts, and policy-making in international health organizations and administrations in China and Taiwan. This title is also available as Open Access.
Medical statistics. --- Medical policy. --- Public health. --- Health Policy --- Government Programs --- Foundations --- Statistics as Topic. --- Public Health --- MEDICAL / History. --- History, 20th Century. --- International Agencies --- history. --- statistics & numerical data. --- China. --- Community health --- Health services --- Public health policy --- Health care policy --- Medicine and state --- Science and state --- Social policy --- Health statistics --- Statistical methods --- Government policy --- Mainland China --- People's Republic of China
This book develops alternative methods to estimate the unknown parameters in stochastic volatility models, offering a new approach to testing model accuracy. While there is ample research on stochastic differential equation models driven by Brownian motion, based on discrete observations of the underlying diffusion process, these traditional methods often fail to estimate the unknown parameters of the unobserved volatility process. This text studies the second-order rate of weak convergence to normality to obtain refined inference results such as confidence intervals, as well as nontraditional continuous-time stochastic volatility models driven by fractional Lévy processes. By incorporating jumps and long memory into the volatility process, these new methods help better predict option pricing and stock market crash risk. Some simulation algorithms for numerical experiments are provided.
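To make the setting concrete, the sketch below simulates a standard Heston-type stochastic volatility model with an Euler–Maruyama discretisation. This is a generic textbook scheme under assumed parameter values, not the book's own algorithms, which additionally handle fractional Lévy drivers, jumps, and long memory.

```python
import math
import random

def simulate_heston(s0=100.0, v0=0.04, mu=0.05, kappa=1.5, theta=0.04,
                    xi=0.3, T=1.0, n=1000, seed=42):
    """Euler-Maruyama discretisation of a Heston-type model:
       dS = mu*S dt + sqrt(v)*S dW1,   dv = kappa*(theta - v) dt + xi*sqrt(v) dW2.
    Returns the simulated price path; v is truncated at zero to keep sqrt real."""
    rng = random.Random(seed)
    dt = T / n
    s, v = s0, v0
    path = [s]
    for _ in range(n):
        dw1 = rng.gauss(0.0, math.sqrt(dt))
        dw2 = rng.gauss(0.0, math.sqrt(dt))
        s += mu * s * dt + math.sqrt(max(v, 0.0)) * s * dw1
        v += kappa * (theta - v) * dt + xi * math.sqrt(max(v, 0.0)) * dw2
        path.append(s)
    return path

path = simulate_heston()
print(len(path), path[-1])
```

The core estimation difficulty the book addresses is visible here: only the price path `S` is observed, while the volatility path `v` that drives it remains latent.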
Nowadays, broadcasters must continually innovate to create compelling content that evokes emotions in viewers. Photo-realistic depth-of-field effects, augmented reality, or even 3D reconstruction could be ways to achieve this. Depth estimation is one of the video-understanding steps that enables these applications. In this master thesis, methods to estimate a depth map from a single image were explored. In particular, deep learning methods were preferred to those based on geometry or on sensors, so the solution provided by this work is entirely software-based. A custom model was created for a sports broadcast dataset. However, this dataset does not contain any ground-truth depth maps, so self-supervised monocular depth estimation methods were investigated. Specifically, training was performed with non-rectified, synchronized stereo images captured with uncalibrated cameras. In contrast to training on monocular sequences, the scenes are static and the only movement is that of the camera, which is the ideal configuration for self-supervised training. To our knowledge, however, we are the only ones to have attempted to train self-supervised monocular depth estimation methods on non-rectified stereo images captured with uncalibrated cameras. More precisely, in this work the intrinsic and extrinsic parameters of the cameras are estimated during training along with the depth map. Over the course of this research, we demonstrated the ability of self-supervised methods to predict coherent depth maps when trained on non-rectified stereo sports broadcast videos captured with uncalibrated cameras, and we addressed the limitations regarding the generation of depth maps under those conditions.
Deep learning --- computer vision --- depth estimation --- self-supervised --- monodepth2 --- stereo vision --- monocular vision --- Engineering, computing & technology > Computer science
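The self-supervised objective behind such methods compares a target view with a source view warped into it via the predicted depth and camera parameters; no ground-truth depth is needed. The toy sketch below shows only the photometric L1 term of that loss, on plain nested lists rather than tensors. All names here are illustrative assumptions, unrelated to the thesis code or to monodepth2's implementation.

```python
def photometric_l1(target, reconstructed):
    """Mean absolute photometric error between a target image and a
    source image warped (reconstructed) into the target viewpoint."""
    assert len(target) == len(reconstructed)
    h = len(target)
    w = len(target[0])
    total = sum(abs(target[i][j] - reconstructed[i][j])
                for i in range(h) for j in range(w))
    return total / (h * w)

# Toy 2x2 grayscale images with intensities in [0, 1].
target = [[0.0, 0.5], [1.0, 0.25]]
warped = [[0.1, 0.5], [0.9, 0.25]]
print(photometric_l1(target, warped))  # ≈ 0.05
```

In practice this term is combined with a structural similarity (SSIM) component and a smoothness prior, and minimising it jointly drives the depth network and, in this work, the estimated intrinsics and extrinsics.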
This book addresses contemporary statistical inference issues when no or minimal assumptions are imposed on the nature of the studied phenomenon. Information theory methods play an important role in such scenarios. The approaches discussed include various high-dimensional regression problems, time series, and dependence analyses.
Technology: general issues --- History of engineering & technology --- Mechanical engineering & materials --- high-dimensional time series --- nonstationarity --- network estimation --- change points --- kernel estimation --- high-dimensional regression --- loss function --- random predictors --- misspecification --- consistent selection --- subgaussianity --- generalized information criterion --- robustness --- statistical learning theory --- information theory --- entropy --- parameter estimation --- learning systems --- privacy --- prediction methods --- misclassification risk --- model misspecification --- penalized estimation --- supervised classification --- variable selection consistency --- archimedean copula --- consistency --- estimation --- extreme-value copula --- tail dependency --- multivariate analysis --- conditional mutual information --- CMI --- information measures --- nonparametric variable selection criteria --- gaussian mixture --- conditional infomax feature extraction --- CIFE --- joint mutual information criterion --- JMI --- generative tree model --- Markov blanket --- minimum distance estimation --- maximum likelihood estimation --- influence functions --- adaptive splines --- B-splines --- right-censored data --- semiparametric regression --- synthetic data transformation --- time series