This monograph highlights the connection between the theory of neutron transport and the theory of non-local branching processes. By detailing this frequently overlooked relationship, the authors provide readers with an entry point into several active areas, particularly applications related to general radiation transport. Cutting-edge research published in recent years is collected here for convenient reference. The book is organized into two parts: the first offers a modern perspective on the relationship between the neutron branching process (NBP) and the neutron transport equation (NTE), as well as some of the core results concerning the growth and spread of mass of the NBP. The second part generalizes some of the theory put forward in the first, offering proofs in a broader context in order to show why NBPs are as malleable as they appear to be. Stochastic Neutron Transport will be a valuable resource for probabilists, and may also be of interest to numerical analysts and engineers in the field of nuclear research.
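For a flavour of the kind of object involved, here is a minimal sketch of a one-dimensional branching particle system, not the book's model: each particle moves ballistically with a random velocity, lives an exponential lifetime, and on death is replaced by a random number of offspring at its current position. The unit rate, the ±1 velocities and the offspring distribution are illustrative assumptions.

```python
import random

# Minimal sketch (not the book's model): each particle lives an Exp(1)
# lifetime, moves ballistically with velocity +1 or -1, and on death is
# replaced by 0-3 offspring at its current position.  The rate and the
# offspring law are illustrative assumptions.

def simulate_branching(t_max=5.0, offspring_probs=(0.3, 0.2, 0.4, 0.1), seed=1):
    rng = random.Random(seed)
    # each particle is a tuple (birth_time, birth_position, velocity)
    alive = [(0.0, 0.0, rng.choice([-1.0, 1.0]))]
    positions_at_t_max = []
    while alive:
        birth, x0, v = alive.pop()
        death = birth + rng.expovariate(1.0)          # Exp(1) lifetime
        if death >= t_max:                            # survives to the horizon
            positions_at_t_max.append(x0 + v * (t_max - birth))
            continue
        x_death = x0 + v * (death - birth)
        k = rng.choices(range(4), weights=offspring_probs)[0]   # offspring count
        for _ in range(k):
            alive.append((death, x_death, rng.choice([-1.0, 1.0])))
    return positions_at_t_max

positions = simulate_branching()
print(len(positions), "particles alive at t = 5")
```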
Probabilities. --- Stochastic processes. --- Markov processes. --- Applied Probability. --- Probability Theory. --- Stochastic Processes. --- Markov Process. --- Probabilitats --- Processos estocàstics
This book explains the basic theory of Hilbert C*-modules in detail, covering a wide range of applications from generalized indices to module frames. At the center of the book, the Beurling-Deny criterion is characterized for operator-valued Dirichlet forms and quantum Markov semigroups, opening a new field of quantum probability research. The general scope of the book includes: the basic theory of Hilbert C*-modules; generalized indices and module frames; operator-valued Dirichlet forms; and quantum Markov semigroups. This book will be of value to scholars and graduate students in the fields of operator algebras, quantum probability and quantum information.
Operator theory. --- Functional analysis. --- Markov processes. --- Operator Theory. --- Functional Analysis. --- Markov Process. --- Hilbert modules.
Computer programming --- Fundamental algorithms --- Analysis of algorithms --- Data structures --- Computational methods --- Markov process --- Electronic digital computers --- Computer algorithms
This paper addresses several shortcomings in the productivity and markup estimation literature. Using Monte Carlo simulations, the analysis shows that the methods in Ackerberg, Caves and Frazer (2015) and De Loecker and Warzynski (2012) produce biased estimates of the impact of policy variables on markups and productivity. This bias stems from endogeneity due to the following: (1) the functional form of the production function; (2) the omission of demand shifters; (3) the absence of price information; (4) the violation of the Markov process for productivity; and (5) misspecification when marginal costs are excluded from the estimation. The paper addresses these concerns using a quasi-maximum likelihood approach and a generalized estimator for the production function, which produces unbiased estimates of the impact of regulation on markups and productivity. The paper therefore proposes a workaround for the identification problem raised in Bond, Hashemi, Kaplan and Zoch (2020), and an unbiased measure of productivity, by directly accounting for the joint impact of regulation on markups and productivity.
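To make the Markov assumption on productivity concrete, the sketch below simulates log productivity as an AR(1) (first-order Markov) process entering a Cobb-Douglas production function, and shows the transmission bias that arises when an input responds to productivity. All parameter values are arbitrary illustrative choices, not estimates from the paper, and the code is not the paper's estimator.

```python
import numpy as np

# Illustrative simulation (not the paper's estimator): log productivity omega
# follows a first-order Markov (AR(1)) process and enters a Cobb-Douglas
# production function in logs.  rho, beta_l, beta_k are arbitrary values.
rng = np.random.default_rng(0)
n_firms, n_periods = 1000, 10
rho, beta_l, beta_k = 0.8, 0.6, 0.3

omega = np.zeros((n_firms, n_periods))
for t in range(1, n_periods):
    omega[:, t] = rho * omega[:, t - 1] + rng.normal(0.0, 0.2, n_firms)

labour = rng.normal(2.0, 0.5, (n_firms, n_periods)) + 0.5 * omega  # input responds to omega
capital = rng.normal(3.0, 0.5, (n_firms, n_periods))
y = beta_l * labour + beta_k * capital + omega + rng.normal(0.0, 0.1, (n_firms, n_periods))

# Naive OLS of y on (labour, capital) overstates the labour coefficient because
# the input is correlated with unobserved productivity ("transmission bias").
X = np.column_stack([np.ones(y.size), labour.ravel(), capital.ravel()])
beta_hat = np.linalg.lstsq(X, y.ravel(), rcond=None)[0]
print("true beta_l = 0.6, naive OLS estimate =", round(beta_hat[1], 3))
```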
Competition Policy --- Enterprise Development and Reform --- Legal Regulation and Business Environment --- Markov Process --- Markups --- Private Sector Development --- Production Function --- Productivity --- Quasi-Maximum Likelihood --- Regulation
This book offers the reader a journey through the counterintuitive nature of Brownian motion under confinement. Diffusion is a universal phenomenon that controls a wide range of physical, chemical, and biological processes. The transport of spatially constrained molecules and small particles is ubiquitous in nature and technology and plays an essential role in many processes. Understanding the physics of diffusion under conditions of confinement is essential for a number of biological phenomena and potential technological applications in micro- and nanofluidics, among others. Studies of diffusion under confinement are typically difficult for young scientists and students to grasp because of the extensive background in diffusion processes, physics, and mathematics that is required. All of this information is provided in this book, which is essentially self-contained as a result of the authors' efforts to make it accessible to an audience of students from a variety of backgrounds. The book also provides the necessary mathematical details so students can follow the technical process required to solve each problem. Readers will also find detailed explanations of the main results based on the last 30 years of research devoted to studying diffusion under confinement. The authors approach the physical problem from various angles and discuss the role of geometries and boundary conditions in diffusion. This textbook serves as a comprehensive and modern overview of Brownian motion under confinement and is intended for young scientists, graduate students, and advanced undergraduates in physics, physical chemistry, biology, chemistry, chemical engineering, biochemistry, bioengineering, and polymer and material sciences.
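As a simple numerical illustration of the phenomenon the book studies, the sketch below simulates a Brownian particle confined to an interval with reflecting walls; the diffusion coefficient, box size and time step are illustrative values, not taken from the book. The mean-squared displacement grows linearly at short times and saturates at a value set by the confinement.

```python
import numpy as np

# Minimal sketch of confined diffusion (illustrative parameters): a Brownian
# particle on [0, L] with reflecting walls.  In free space the mean-squared
# displacement grows as 2*D*t; under confinement it saturates near L**2/12.
rng = np.random.default_rng(42)
D, dt, L = 1.0, 1e-3, 1.0          # diffusion coefficient, time step, box size
n_particles, n_steps = 5000, 4000

x = np.full(n_particles, L / 2)    # start all walkers at the centre
msd = np.empty(n_steps)
for i in range(n_steps):
    x += np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)
    # reflecting boundary conditions at 0 and L
    x = np.where(x < 0, -x, x)
    x = np.where(x > L, 2 * L - x, x)
    msd[i] = np.mean((x - L / 2) ** 2)

print("early-time MSD / (2 D t):", round(msd[9] / (2 * D * 10 * dt), 2))
print("long-time MSD (close to L**2/12 = 0.083):", round(msd[-1], 3))
```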
This textbook reconstructs the statistics curriculum from the perspective of posterior probability. In recent years, there have been several reports that the results of studies relying on significance tests cannot be reproduced, a problem known as the “reproducibility crisis”. For example, suppose we could reject the null hypothesis that “the average number of days to recovery in patients who took a new drug was the same as that in the control group”. However, rejecting the null hypothesis is only a necessary condition for the new drug to be effective. Even if the necessary condition is met, it does not follow that the new drug is effective, and in fact there are many cases where the effect is not reproduced. Rather than focusing on necessary conditions, sufficient conditions should be presented, such as “the average number of days until recovery in patients who take the new drug is sufficiently short compared to the control group, evaluated from a medical point of view”. This book reconstructs statistics from the perspective of PHC, i.e., the probability that a research hypothesis is correct. For example, the PHC curve shows the posterior probability that the statement “the average number of days until recovery for patients taking a new drug is at least θ days shorter than that of the control group” is correct, as a function of θ. Using the PHC curve makes it possible to discuss sufficient conditions, rather than necessary conditions, for a treatment to be effective. The value of statistical research should be evaluated with concrete indicators such as “90% probability of being at least 3 days shorter”, not abstract metrics like the p-value.
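For intuition, the following is a minimal sketch of a PHC curve under an assumed toy model: the posterior draws of the mean difference in recovery time are simply generated from a normal distribution with made-up location and scale, rather than obtained by the book's method. The PHC at each θ is then the share of posterior draws in which the difference is at least θ days.

```python
import numpy as np

# Toy sketch of a PHC curve: posterior draws of the mean difference in
# recovery time (control minus new drug) are assumed to be available; here
# they are simulated from a normal distribution with invented values.
rng = np.random.default_rng(0)
posterior_diff = rng.normal(loc=4.0, scale=1.5, size=20_000)   # assumed posterior draws

for theta in [0, 1, 2, 3, 4, 5]:
    phc = np.mean(posterior_diff >= theta)   # P("at least theta days shorter" is correct)
    print(f"PHC(theta = {theta} days) = {phc:.2f}")
```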
Social sciences --- Statistics. --- Markov processes. --- Mathematical statistics. --- Psychometrics. --- Statistics in Social Sciences, Humanities, Law, Education, Behavioral Sciences, Public Policy. --- Statistical Theory and Methods. --- Markov Process. --- Bayesian Inference. --- Parametric Inference. --- Statistical methods. --- Estadística matemàtica --- Probabilitats
This book explains the importance of using the probability that the hypothesis is correct (PHC), an intuitive measure that anyone can understand, as an alternative to the p-value. In order to overcome the “reproducibility crisis” caused by the misuse of significance tests, this book provides a detailed explanation of the mechanism of p-hacking using significance tests, and concretely shows the merits of PHC as an alternative to p-values. In March 2019, two impactful papers on statistics were published. One paper, “Moving to a World Beyond ‘p < 0.05’”, was featured in the scholarly journal The American Statistician, overseen by the American Statistical Association. The title of its first chapter is “Don't Say ‘Statistically Significant’”, and it uses the imperative form to clearly forbid the use of significance testing. Another paper, “Retire statistical significance”, was published in the prestigious scientific journal Nature. This commentary was endorsed by more than 800 scientists, who advocate the statement, “We agree, and call for the entire concept of statistical significance to be abandoned.” Consider a study comparing the duration of hospital stays between treatments A and B. Previously, research conclusions were typically stated as: “There was a statistically significant difference at the 5% level in the average duration of hospital stays.” This phrasing is quite abstract. Instead, we present conclusions such as the following: (1) the average duration of hospital stays for Group A is at least half a day shorter than for Group B; (2) 71% of patients in Group A have shorter hospital stays than the average for Group B; (3) the average hospital stay in Group A is no more than 94% of that in Group B. The probability that each such statement is correct is then presented; that is the PHC curve.
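As a rough illustration of how the probabilities behind statements (1)-(3) could be estimated, the sketch below uses assumed posterior draws of the two group means and a common within-group standard deviation; the numbers are invented for illustration and are not the book's data, and the normal model for individual stays is an extra assumption. Each probability is the fraction of posterior draws in which the corresponding statement holds.

```python
import numpy as np
from scipy.stats import norm

# Toy sketch (invented numbers, not the book's data): assumed posterior draws
# of the mean hospital stay in each group and a common within-group sd.
rng = np.random.default_rng(1)
mu_A = rng.normal(9.0, 0.3, 20_000)     # posterior of mean stay, treatment A
mu_B = rng.normal(10.0, 0.3, 20_000)    # posterior of mean stay, treatment B
sigma = rng.normal(2.0, 0.1, 20_000)    # posterior of within-group sd

# (1) mean stay in A is at least half a day shorter than in B
p1 = np.mean(mu_B - mu_A >= 0.5)
# (2) at least 70% of A patients stay shorter than B's mean
#     (per-draw proportion computed under a normal model for individual stays)
prop_below = norm.cdf((mu_B - mu_A) / sigma)
p2 = np.mean(prop_below >= 0.70)
# (3) the mean stay in A is no more than 94% of the mean stay in B
p3 = np.mean(mu_A / mu_B <= 0.94)
print(f"PHC(1) = {p1:.2f}   PHC(2) = {p2:.2f}   PHC(3) = {p3:.2f}")
```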
Social sciences --- Statistics. --- Markov processes. --- Mathematical statistics. --- Psychometrics. --- Statistics in Social Sciences, Humanities, Law, Education, Behavioral Sciences, Public Policy. --- Statistical Theory and Methods. --- Markov Process. --- Bayesian Inference. --- Parametric Inference. --- Statistical methods.
Ecology --- Balance of nature --- Biology --- Bionomics --- Ecological processes --- Ecological science --- Ecological sciences --- Environment --- Environmental biology --- Oecology --- Mathematics --- Biomathematics. Biometry. Biostatistics --- General ecology and biosociology --- Statistical methods --- Data processing --- Data collection --- Environmental sciences --- Population biology --- Markov process --- Structure analysis --- Cluster analysis --- Leslie matrix --- Monograph --- Correlation (Statistics) --- Multivariate analysis --- Spatial analysis (Statistics) --- Ecology - Mathematics --- Mathematical ecology --- Mathematical modeling
Financial econometrics has developed into a very fruitful and vibrant research area over the last two decades. The availability of good data promotes research in this area, especially aided by online data and high-frequency data. These two characteristics of financial data also create challenges for researchers that differ from classical macro-econometric and micro-econometric problems. This Special Issue is dedicated to research topics that are relevant for analyzing financial data. We have gathered six articles under this theme.
tuning parameter choice --- Markov process --- model averaging --- steady state distributions --- realized volatility --- threshold --- risk prices --- threshold auto-regression --- bond risk premia --- linear programming estimator --- volatility forecasting --- Bayesian inference --- asset price bubbles --- stationarity --- deviance information criterion --- model selection --- probability integral transform --- forecast comparisons --- Markov chain Monte Carlo --- explosive regimes --- multivariate nonlinear time series --- Tukey's power transformation --- affine term structure models --- Mallows criterion --- nonlinear nonnegative autoregression --- TVAR models --- stochastic conditional duration --- shrinkage