Listing 1 - 7 of 7
Featuring previously unpublished results, Semi-Markov Models: Control of Restorable Systems with Latent Failures describes a valuable methodology that readers can use to build mathematical models of a wide class of systems for various applications. In particular, this information can be applied to build models of reliability, queuing systems, and technical control. Beginning with a brief introduction to the area, the book covers semi-Markov models for different control strategies in one-component systems, defining their stationary characteristics of reliability and efficiency.
System analysis --- Markov processes. --- Mathematical models. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
Semi-Markov Processes: Applications in System Reliability and Maintenance offers a modern view of discrete-state-space, continuous-time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. It is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory.
Reliability (Engineering) --- Markov processes. --- Statistical methods. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Mathematical statistics
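The central quantity such semi-Markov reliability models deliver is the steady-state availability, which for a one-unit restorable system reduces to E[up-time] / (E[up-time] + E[down-time]) regardless of the sojourn-time distributions. A minimal Monte Carlo sketch (with Weibull operating times and lognormal repair times chosen purely for illustration, not taken from the book) checks the simulated availability against that limit:

```python
import numpy as np
from math import exp, gamma

rng = np.random.default_rng(42)

# Alternating renewal (two-state semi-Markov) model of one restorable unit:
# the distributions below are illustrative, not taken from the book.
n_cycles = 100_000
up = 100.0 * rng.weibull(1.5, n_cycles)      # operating times (Weibull, k = 1.5)
down = rng.lognormal(1.0, 0.5, n_cycles)     # repair times (lognormal)

# Monte Carlo estimate of the steady-state availability.
availability = up.sum() / (up + down).sum()

# Renewal-reward limit: E[up] / (E[up] + E[down]).
e_up = 100.0 * gamma(1.0 + 1.0 / 1.5)        # mean of the scaled Weibull
e_down = exp(1.0 + 0.5 ** 2 / 2.0)           # mean of the lognormal
limit = e_up / (e_up + e_down)
```

Because only the sojourn-time means enter the limit, swapping in other distributions with the same means leaves the long-run availability unchanged; that insensitivity is one reason semi-Markov models are attractive in maintenance planning.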
This book, written by two mathematicians from the University of Southern California, provides a broad introduction to the important subject of nonlinear mixture models from a Bayesian perspective. It contains background material, a brief description of Markov chain theory, and novel algorithms and their applications. It is self-contained and unified in presentation, which makes it ideal for use as an advanced textbook by graduate students and as a reference for independent researchers. The explanations in the book are detailed enough to capture the interest of the curious reader, and complete enough to provide the background needed to go further into the subject and explore the research literature. The authors present Bayesian methods of analysis for nonlinear, hierarchical mixture models with a finite, but possibly unknown, number of components. These methods are then applied to various problems, including population pharmacokinetics and gene expression analysis. In population pharmacokinetics, the nonlinear mixture model, based on previous clinical data, becomes the prior distribution for individual therapy. For gene expression data, one application included in the book is to determine which genes should be associated with the same component of the mixture (also known as a clustering problem). The book also contains examples of computer programs written in BUGS. This is the first book of its kind to cover many of the topics in this field.
Mathematical statistics --- Markov processes. --- Bayesian statistical decision theory. --- Nonparametric statistics. --- Multivariate analysis. --- Multivariate distributions --- Multivariate statistical analysis --- Statistical analysis, Multivariate --- Analysis of variance --- Matrices --- Distribution-free statistics --- Statistics, Distribution-free --- Statistics, Nonparametric --- Bayes' solution --- Bayesian analysis --- Statistical decision --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
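To make the mixture machinery concrete, here is a deliberately simplified sketch: a Gibbs sampler for a one-dimensional two-component normal mixture with known variance and a fixed number of components, far simpler than the nonlinear hierarchical models (and BUGS programs) the book itself develops:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a well-separated two-component normal mixture
# (all numbers here are illustrative, not from the book).
data = np.concatenate([rng.normal(-2.0, 1.0, 150), rng.normal(3.0, 1.0, 150)])

K = 2                       # number of components, taken as known here
sigma2 = 1.0                # known, shared component variance
mu = np.array([-1.0, 1.0])  # initial component means
w = np.full(K, 1.0 / K)     # mixture weights

for _ in range(300):
    # 1. Sample component assignments given current means and weights.
    log_lik = -0.5 * (data[:, None] - mu[None, :]) ** 2 / sigma2
    probs = w * np.exp(log_lik)
    probs /= probs.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=p) for p in probs])
    # 2. Sample each mean from its normal full conditional (flat prior).
    for k in range(K):
        members = data[z == k]
        if members.size:
            mu[k] = rng.normal(members.mean(), np.sqrt(sigma2 / members.size))
    # 3. Sample weights from their Dirichlet full conditional.
    w = rng.dirichlet(np.bincount(z, minlength=K) + 1)

mu.sort()  # fix label switching before reporting
```

After a few hundred sweeps the sampled means settle near the data-generating values; handling an unknown number of components, nonlinear link functions, and hierarchical priors is exactly the extra machinery the book supplies.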
This book presents the latest findings on stochastic dynamic programming models and on solving optimal control problems in networks. It includes the authors’ new findings on determining the optimal solution of discrete optimal control problems in networks and on solving game variants of Markov decision problems in the context of computational networks. First, the book studies Markov processes with a finite state space and reviews the existing methods and algorithms for determining their main characteristics, before proposing new approaches based on dynamic programming and combinatorial methods. Chapter Two is dedicated to infinite-horizon stochastic discrete optimal control models and Markov decision problems with average and expected total discounted optimization criteria, while Chapter Three develops a special game-theoretical approach to Markov decision processes and stochastic discrete optimal control problems. In closing, the book’s final chapter is devoted to finite-horizon stochastic control problems and Markov decision processes. The algorithms developed represent a valuable contribution to the important field of computational network theory.
Economics/Management Science. --- Operations Research/Decision Theory. --- Optimization. --- Operations Research, Management Science. --- Discrete Optimization. --- Algorithm Analysis and Problem Complexity. --- Economics. --- Computer software. --- Mathematical optimization. --- Operations research. --- Management --- Business & Economics --- Management Theory --- Markov processes. --- Dynamic programming. --- Stochastic control theory. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Business. --- Decision making. --- Algorithms. --- Management science. --- Business and Management. --- Control theory --- Stochastic processes --- Mathematical optimization --- Programming (Mathematics) --- Systems engineering --- Software, Computer --- Computer systems --- Optimization (Mathematics) --- Optimization techniques --- Optimization theory --- Systems optimization --- Mathematical analysis --- Maxima and minima --- Operations research --- Simulation methods --- System analysis --- Operational analysis --- Operational research --- Industrial engineering --- Management science --- Research --- System theory --- Algorism --- Algebra --- Arithmetic --- Quantitative business analysis --- Problem solving --- Statistical decision --- Deciding --- Decision (Psychology) --- Decision analysis --- Decision processes --- Making decisions --- Management decisions --- Choice (Psychology) --- Foundations --- Decision making
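The expected total discounted criterion mentioned above can be illustrated on a hand-made two-state, two-action Markov decision problem (the numbers below are invented for illustration and are not one of the book's network models), solved by standard value iteration:

```python
import numpy as np

# A hand-made MDP: 2 states, 2 actions (all numbers are illustrative).
# P[a, s, s'] = transition probability, R[a, s] = immediate reward.
P = np.array([[[0.9, 0.1],
               [0.4, 0.6]],
              [[0.2, 0.8],
               [0.7, 0.3]]])
R = np.array([[1.0, 0.0],
              [0.5, 2.0]])
gamma = 0.9  # discount factor for the expected total discounted criterion

V = np.zeros(2)
for _ in range(1000):
    # Bellman backup: Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
    Q = R + gamma * np.matmul(P, V)
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

policy = Q.argmax(axis=0)  # optimal action in each state
```

Policy iteration, or the dynamic-programming and combinatorial refinements the book proposes, would reach the same fixed point of the Bellman operator; value iteration is simply the shortest route to a runnable example.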
Two-armed response-adaptive clinical trials are modelled as Markov decision problems to pursue two overriding objectives: first, to identify the superior treatment at the end of the trial, and second, to keep the number of patients receiving the inferior treatment small. Such clinical trial designs are very important, especially for rare diseases. Thomas Ondra presents the main solution techniques for Markov decision problems and provides a detailed description of how to obtain optimal allocation sequences. Contents: Introduction to Markov Decision Problems and Examples; Finite and Infinite Horizon Markov Decision Problems; Solution Algorithms: Backward Induction, Value Iteration and Policy Iteration; Designing Response-Adaptive Clinical Trials with Markov Decision Problems. Target Groups: researchers and students in the fields of mathematics and statistics; professionals in the pharmaceutical industry. The Author: Thomas Ondra obtained his Master of Science degree in mathematics at the University of Vienna. He is a research assistant and PhD student at the Section for Medical Statistics of the Medical University of Vienna.
Mathematics. --- Computational Mathematics and Numerical Analysis. --- Probability Theory and Stochastic Processes. --- Analysis. --- Global analysis (Mathematics). --- Computer science --- Distribution (Probability theory). --- Mathématiques --- Analyse globale (Mathématiques) --- Informatique --- Distribution (Théorie des probabilités) --- Computer science -- Mathematics. --- Mathematics --- Physical Sciences & Mathematics --- Mathematics - General --- Markov processes. --- Statistical decision. --- Decision problems --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Mathematical analysis. --- Analysis (Mathematics). --- Computer mathematics. --- Probabilities. --- Game theory --- Operations research --- Statistics --- Management science --- Stochastic processes --- Distribution (Probability theory. --- Analysis, Global (Mathematics) --- Differential topology --- Functions of complex variables --- Geometry, Algebraic --- Distribution functions --- Frequency distribution --- Characteristic functions --- Probabilities --- Computer mathematics --- Discrete mathematics --- Electronic data processing --- 517.1 Mathematical analysis --- Mathematical analysis --- Probability --- Statistical inference --- Combinations --- Chance --- Least squares --- Mathematical statistics --- Risk --- Probability Theory. --- Data processing.
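The backward-induction algorithm named in the contents can be sketched on a tiny Bayesian two-armed Bernoulli bandit, the standard toy model of a response-adaptive trial (independent uniform priors and a ten-patient horizon are illustrative choices, not Ondra's design):

```python
from functools import lru_cache

HORIZON = 10  # number of patients (illustrative)

@lru_cache(maxsize=None)
def value(n, s1, f1, s2, f2):
    """Maximal expected number of treatment successes from patient n onward.

    The state holds per-arm (success, failure) counts; under Beta(1, 1)
    priors the posterior mean success probability is (s + 1) / (s + f + 2).
    """
    if n == HORIZON:
        return 0.0
    # Allocate patient n to arm 1 ...
    p1 = (s1 + 1) / (s1 + f1 + 2)
    v1 = p1 * (1.0 + value(n + 1, s1 + 1, f1, s2, f2)) \
        + (1 - p1) * value(n + 1, s1, f1 + 1, s2, f2)
    # ... or to arm 2, and keep the better expected outcome.
    p2 = (s2 + 1) / (s2 + f2 + 2)
    v2 = p2 * (1.0 + value(n + 1, s1, f1, s2 + 1, f2)) \
        + (1 - p2) * value(n + 1, s1, f1, s2, f2 + 1)
    return max(v1, v2)

optimal_value = value(0, 0, 0, 0, 0)
```

Here value(0, 0, 0, 0, 0) is the expected number of successes under the optimal allocation rule; comparing it with the 0.5 × HORIZON achieved by fixing one arm in advance quantifies the benefit of adapting.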
An extension problem (often called a boundary problem) of Markov processes has been studied, particularly in the case of one-dimensional diffusion processes, by W. Feller, K. Itô, and H. P. McKean, among others. In this book, Itô discussed the case of a general Markov process with state space S and a specified point a ∈ S called a boundary. The problem is to obtain all possible recurrent extensions of a given minimal process (i.e., the process on S \ {a} which is absorbed on reaching the boundary a). The study in this lecture is restricted to the simpler case of the boundary a being a discontinuous entrance point, leaving the more general case of a continuous entrance point to future works. He established a one-to-one correspondence between a recurrent extension and a pair consisting of a positive measure k(db) on S \ {a} (called the jumping-in measure) and a non-negative number m (called the stagnancy rate). The necessary and sufficient conditions for a pair k, m were obtained so that the correspondence is precisely described. For this, Itô used, as a fundamental tool, the notion of Poisson point processes formed of all excursions of the process on S \ {a}. This theory of Itô's of Poisson point processes of excursions is indeed a breakthrough. It has been expanded and applied to more general extension problems by many succeeding researchers. Thus we may say that this lecture note by Itô is really a memorial work in the extension problems of Markov processes. Especially in Chapter 1 of this note, a general theory of Poisson point processes is given that reminds us of Itô's beautiful and impressive lectures in his day.
Mathematical Statistics --- Mathematics --- Physical Sciences & Mathematics --- Probabilities. --- Mathematical statistics. --- Poisson processes. --- Markov processes. --- Probabilities --- Mathematical statistics --- Poisson processes --- Markov processes --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Processes, Poisson --- Mathematics. --- Functional analysis. --- Measure theory. --- Probability Theory and Stochastic Processes. --- Measure and Integration. --- Functional Analysis. --- Stochastic processes --- Point processes --- Distribution (Probability theory). --- Functional calculus --- Calculus of variations --- Functional equations --- Integral equations --- Math --- Science --- Distribution functions --- Frequency distribution --- Characteristic functions --- Lebesgue measure --- Measurable sets --- Measure of a set --- Algebraic topology --- Integrals, Generalized --- Measure algebras --- Rings (Algebra) --- Probability --- Statistical inference --- Combinations --- Chance --- Least squares --- Risk
This book provides a comprehensive overview of recent advances in automatic speech recognition, with a focus on deep learning models, including deep neural networks and many of their variants. It is the first automatic speech recognition book dedicated to the deep learning approach. In addition to a rigorous mathematical treatment of the subject, the book presents insights into, and the theoretical foundations of, a series of highly successful deep learning models.
Acoustics in engineering. --- Social sciences --- Signal, Image and Speech Processing. --- Engineering Acoustics. --- Computer Appl. in Social and Behavioral Sciences. --- Data processing. --- Automatic speech recognition. --- Markov processes --- Mathematical models. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Mechanical speech recognizer --- Speech recognition, Automatic --- Pattern recognition systems --- Perceptrons --- Speech, Intelligibility of --- Speech perception --- Speech processing systems --- Signal processing. --- Image processing. --- Speech processing systems. --- Acoustical engineering. --- Application software. --- Application computer programs --- Application computer software --- Applications software --- Apps (Computer software) --- Computer software --- Acoustic engineering --- Sonic engineering --- Sonics --- Sound engineering --- Sound-waves --- Engineering --- Computational linguistics --- Electronic systems --- Information theory --- Modulation theory --- Oral communication --- Speech --- Telecommunication --- Singing voice synthesizers --- Pictorial data processing --- Picture processing --- Processing, Image --- Imaging systems --- Optical data processing --- Processing, Signal --- Information measurement --- Signal theory (Telecommunication) --- Industrial applications