Listing 1 - 10 of 149
This book concerns continuous-time controlled Markov chains, also known as continuous-time Markov decision processes. They form a class of stochastic control problems in which a single decision-maker wishes to optimize a given objective function. The book is also concerned with Markov games, where two decision-makers (or players) each try to optimize their own objective function. Both decision-making processes appear in a large number of applications in economics, operations research, engineering, and computer science, among other areas. An extensive, self-contained, up-to-date analysis of basic optimality criteria is presented.
Markov processes. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
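To make the controlled-chain setting in the entry above concrete, the following minimal sketch simulates a two-state continuous-time Markov chain under a fixed stationary policy and estimates its expected discounted reward by Monte Carlo. The generator, rewards, and discount rate are illustrative assumptions, not taken from the book.

```python
# Minimal sketch: simulate a continuous-time Markov chain under a fixed policy
# and estimate its expected discounted reward. All model data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

Q = {0: {1: 2.0}, 1: {0: 1.0}}       # off-diagonal rates of a hypothetical generator
reward = {0: 1.0, 1: 0.0}            # running reward per unit time in each state
alpha = 0.1                          # discount rate

def discounted_reward(horizon=200.0):
    """Discounted reward collected along one simulated trajectory."""
    t, state, total = 0.0, 0, 0.0
    while t < horizon:
        rate = sum(Q[state].values())            # total exit rate from the current state
        hold = min(rng.exponential(1.0 / rate), horizon - t)
        # integral of e^{-alpha s} * reward over the holding interval
        total += reward[state] * (np.exp(-alpha * t) - np.exp(-alpha * (t + hold))) / alpha
        t += hold
        if t >= horizon:
            break
        # jump to the next state with probability proportional to its rate
        targets, rates = zip(*Q[state].items())
        state = rng.choice(targets, p=np.array(rates) / rate)
    return total

estimate = np.mean([discounted_reward() for _ in range(2000)])
print(f"estimated discounted reward: {estimate:.3f}")
```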
This SpringerBrief addresses the challenges encountered in the study of the optimization of time-nonhomogeneous Markov chains. It develops new insights and new methodologies for systems in which concepts such as stationarity, ergodicity, periodicity and connectivity do not apply. The brief introduces the novel concept of confluencity and applies a relative optimization approach, developing a comprehensive theory for optimization of the long-run average of time-nonhomogeneous Markov chains. The book shows that confluencity is the most fundamental concept in this optimization problem, and that relative optimization is more suitable than the standard ideas of dynamic programming for treating the systems under consideration. Using confluencity and relative optimization, the author classifies states as confluent or branching and shows how the under-selectivity issue of the long-run average can easily be addressed, multi-class optimization implemented, and Nth biases and Blackwell optimality conditions derived. These results appear in book form for the first time and may therefore enhance the understanding of optimization and motivate new research ideas in the area.
Markov processes. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
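As a point of reference for the long-run average criterion that the brief generalises, here is a minimal sketch of the classical time-homogeneous case, where the average reward is simply the stationary distribution applied to the reward vector. The transition matrix and rewards are illustrative assumptions; the brief's point is that stationarity and ergodicity may fail for time-nonhomogeneous chains, so this baseline computation no longer applies there.

```python
# Minimal sketch: long-run average reward of a finite time-homogeneous chain.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])          # hypothetical transition matrix
r = np.array([1.0, 4.0])            # hypothetical per-step rewards

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()

print("stationary distribution:", pi)          # ~ [0.833, 0.167]
print("long-run average reward:", pi @ r)      # ~ 1.5
```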
Featuring previously unpublished results, Semi-Markov Models: Control of Restorable Systems with Latent Failures describes valuable methodology which readers can use to build mathematical models of a wide class of systems for various applications. In particular, this information can be applied to build models of reliability, queuing systems, and technical control. Beginning with a brief introduction to the area, the book covers semi-Markov models for different control strategies in one-component systems, defining their stationary characteristics of reliability and efficiency.
System analysis --- Markov processes. --- Mathematical models. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
Semi-Markov Processes: Applications in System Reliability and Maintenance is a modern view of discrete-state-space, continuous-time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. It is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory.
Reliability (Engineering) --- Markov processes. --- Statistical methods. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Mathematical statistics
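A minimal sketch of the kind of semi-Markov reliability model described in the entry above: a unit alternates between "up" and "down" with a non-exponential repair time, and its long-run availability is estimated by simulation and checked against the renewal-reward formula E[up] / (E[up] + E[down]). The distributions and parameters are illustrative assumptions.

```python
# Minimal sketch: availability of a two-state semi-Markov (alternating renewal) system.
from math import gamma
import numpy as np

rng = np.random.default_rng(1)
mean_up = 100.0                      # exponential time to failure
shape, scale = 2.0, 5.0              # Weibull repair-time parameters

def simulate(horizon=1e6):
    t, up_time = 0.0, 0.0
    while t < horizon:
        up = rng.exponential(mean_up)           # sojourn in "up"
        up_time += min(up, horizon - t)
        t += up
        if t >= horizon:
            break
        t += scale * rng.weibull(shape)         # sojourn in "down"
    return up_time / horizon

mean_down = scale * gamma(1.0 + 1.0 / shape)    # Weibull mean repair time
print("simulated availability  :", simulate())
print("theoretical availability:", mean_up / (mean_up + mean_down))
```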
Meyn and Tweedie is back! The bible on Markov chains in general state spaces has been brought up to date to reflect developments in the field since 1996 - many of them sparked by publication of the first edition. The pursuit of more efficient simulation algorithms for complex Markovian models, or algorithms for computation of optimal policies for controlled Markov models, has opened new directions for research on Markov chains. As a result, new applications have emerged across a wide range of topics including optimisation, statistics, and economics. New commentary and an epilogue by Sean Meyn summarise recent developments, and references have been fully updated. This second edition reflects the same discipline and style that marked out the original and helped it to become a classic: proofs are rigorous and concise, the range of applications is broad and knowledgeable, and key ideas are accessible to practitioners with limited mathematical background.
Stochastic processes --- Markov processes --- Markov processes. --- Mathematics. --- Math --- Science --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov
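The blurb above mentions algorithms for computing optimal policies of controlled Markov models; the sketch below runs value iteration on a small discounted Markov decision process. The two-state, two-action model and the discount factor are illustrative assumptions, not taken from the book.

```python
# Minimal sketch: value iteration for a tiny discounted MDP.
import numpy as np

# P[a] is the transition matrix under action a; R[s, a] is the one-step reward.
P = np.array([[[0.8, 0.2],     # action 0
               [0.3, 0.7]],
              [[0.5, 0.5],     # action 1
               [0.1, 0.9]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
gamma = 0.95

V = np.zeros(2)
for _ in range(1000):
    # Q[s, a] = R[s, a] + gamma * sum_s' P[a][s, s'] * V[s']
    Q = R + gamma * np.einsum("ast,t->sa", P, V)
    V_new = Q.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-10:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)
print("optimal values:", V)
print("optimal policy:", policy)
```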
This is the revised and augmented edition of a now classic book which is an introduction to sub-Markovian kernels on general measurable spaces and their associated homogeneous Markov chains. The first part, an expository text on the foundations of the subject, is intended for post-graduate students. A study of potential theory, the basic classification of chains according to their asymptotic behaviour and the celebrated Chacon-Ornstein theorem are examined in detail. The second part of the book is at a more advanced level and includes a treatment of random walks on general locally compact groups.
Markov processes. --- Stochastic processes. --- Random processes --- Probabilities --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
The theory of Markov Processes has become a powerful tool in partial differential equations and potential theory with important applications to physics. Professor Dynkin has made many profound contributions to the subject and in this volume are collected several of his most important expository and survey articles. The content of these articles has not been covered in any monograph as yet. This account is accessible to graduate students in mathematics and operations research and will be welcomed by all those interested in stochastic processes and their applications.
Markov processes. --- Stochastic analysis. --- Analysis, Stochastic --- Mathematical analysis --- Stochastic processes --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov
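One concrete instance of the link between Markov processes and potential theory mentioned in the entry above is that hitting probabilities are harmonic functions. The sketch below estimates, by Monte Carlo, the probability that a simple random walk started at k reaches N before 0; this function satisfies the discrete Laplace equation u(k) = (u(k-1) + u(k+1)) / 2 with boundary values u(0) = 0 and u(N) = 1, and equals k/N. The walk length and starting point are illustrative assumptions.

```python
# Minimal sketch: hitting probability of a simple random walk (gambler's ruin),
# a discrete harmonic function, estimated by Monte Carlo.
import numpy as np

rng = np.random.default_rng(2)
N, start = 10, 3

def hits_top_first(k):
    """Run the walk from k until it reaches 0 or N; report whether N came first."""
    while 0 < k < N:
        k += rng.choice((-1, 1))
    return k == N

trials = 20000
estimate = np.mean([hits_top_first(start) for _ in range(trials)])
print(f"estimated exit probability: {estimate:.3f}   exact: {start / N}")
```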
Based on a lecture course given at Chalmers University of Technology, this 2002 book is ideal for advanced undergraduate or beginning graduate students. The author first develops the necessary background in probability theory and Markov chains before applying it to study a range of randomized algorithms with important applications in optimization and other problems in computing. Amongst the algorithms covered are the Markov chain Monte Carlo method, simulated annealing, and the recent Propp-Wilson algorithm. This book will appeal not only to mathematicians, but also to students of statistics and computer science. The subject matter is introduced in a clear and concise fashion and the numerous exercises included will help students to deepen their understanding.
Markov processes. --- Algorithms. --- Algorism --- Algebra --- Arithmetic --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Foundations
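As a taste of the Markov chain Monte Carlo method covered in the book above, the following sketch runs a random-walk Metropolis sampler whose stationary distribution is a standard normal target. The target, proposal scale, run length, and burn-in are illustrative assumptions.

```python
# Minimal sketch: random-walk Metropolis sampler targeting N(0, 1).
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    return -0.5 * x * x               # unnormalised log-density of N(0, 1)

x, samples = 0.0, []
for _ in range(50000):
    proposal = x + rng.normal(scale=1.0)
    # accept with probability min(1, target(proposal) / target(x))
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)

samples = np.array(samples[5000:])    # drop burn-in
print("sample mean:", samples.mean())       # ~ 0
print("sample var :", samples.var())        # ~ 1
```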
"Stochastic Analysis and Diffusion Processes presents a simple, mathematical introduction to Stochastic Calculus and its applications. The book builds the basic theory and offers a careful account of important research directions in Stochastic Analysis. The breadth and power of Stochastic Analysis, and probabilistic behavior of diffusion processes are told without compromising on the mathematical details. Starting with the construction of stochastic processes, the book introduces Brownian motion and martingales. The book proceeds to construct stochastic integrals, establish the Itô formula, and discuss its applications. Next, attention is focused on stochastic differential equations (SDEs) which arise in modeling physical phenomena, perturbed by random forces. Diffusion processes are solutions of SDEs and form the main theme of this book. The Stroock-Varadhan martingale problem, the connection between diffusion processes and partial differential equations, Gaussian solutions of SDEs, and Markov processes with jumps are presented in successive chapters. The book culminates with a careful treatment of important research topics such as invariant measures, ergodic behavior, and large deviation principle for diffusions. Examples are given throughout the book to illustrate concepts and results. In addition, exercises are given at the end of each chapter that will help the reader to understand the concepts better. The book is written for graduate students, young researchers and applied scientists who are interested in stochastic processes and their applications. The reader is assumed to be familiar with probability theory at graduate level. The book can be used as a text for a graduate course on Stochastic Analysis." -- Provided by publisher.
Stochastic analysis. --- Markov processes. --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Analysis, Stochastic --- Mathematical analysis --- Diffusion processes.
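To illustrate the diffusion-process theme of the book above, here is a minimal Euler-Maruyama sketch (a standard discretisation, not necessarily the book's own treatment) for the Ornstein-Uhlenbeck SDE dX_t = -theta X_t dt + sigma dW_t. The parameters, step size, and horizon are illustrative assumptions, and the stationary variance sigma^2 / (2 theta) serves as a sanity check.

```python
# Minimal sketch: Euler-Maruyama simulation of an Ornstein-Uhlenbeck diffusion.
import numpy as np

rng = np.random.default_rng(4)
theta, sigma = 1.0, 0.5
dt, T, n_paths = 0.01, 10.0, 5000
steps = int(T / dt)

X = np.zeros(n_paths)                         # all paths start at 0
for _ in range(steps):
    dW = rng.normal(scale=np.sqrt(dt), size=n_paths)
    X = X + (-theta * X) * dt + sigma * dW    # Euler-Maruyama update

print("terminal variance :", X.var())                 # ~ sigma^2 / (2 theta)
print("stationary variance:", sigma**2 / (2 * theta))
```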
The general topic of this book is the ergodic behavior of Markov processes. A detailed introduction to methods for proving ergodicity and upper bounds for ergodic rates is presented in the first part of the book, with the focus put on weak ergodic rates, typical for Markov systems with complicated structure. The second part is devoted to the application of these methods to limit theorems for functionals of Markov processes. The book is aimed at a wide audience with a background in probability and measure theory. Some knowledge of stochastic processes and stochastic differential equations helps in a deeper understanding of specific examples. Contents: Part I: Ergodic Rates for Markov Chains and Processes (Markov Chains with Discrete State Spaces; General Markov Chains: Ergodicity in Total Variation; Markov Processes with Continuous Time; Weak Ergodic Rates). Part II: Limit Theorems (The Law of Large Numbers and the Central Limit Theorem; Functional Limit Theorems).
Markov processes --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Markov processes. --- ergodic rates. --- ergodicity. --- limit theorems.
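To show what an ergodic rate in total variation looks like in the simplest possible setting, the sketch below computes ||mu P^n - pi||_TV for a two-state, time-homogeneous chain and exhibits its geometric decay. The transition matrix and initial distribution are illustrative assumptions; the book's interest is precisely in chains with complicated structure where such explicit computation is unavailable.

```python
# Minimal sketch: geometric decay of the total variation distance to stationarity.
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])                    # hypothetical transition matrix
mu = np.array([1.0, 0.0])                     # start in state 0

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()

dist = mu.copy()
for n in range(1, 11):
    dist = dist @ P
    tv = 0.5 * np.abs(dist - pi).sum()        # total variation distance
    print(f"n = {n:2d}   ||mu P^n - pi||_TV = {tv:.2e}")
```

Here the second eigenvalue of P is 0.3, so the printed distances shrink by roughly that factor at each step, which is the geometric ergodic rate for this toy chain.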