Listing 1 - 10 of 11
Stochastic processes --- Markov processes --- 519.233
Markov chains make it possible to predict the future state of a system from its present state, ignoring its past history. Surprisingly, despite the widespread use of Markov chains in many areas of science and technology, their applications in chemical engineering have been relatively meager. A possible reason for this is that books containing material on the subject have been written in such a way that the simplicity of Markov chains is overshadowed by tedious mathematical derivations. Thus, the major objective in writing this book has been to try to change this situation.
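As a minimal illustration of the Markov property the blurb alludes to (a generic sketch, not material from the book; the three states and the transition matrix are invented for illustration), the distribution over states one step ahead depends only on the current distribution and the transition matrix:

```python
import numpy as np

# Hypothetical 3-state system; each row of the transition matrix sums to 1.
states = ["low", "normal", "high"]
P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.7, 0.2],
              [0.0, 0.3, 0.7]])

p_now = np.array([1.0, 0.0, 0.0])   # currently in "low" with certainty
p_next = p_now @ P                  # next-step distribution: uses only the present state, not the history
print(dict(zip(states, p_next.round(3))))
```

Iterating `p_next @ P` propagates the prediction further into the future in exactly the same way.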
Chemical engineering --- Markov processes --- Mathematics --- Chemical engineering - Mathematics
Safety-critical and high-integrity systems, such as industrial plants and economic systems, can be subject to abrupt changes, for instance due to component or interconnection failure or sudden environmental changes. Combining probability and operator theory, Discrete-Time Markov Jump Linear Systems provides a unified and rigorous treatment of recent results in the control theory of discrete-time jump linear systems, which are used in these areas of application. The book is designed for experts in linear systems with Markov jump parameters, but it is also of interest to specialists in stochastic control, since it presents stochastic control problems for which an explicit solution is possible, making the book suitable for course use. Oswaldo Luiz do Valle Costa is Professor in the Department of Telecommunications and Control Engineering at the University of São Paulo; Marcelo Dutra Fragoso is Professor in the Department of Systems and Control at the National Laboratory for Scientific Computing (LNCC/MCT), Rio de Janeiro; and Ricardo Paulino Marques works in the Department of Telecommunications and Control Engineering at the University of São Paulo.
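A minimal sketch of what "Markov jump linear system" refers to (generic background, not code or notation from the book; the two modes and all matrices are hypothetical): the state evolves linearly, while the system matrix jumps between modes according to a Markov chain, e.g. modelling a component failure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-mode discrete-time jump linear system: x_{k+1} = A[mode_k] @ x_k.
A = [np.array([[0.9, 0.1], [0.0, 0.8]]),   # mode 0: nominal dynamics
     np.array([[0.5, 0.0], [0.2, 0.3]])]   # mode 1: degraded dynamics after a failure
P = np.array([[0.95, 0.05],                # Markov chain governing abrupt mode changes
              [0.10, 0.90]])

x, mode = np.array([1.0, 1.0]), 0
for _ in range(20):
    x = A[mode] @ x                        # linear update with the current mode's matrix
    mode = rng.choice(2, p=P[mode])        # abrupt, random change of operating mode
print(x, mode)
```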
Stochastic control theory --- Stochastic systems --- Linear systems --- Control theory --- Markov processes
To some extent, it would be accurate to summarize the contents of this book as an intolerably protracted description of what happens when either one raises a transition probability matrix P (i.e., all entries (P)ij are non-negative and each row of P sums to 1) to higher and higher powers, or one exponentiates R(P - I), where R is a diagonal matrix with non-negative entries. Indeed, when it comes right down to it, that is all that is done in this book. However, I, and others of my ilk, would take offense at such a dismissive characterization of the theory of Markov chains and processes with values in a countable state space, and a primary goal of mine in writing this book was to convince its readers that our offense would be warranted. The reason why I, and others of my persuasion, refuse to consider the theory here as no more than a subset of matrix theory is that to do so is to ignore the pervasive role that probability plays throughout. Namely, probability theory provides a model which both motivates and provides a context for what we are doing with these matrices. To wit, even the term "transition probability matrix" lends meaning to an otherwise rather peculiar set of hypotheses to make about a matrix.
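The two operations the author singles out are easy to make concrete (a small numerical sketch; the matrices below are invented, not taken from the book):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical transition probability matrix: non-negative entries, rows summing to 1.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
R = np.diag([1.0, 2.0, 1.5])             # diagonal matrix with non-negative entries

P_high = np.linalg.matrix_power(P, 50)   # "higher and higher powers" of P
Q_exp  = expm(R @ (P - np.eye(3)))       # exponentiating R(P - I)
print(P_high.round(4))                   # rows settle down to the stationary distribution
print(Q_exp.round(4))                    # again a stochastic matrix, since R(P - I) is a generator
```

The first operation is the discrete-time chain itself; the second is the usual way of building a continuous-time chain from jump rates R and a routing matrix P.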
Markov processes --- Stochastic processes --- Probabilities --- Distribution (Probability theory) --- Probability Theory and Stochastic Processes
Probability theory arose originally in connection with games of chance, and then for a long time it was used primarily to investigate the credibility of testimony of witnesses in the “ethical” sciences. Nevertheless, probability has become a very powerful mathematical tool in understanding those aspects of the world that cannot be described by deterministic laws. Probability has succeeded in finding strict determinate relationships where chance seemed to reign, and so terming them “laws of chance”, combining such contrasting notions in the nomenclature, appears to be quite justified. This introductory chapter discusses such notions as determinism, chaos and randomness, predictability and unpredictability, and some initial approaches to formalizing randomness, and it surveys certain problems that can be solved by probability theory. This will perhaps give one an idea of the extent to which the theory can answer questions arising in specific random occurrences and of the character of the answers provided by the theory. 1.1 The Nature of Randomness. The phrase “by chance” has no single meaning in ordinary language. For instance, it may mean unpremeditated, nonobligatory, unexpected, and so on. Its opposite sense is simpler: “not by chance” signifies obliged to or bound to (happen). In philosophy, necessity counteracts randomness. Necessity signifies conforming to law – it can be expressed by an exact law. The basic laws of mechanics, physics and astronomy can be formulated in terms of precise quantitative relations which must hold with ironclad necessity.
Markov processes --- Probabilities --- Statistics --- Mathematics --- Physical Sciences & Mathematics --- Mathematical Statistics --- Probability Theory and Stochastic Processes
This volume concentrates on how to construct a Markov process by starting with a suitable pseudo-differential operator. Feller processes, Hunt processes associated with Lp-sub-Markovian semigroups, and processes constructed by using the martingale problem are at the center of the considerations. The potential theory of these processes is further developed and applications are discussed. Due to the non-locality of the generators, the processes are jump processes, and their relations to Lévy processes are investigated. Special emphasis is given to the symbol of a process, a notion which generalizes the characteristic exponent of a Lévy process.
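For orientation, and as standard background rather than a quotation from this volume (sign and normalization conventions vary between authors): on suitable test functions the generator A of a Feller process can be written as a pseudo-differential operator with a state-dependent symbol q(x, ξ),

$$ A f(x) = -(2\pi)^{-d/2}\int_{\mathbb{R}^d} e^{\,i x\cdot \xi}\, q(x,\xi)\, \hat f(\xi)\, d\xi, \qquad \hat f(\xi) = (2\pi)^{-d/2}\int_{\mathbb{R}^d} e^{-i x\cdot \xi} f(x)\, dx, $$

and for a Lévy process the symbol does not depend on x: q(x, ξ) = ψ(ξ), the characteristic exponent with E[e^{i ξ·X_t}] = e^{-t ψ(ξ)}.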
Markov processes --- Pseudodifferential operators --- Potential theory (Mathematics) --- Harmonic analysis. Fourier analysis
In this book, functional inequalities are introduced to describe: (i) the spectrum of the generator: the essential and discrete spectra, high-order eigenvalues, the principal eigenvalue, and the spectral gap; (ii) the semigroup properties: uniform integrability, compactness, the convergence rate, and the existence of a density; (iii) the reference measure and the intrinsic metric: concentration, the isoperimetric inequality, and the transportation-cost inequality.
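One standard instance of such a functional inequality (general background, not a statement of this book's results): for a symmetric Dirichlet form (𝓔, 𝒟(𝓔)) with reference probability measure μ, the Poincaré inequality

$$ \mu(f^2) - \mu(f)^2 \le C\,\mathcal{E}(f,f), \qquad f \in \mathscr{D}(\mathcal{E}), $$

holds for some constant C > 0 if and only if the generator has a spectral gap of at least 1/C; stronger inequalities of the same family (log-Sobolev, super- and weak Poincaré inequalities) are the kind of tool used to control the finer spectral and semigroup properties listed above.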
Ordered algebraic structures --- Stochastic processes --- Inequalities (Mathematics) --- Semigroups of operators --- Dirichlet forms --- Markov processes --- Spectral theory (Mathematics)
Markov Chain Monte Carlo (MCMC) originated in statistical physics, but has spilled over into various application areas, leading to a corresponding variety of techniques and methods. That variety stimulates new ideas and developments from many different places, and there is much to be gained from cross-fertilization. This book presents five expository essays by leaders in the field, drawing from perspectives in physics, statistics and genetics, and showing how different aspects of MCMC come to the fore in different contexts. The essays derive from tutorial lectures at an interdisciplinary program.
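For readers meeting the acronym for the first time, a minimal random-walk Metropolis sketch (generic textbook material, not an algorithm from these essays; the standard normal target is chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    return -0.5 * x * x          # log-density of the target, here a standard normal (up to a constant)

x, samples = 0.0, []
for _ in range(10_000):
    proposal = x + rng.normal(scale=1.0)                          # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal                                              # accept with the Metropolis probability
    samples.append(x)                                             # the chain's values approximate the target

print(np.mean(samples), np.std(samples))                          # should be close to 0 and 1
```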
Monte Carlo method --- Bayesian statistical decision theory --- Markov processes
Hidden Markov models have become a widely used class of statistical models, with applications in diverse areas such as communications engineering, bioinformatics, finance and many more. This book is a comprehensive treatment of inference for hidden Markov models, including both algorithms and statistical theory. Topics range from filtering and smoothing of the hidden Markov chain to parameter estimation, Bayesian methods and estimation of the number of states. In a unified way the book covers both models with finite state spaces, which allow for exact algorithms for filtering, estimation, etc., and models with continuous state spaces (also called state-space models), which require approximate simulation-based algorithms that are also described in detail. Simulation in hidden Markov models is addressed in five different chapters that cover both Markov chain Monte Carlo and sequential Monte Carlo approaches. Many examples illustrate the algorithms and theory. The book also carefully treats Gaussian linear state-space models and their extensions, and it contains a chapter on general Markov chain theory and probabilistic aspects of hidden Markov models. This volume will suit anybody with an interest in inference for stochastic processes, and it will be useful for researchers and practitioners in areas such as statistics, signal processing, communications engineering, control theory, econometrics, finance and more. The algorithmic parts of the book do not require an advanced mathematical background, while the more theoretical parts require knowledge of probability theory at the measure-theoretic level. Olivier Cappé is a researcher with the French National Center for Scientific Research (CNRS). He received the Ph.D. degree in 1993 from the Ecole Nationale Supérieure des Télécommunications, Paris, France, where he is currently a Research Associate. Most of his current research concerns computational statistics and statistical learning. Eric Moulines is Professor at the Ecole Nationale Supérieure des Télécommunications (ENST), Paris, France. He graduated from the Ecole Polytechnique, France, in 1984 and received the Ph.D. degree from ENST in 1990. He has authored more than 150 papers in applied probability, mathematical statistics and signal processing. Tobias Rydén is Professor of Mathematical Statistics at Lund University, Sweden, where he also received his Ph.D. in 1993. His publications include papers ranging from statistical theory to algorithmic developments for hidden Markov models.
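As a small illustration of the filtering problem mentioned above, here is the generic normalized forward recursion for a finite-state hidden Markov model (a sketch under invented parameters, not code from the book):

```python
import numpy as np

# Hypothetical 2-state HMM with Gaussian emissions.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])              # transition matrix of the hidden chain
means, sd = np.array([0.0, 3.0]), 1.0   # emission mean per state, common standard deviation

def filter_states(observations, p0=(0.5, 0.5)):
    """Return P(state_k | obs_1..k) for each k via the normalized forward recursion."""
    p, history = np.asarray(p0, dtype=float), []
    for y in observations:
        likelihood = np.exp(-0.5 * ((y - means) / sd) ** 2)   # emission likelihoods (up to a constant)
        p = (p @ P) * likelihood                              # predict with the chain, then correct with the data
        p /= p.sum()                                          # normalize to keep the recursion stable
        history.append(p.copy())
    return np.array(history)

print(filter_states([0.1, 2.9, 3.2, -0.4]).round(3))
```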
Markov processes --- Stochastic processes --- Probabilities --- Mathematical statistics --- Computer simulation --- Signal processing --- Probability Theory and Stochastic Processes --- Statistical Theory and Methods --- Signal, Image and Speech Processing --- Simulation and Modeling
Focusing on discrete-time Markov chains, the contents of this book are an outgrowth of some of the authors' recent research. The motivation stems from existing and emerging applications in optimization and control of complex hybrid Markovian systems in manufacturing, wireless communication, and financial engineering. Much effort in this book is devoted to designing system models arising from these applications, analyzing them via analytic and probabilistic techniques, and developing feasible computational algorithms so as to reduce the inherent complexity. The book presents results including asymptotic expansions of probability vectors, structural properties of occupation measures, exponential bounds, aggregation and decomposition with the associated limit processes, and the interface of discrete-time and continuous-time systems. One of the salient features is that the book contains a diverse range of applications in filtering, estimation, control, optimization, Markov decision processes, and financial engineering. It will be an important reference for researchers in applied probability, control theory and operations research, as well as for practitioners who use optimization techniques. Part of the book can also be used in a graduate course on applied probability, stochastic processes, and applications.
Markov processes --- Stochastic processes --- Probabilities --- Operations research --- Control engineering --- Robotics --- Mechatronics --- Decision making --- Applied mathematics --- Probability Theory and Stochastic Processes --- Control, Robotics, Mechatronics --- Operations Research/Decision Theory --- Applications of Mathematics