Listing 1 - 10 of 11
Markov processes: characterization and convergence
Authors: ---
ISBN: 047176986X 9780471769866 Year: 2005 Publisher: New York : Wiley


Applications of Markov chains in chemical engineering
Author:
ISBN: 0444823565 9786611029012 1281029017 0080527396 9780444823564 9780080527390 9781281029010 Year: 2005 Publisher: Amsterdam: Elsevier,


Abstract

Markov chains make it possible to predict the future state of a system from its present state, ignoring its past history. Surprisingly, despite the widespread use of Markov chains in many areas of science and technology, their applications in chemical engineering have been relatively meager. A possible reason for this might be that books containing material on the subject have been written in such a way that the simplicity of Markov chains has been overshadowed by tedious mathematical derivations. Thus, the major objective of writing this book has been to try to change this situation.
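The memoryless prediction described in the abstract can be sketched in a few lines. The two-state chain below is a hypothetical example (states and probabilities invented for illustration, not taken from the book):

```python
import numpy as np

# Toy 2-state chain (say, a process being "on spec" / "off spec");
# the transition probabilities are illustrative only.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def step(dist, P, n=1):
    """Propagate a state distribution n steps ahead. Only the current
    distribution enters each update -- the Markov property: the past
    history is ignored."""
    for _ in range(n):
        dist = dist @ P
    return dist

p0 = np.array([1.0, 0.0])   # start surely in state 0
p1 = step(p0, P)            # one-step prediction: [0.9, 0.1]
```

Iterating `step` many times drives any starting distribution to the chain's stationary distribution, which for this matrix is [5/6, 1/6].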

Discrete-time Markov jump linear systems
Authors: --- ---
ISBN: 128031219X 9786610312191 1846280826 1852337613 Year: 2005 Publisher: London : Springer,


Abstract

Safety critical and high-integrity systems, such as industrial plants and economic systems, can be subject to abrupt changes - for instance, due to component or interconnection failure, sudden environment changes, etc. Combining probability and operator theory, Discrete-Time Markov Jump Linear Systems provides a unified and rigorous treatment of recent results for the control theory of discrete jump linear systems, which are used in these areas of application. The book is designed for experts in linear systems with Markov jump parameters, but is also of interest for specialists in stochastic control since it presents stochastic control problems for which an explicit solution is possible - making the book suitable for course use. Oswaldo Luiz do Valle Costa is Professor in the Department of Telecommunications and Control Engineering at the University of São Paulo, Marcelo Dutra Fragoso is Professor in the Department of Systems and Control at the National Laboratory for Scientific Computing - LNCC/MCT, Rio de Janeiro, and Ricardo Paulino Marques works in the Department of Telecommunications and Control Engineering at the University of São Paulo.

An introduction to Markov processes
Author:
ISBN: 3540269908 3540234993 3540234519 9787510004483 9783540234999 7510004489 9783540234517 Year: 2005 Publisher: Berlin: Springer,


Abstract

To some extent, it would be accurate to summarize the contents of this book as an intolerably protracted description of what happens when either one raises a transition probability matrix P (i.e., all entries (P)ij are non-negative and each row of P sums to 1) to higher and higher powers or one exponentiates R(P - I), where R is a diagonal matrix with non-negative entries. Indeed, when it comes right down to it, that is all that is done in this book. However, I, and others of my ilk, would take offense at such a dismissive characterization of the theory of Markov chains and processes with values in a countable state space, and a primary goal of mine in writing this book was to convince its readers that our offense would be warranted. The reason why I, and others of my persuasion, refuse to consider the theory here as no more than a subset of matrix theory is that to do so is to ignore the pervasive role that probability plays throughout. Namely, probability theory provides a model which both motivates and provides a context for what we are doing with these matrices. To wit, even the term "transition probability matrix" lends meaning to an otherwise rather peculiar set of hypotheses to make about a matrix.
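The two operations the abstract names can both be demonstrated directly. The matrix below is invented for illustration; the truncated Taylor series stands in for a proper matrix exponential (in practice one would use scipy.linalg.expm):

```python
import numpy as np

# A small transition probability matrix: entries non-negative, rows sum
# to 1. The numbers are illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# Raising P to higher and higher powers: for this chain every row of
# P^n approaches the same stationary distribution.
Pn = np.linalg.matrix_power(P, 100)

def expm_taylor(A, terms=40):
    """Truncated Taylor series for the matrix exponential -- enough for
    a sketch with a small, well-behaved matrix."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

# Exponentiating R(P - I) with diagonal non-negative R gives the
# time-1 transition matrix of a continuous-time chain: the generator
# Q = R(P - I) has rows summing to 0, so exp(Q) has rows summing to 1.
R = np.diag([1.0, 2.0, 0.5])
Pt = expm_taylor(R @ (P - np.eye(3)))
```

Both outputs are again stochastic matrices, which is exactly the probabilistic content the author argues the matrix-theoretic view misses.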

Basic principles and applications of probability theory
Authors: ---
ISBN: 3642081215 3540546863 9786610235100 1280235101 3540263128 Year: 2005 Publisher: Berlin ; London : Springer,


Abstract

Probability theory arose originally in connection with games of chance and then for a long time it was used primarily to investigate the credibility of testimony of witnesses in the "ethical" sciences. Nevertheless, probability has become a very powerful mathematical tool in understanding those aspects of the world that cannot be described by deterministic laws. Probability has succeeded in finding strict determinate relationships where chance seemed to reign, and so terming them "laws of chance" combining such contrasting notions in the nomenclature appears to be quite justified. This introductory chapter discusses such notions as determinism, chaos and randomness, predictability and unpredictability, some initial approaches to formalizing randomness, and it surveys certain problems that can be solved by probability theory. This will perhaps give one an idea to what extent the theory can answer questions arising in specific random occurrences and the character of the answers provided by the theory. 1.1 The Nature of Randomness. The phrase "by chance" has no single meaning in ordinary language. For instance, it may mean unpremeditated, nonobligatory, unexpected, and so on. Its opposite sense is simpler: "not by chance" signifies obliged to or bound to (happen). In philosophy, necessity counteracts randomness. Necessity signifies conforming to law – it can be expressed by an exact law. The basic laws of mechanics, physics and astronomy can be formulated in terms of precise quantitative relations which must hold with ironclad necessity.

Markov processes and applications
Author:
ISBN: 1281866911 9786611866914 1860947158 9781860947155 1860942938 9781860942938 1860943241 9781860943249 1860945686 9781860945687 9781281866912 Year: 2005 Publisher: London : Imperial College Press,


Abstract

This volume concentrates on how to construct a Markov process by starting with a suitable pseudo-differential operator. Feller processes, Hunt processes associated with Lp-sub-Markovian semigroups and processes constructed by using the martingale problem are at the center of the considerations. The potential theory of these processes is further developed and applications are discussed. Due to the non-locality of the generators, the processes are jump processes and their relations to Lévy processes are investigated. Special emphasis is given to the symbol of a process, a notion which generalizes…

Functional inequalities, Markov semigroups and spectral theory
Author:
ISBN: 7030144155 9780080532073 0080532071 9780080449425 0080449425 9787030144157 1281144681 9781281144683 9786611144685 6611144684 Year: 2005 Publisher: Beijing ; New York : Science press,


Abstract

In this book, functional inequalities are introduced to describe: (i) the spectrum of the generator: the essential and discrete spectra, high-order eigenvalues, the principal eigenvalue, and the spectral gap; (ii) the semigroup properties: uniform integrability, compactness, the convergence rate, and the existence of a density; (iii) the reference measure and the intrinsic metric: concentration, the isoperimetric inequality, and the transportation cost inequality.
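For a finite-state chain, one quantity from (i), the spectral gap, can be computed directly: it is the distance from the top eigenvalue 1 of the transition matrix to the next eigenvalue, and it controls the exponential convergence rate of the semigroup. A minimal sketch with an invented reversible (tridiagonal, birth-death) matrix:

```python
import numpy as np

# Birth-death (tridiagonal) transition matrix -- reversible, so its
# eigenvalues are real. Entries are illustrative only.
P = np.array([[0.6, 0.4, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.6, 0.4]])

eigs = np.sort(np.linalg.eigvals(P).real)[::-1]
gap = 1.0 - eigs[1]   # eigs[0] == 1 for any stochastic matrix
```

The functional-inequality machinery of the book extends this idea to generators on infinite-dimensional spaces, where eigenvalues can no longer simply be listed.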

Markov chain Monte Carlo : innovations and applications
Authors: --- ---
ISBN: 9789812564276 9812564276 9812700919 9789812700919 1281881139 9781281881137 9786611881139 6611881131 Year: 2005 Publisher: Singapore ; Hackensack, NJ : World Scientific,


Abstract

Markov Chain Monte Carlo (MCMC) originated in statistical physics, but has spilled over into various application areas, leading to a corresponding variety of techniques and methods. That variety stimulates new ideas and developments from many different places, and there is much to be gained from cross-fertilization. This book presents five expository essays by leaders in the field, drawing from perspectives in physics, statistics and genetics, and showing how different aspects of MCMC come to the fore in different contexts. The essays derive from tutorial lectures at an interdisciplinary program…
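The core MCMC idea common to all the perspectives above is that a Markov chain can be built whose stationary distribution is a target known only up to a constant. A minimal random-walk Metropolis sketch (target and tuning values are my own illustrative choices, not from the book):

```python
import math
import random

def metropolis(logp, x0, steps=20000, scale=1.0, seed=42):
    """Random-walk Metropolis: propose y = x + Gaussian noise, accept
    with probability min(1, p(y)/p(x)). The visited states form a
    Markov chain whose stationary distribution is p."""
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(steps):
        y = x + rng.gauss(0.0, scale)
        if rng.random() < math.exp(min(0.0, logp(y) - logp(x))):
            x = y            # accept the proposal
        out.append(x)        # on rejection, the chain stays put
    return out

# Target: standard normal log-density, up to an additive constant.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Because only the ratio p(y)/p(x) appears, the normalizing constant of the target is never needed, which is what makes the method so portable across physics, statistics and genetics.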


Inference in hidden Markov models
Authors: --- ---
ISBN: 1281114324 9786611114329 0387289828 Year: 2005 Publisher: New York ; London : Springer,


Abstract

Hidden Markov models have become a widely used class of statistical models with applications in diverse areas such as communications engineering, bioinformatics, finance and many more. This book is a comprehensive treatment of inference for hidden Markov models, including both algorithms and statistical theory. Topics range from filtering and smoothing of the hidden Markov chain to parameter estimation, Bayesian methods and estimation of the number of states. In a unified way the book covers both models with finite state spaces, which allow for exact algorithms for filtering, estimation etc. and models with continuous state spaces (also called state-space models) requiring approximate simulation-based algorithms that are also described in detail. Simulation in hidden Markov models is addressed in five different chapters that cover both Markov chain Monte Carlo and sequential Monte Carlo approaches. Many examples illustrate the algorithms and theory. The book also carefully treats Gaussian linear state-space models and their extensions and it contains a chapter on general Markov chain theory and probabilistic aspects of hidden Markov models. This volume will suit anybody with an interest in inference for stochastic processes, and it will be useful for researchers and practitioners in areas such as statistics, signal processing, communications engineering, control theory, econometrics, finance and more. The algorithmic parts of the book do not require an advanced mathematical background, while the more theoretical parts require knowledge of probability theory at the measure-theoretical level. Olivier Cappé is Researcher for the French National Center for Scientific Research (CNRS). He received the Ph.D. degree in 1993 from Ecole Nationale Supérieure des Télécommunications, Paris, France, where he is currently a Research Associate. Most of his current research concerns computational statistics and statistical learning. 
Eric Moulines is Professor at Ecole Nationale Supérieure des Télécommunications (ENST), Paris, France. He graduated from Ecole Polytechnique, France, in 1984 and received the Ph.D. degree from ENST in 1990. He has authored more than 150 papers in applied probability, mathematical statistics and signal processing. Tobias Rydén is Professor of Mathematical Statistics at Lund University, Sweden, where he also received his Ph.D. in 1993. His publications include papers ranging from statistical theory to algorithmic developments for hidden Markov models.
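Of the topics listed, filtering is the one with an exact finite-state algorithm, the forward recursion. A sketch with a toy two-state model (all parameters invented for illustration, not from the book):

```python
import numpy as np

# Toy two-state hidden Markov model; parameters are illustrative only.
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])        # hidden-state transition matrix
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])        # B[s, o] = P(observe o | state s)
pi = np.array([0.5, 0.5])         # initial state distribution

def forward_filter(obs):
    """Forward algorithm: returns the filtered distribution
    P(state_t | obs_1..t), renormalizing at every step for numerical
    stability."""
    alpha = pi * B[:, obs[0]]
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict, then correct
        alpha = alpha / alpha.sum()
    return alpha

filt = forward_filter([0, 0, 1, 0])
```

For continuous state spaces the book replaces this exact recursion with the simulation-based (sequential Monte Carlo) approximations it describes.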

Keywords

Markov processes. --- Stochastic processes. --- Random processes --- Probabilities --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Distribution (Probability theory). --- Mathematical statistics. --- Statistics. --- Computer simulation. --- Probability Theory and Stochastic Processes. --- Statistical Theory and Methods. --- Signal, Image and Speech Processing. --- Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences. --- Statistics for Business, Management, Economics, Finance, Insurance. --- Simulation and Modeling. --- Computer modeling --- Computer models --- Modeling, Computer --- Models, Computer --- Simulation, Computer --- Electromechanical analogies --- Mathematical models --- Simulation methods --- Model-integrated computing --- Statistical analysis --- Statistical data --- Statistical methods --- Statistical science --- Mathematics --- Econometrics --- Statistical inference --- Statistics, Mathematical --- Statistics --- Sampling (Statistics) --- Distribution functions --- Frequency distribution --- Characteristic functions --- Probabilities. --- Statistics. --- Signal processing. --- Image processing. --- Speech processing systems. --- Probability --- Combinations --- Chance --- Least squares --- Mathematical statistics --- Risk --- Computational linguistics --- Electronic systems --- Information theory --- Modulation theory --- Oral communication --- Speech --- Telecommunication --- Singing voice synthesizers --- Pictorial data processing --- Picture processing --- Processing, Image --- Imaging systems --- Optical data processing --- Processing, Signal --- Information measurement --- Signal theory (Telecommunication)

Discrete-time Markov chains: two-time-scale methods and applications
Authors: ---
ISBN: 1280263091 9786610263097 0387268715 038721948X 1441919554 Year: 2005 Publisher: New York : Springer


Abstract

Focusing on two-time-scale Markov chains in discrete time, the contents of this book are an outgrowth of some of the authors' recent research. The motivation stems from existing and emerging applications in optimization and control of complex hybrid Markovian systems in manufacturing, wireless communication, and financial engineering. Much effort in this book is devoted to designing system models arising from these applications, analyzing them via analytic and probabilistic techniques, and developing feasible computational algorithms so as to reduce the inherent complexity. This book presents results including asymptotic expansions of probability vectors, structural properties of occupation measures, exponential bounds, aggregation and decomposition and associated limit processes, and the interface of discrete-time and continuous-time systems. One of the salient features is that it contains a diverse range of applications in filtering, estimation, control, optimization, Markov decision processes, and financial engineering. This book will be an important reference for researchers in the areas of applied probability, control theory, and operations research, as well as for practitioners who use optimization techniques. Part of the book can also be used in a graduate course on applied probability, stochastic processes, and applications.

Keywords

Markov processes. --- Stochastic processes. --- Random processes --- Probabilities --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Distribution (Probability theory). --- Operations research. --- Mathematics. --- Probability Theory and Stochastic Processes. --- Control, Robotics, Mechatronics. --- Operations Research/Decision Theory. --- Applications of Mathematics. --- Distribution functions --- Frequency distribution --- Characteristic functions --- Math --- Science --- Operational analysis --- Operational research --- Industrial engineering --- Management science --- Research --- System theory --- Probabilities. --- Control engineering. --- Robotics. --- Mechatronics. --- Decision making. --- Applied mathematics. --- Engineering mathematics. --- Engineering --- Engineering analysis --- Mathematical analysis --- Probability --- Statistical inference --- Combinations --- Mathematics --- Chance --- Least squares --- Mathematical statistics --- Risk --- Control engineering --- Control equipment --- Control theory --- Engineering instruments --- Automation --- Programmable controllers --- Mechanical engineering --- Microelectronics --- Microelectromechanical systems --- Machine theory --- Deciding --- Decision (Psychology) --- Decision analysis --- Decision processes --- Making decisions --- Management --- Management decisions --- Choice (Psychology) --- Problem solving --- Decision making
