Listing 1 - 9 of 9
Markov processes represent a universal model for a large variety of real-life random evolutions. A wide flow of new ideas, tools, methods and applications constantly pours into the ever-growing stream of research on Markov processes, which rapidly spreads over new fields of the natural and social sciences, creating new streamlined logical paths to its turbulent boundary. Even if a given process is not Markov, it can often be embedded in a larger Markov process (the Markovianization procedure) by including the key historical parameters in the state space. This monograph gives a concise but systematic and self-contained exposition of the essentials of Markov processes, together with recent achievements, working from the "physical picture" (a formal pre-generator) and stressing the interplay between probabilistic (stochastic differential equations) and analytic (semigroup) tools. The book will be useful to students and researchers. Part I can be used for a one-semester course on Brownian motion, Lévy and Markov processes, or on probabilistic methods for PDEs. Part II mainly contains the author's research on Markov processes. From the contents: Tools from Probability and Analysis; Brownian Motion; Markov Processes and Martingales; SDEs, ψDEs and Martingale Problems; Processes in Euclidean Spaces; Processes in Domains with a Boundary; Heat Kernels for Stable-Like Processes; Continuous-Time Random Walks and Fractional Dynamics; Complex Chains and the Feynman Integral.
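The Markovianization procedure mentioned in this record (enlarging the state with historical information) can be illustrated with a short sketch; the function name and the two-flip example are illustrative assumptions, not taken from the book:

```python
import random

def markovianize_demo(steps=20, seed=0):
    """Toy illustration of Markovianization.  The observable
    S_n = X_n + X_{n-1} (sum of the last two coin flips) is not Markov
    by itself, but the augmented state (X_{n-1}, X_n) is a Markov
    chain, and S_n is a function of that chain."""
    rng = random.Random(seed)
    x_prev, x = rng.randint(0, 1), rng.randint(0, 1)
    path = []
    for _ in range(steps):
        s = x_prev + x                 # the non-Markov observable
        path.append(((x_prev, x), s))  # augmented Markov state + observable
        x_prev, x = x, rng.randint(0, 1)
    return path
```

Here S_n alone violates the Markov property (its distribution depends on two past flips), while the pair (X_{n-1}, X_n) is Markov and determines S_n.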
Markov processes. --- Semigroups. --- Group theory --- Generators of groups --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Generators. --- Markov Processes.
This book provides a systematic treatment of time-dependent strong Markov processes with values in a Polish space. It describes their generators and the link with stochastic differential equations in infinite dimensions. In a unifying framework, in which the square gradient operator is employed, new results on backward stochastic differential equations and long-time behavior are discussed in depth. This mathematical material finds applications in several branches of science, among them mathematical physics, hedging models in financial mathematics, and population models.
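The link between Markov processes and stochastic differential equations can be illustrated by the standard Euler-Maruyama scheme for a time-inhomogeneous SDE; the drift function theta(t) below is an arbitrary illustrative choice, not taken from the book:

```python
import math
import random

def euler_maruyama(x0=1.0, t_end=1.0, n=1000, sigma=0.2, seed=42):
    """Euler-Maruyama discretization of the time-inhomogeneous SDE
    dX_t = -theta(t) X_t dt + sigma dW_t, a simple example of a
    time-dependent (strong) Markov process."""
    rng = random.Random(seed)
    dt = t_end / n

    def theta(t):  # time-dependent mean-reversion rate (arbitrary choice)
        return 1.0 + 0.5 * math.sin(2 * math.pi * t)

    x, t = x0, 0.0
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment over dt
        x += -theta(t) * x * dt + sigma * dw
        t += dt
    return x
```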
Markov processes. --- Semigroups of operators. --- Evolution equations. --- Evolutionary equations --- Equations, Evolution --- Equations of evolution --- Differential equations --- Operators, Semigroups of --- Operator theory --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes
Measure-valued branching processes arise as high-density limits of branching particle systems. Dawson-Watanabe superprocesses form a special class of these. The author constructs superprocesses with Borel right underlying motions and general branching mechanisms and shows the existence of their Borel right realizations. He then uses transformations to derive the existence and regularity of several different forms of the superprocesses. This treatment simplifies the constructions and gives useful perspectives. Martingale problems of superprocesses are discussed under Feller-type assumptions. The most important feature of the book is the systematic treatment of immigration superprocesses and generalized Ornstein-Uhlenbeck processes based on skew convolution semigroups. The volume addresses researchers in measure-valued processes, branching processes, stochastic analysis, biological and genetic models, and graduate students in probability theory and stochastic processes.
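The branching particle systems behind such high-density limits can be sketched by an ordinary Galton-Watson branching process; the offspring distribution below is an illustrative assumption, not from the book:

```python
import random

def galton_watson(generations=15, offspring_probs=(0.25, 0.5, 0.25), seed=1):
    """Population sizes of a Galton-Watson branching particle system:
    each particle independently leaves k offspring with probability
    offspring_probs[k].  Measure-valued superprocesses arise as
    high-density scaling limits of such systems."""
    rng = random.Random(seed)
    sizes = [1]  # a single ancestor
    for _ in range(generations):
        children = 0
        for _ in range(sizes[-1]):
            u, acc = rng.random(), 0.0
            for k, p in enumerate(offspring_probs):
                acc += p
                if u < acc:
                    children += k
                    break
        sizes.append(children)  # 0 is absorbing: extinction
    return sizes
```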
Markov processes. --- Distribution (Probability theory) --- Mathematics. --- Probabilities. --- Probability Theory and Stochastic Processes. --- Distribution functions --- Frequency distribution --- Characteristic functions --- Probabilities --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Distribution (Probability theory) --- Branching processes. --- Probability --- Statistical inference --- Combinations --- Mathematics --- Chance --- Least squares --- Mathematical statistics --- Risk
Probabilities. --- Phase transformations (Statistical physics) --- Measure theory. --- Lebesgue measure --- Measurable sets --- Measure of a set --- Algebraic topology --- Integrals, Generalized --- Measure algebras --- Rings (Algebra) --- Phase changes (Statistical physics) --- Phase transitions (Statistical physics) --- Phase rule and equilibrium --- Statistical physics --- Probability --- Statistical inference --- Combinations --- Mathematics --- Chance --- Least squares --- Mathematical statistics --- Risk --- Gaussian Fields. --- Gibbs Measures. --- Markov Chains. --- Phase Transition. --- Statistical Mechanics.
The purpose of these notes is to explore some simple relations between Markovian path and loop measures, the Poissonian ensembles of loops they determine, their occupation fields, uniform spanning trees, determinants, and Gaussian Markov fields such as the free field. These relations are first studied in complete generality for the finite discrete setting, then partly generalized to specific examples in infinite and continuous spaces.
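One concrete bridge between the loop-erased random walks and uniform spanning trees discussed in these notes is Wilson's algorithm, sketched here for a finite connected graph (the adjacency-dict layout and names are illustrative choices):

```python
import random

def wilson_ust(adj, seed=0):
    """Wilson's algorithm: sample a uniform spanning tree of a finite
    connected graph by loop-erased random walks.  `adj` maps each
    vertex to a list of its neighbours; the returned dict maps each
    non-root vertex to its parent in the tree."""
    rng = random.Random(seed)
    vertices = list(adj)
    in_tree = {vertices[0]}  # root the tree at an arbitrary vertex
    parent = {}
    for v in vertices[1:]:
        if v in in_tree:
            continue
        # Walk from v until the tree is hit; keeping only the *last*
        # exit from each vertex erases the loops of the walk.
        u, last_exit = v, {}
        while u not in in_tree:
            last_exit[u] = rng.choice(adj[u])
            u = last_exit[u]
        # Retrace the loop-erased path and graft it onto the tree.
        u = v
        while u not in in_tree:
            parent[u] = last_exit[u]
            in_tree.add(u)
            u = last_exit[u]
    return parent
```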
Markov processes --- Mathematics --- Physical Sciences & Mathematics --- Mathematical Statistics --- Markov processes. --- Stochastic processes. --- Random processes --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Mathematics. --- Potential theory (Mathematics). --- Probabilities. --- Probability Theory and Stochastic Processes. --- Potential Theory. --- Probability --- Statistical inference --- Combinations --- Chance --- Least squares --- Mathematical statistics --- Risk --- Green's operators --- Green's theorem --- Potential functions (Mathematics) --- Potential, Theory of --- Mathematical analysis --- Mechanics --- Math --- Science --- Probabilities --- Stochastic processes --- Distribution (Probability theory) --- Distribution functions --- Frequency distribution --- Characteristic functions
The theory of Markov decision processes focuses on controlled Markov chains in discrete time. The authors establish the theory for general state and action spaces and at the same time show its application by means of numerous examples, mostly taken from the fields of finance and operations research. By using a structural approach, many technicalities (concerning measure theory) are avoided. They cover problems with finite and infinite horizons, as well as partially observable Markov decision processes, piecewise deterministic Markov decision processes and stopping problems. The book presents Markov decision processes in action and includes various state-of-the-art applications with a particular view towards finance. It is useful for upper-level undergraduates, Master's students and researchers in both applied probability and finance, and provides exercises (without solutions).
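The basic solution method for such finite discounted control problems, value iteration, can be sketched as follows; the dictionary-based data layout is an illustrative choice, not the book's notation:

```python
def value_iteration(states, actions, P, R, gamma=0.9, tol=1e-8):
    """Value iteration for a finite discounted MDP.
    P[s][a] is a dict {next_state: probability}; R[s][a] is the
    expected one-step reward for taking action a in state s."""
    def q(s, a, V):  # one-step lookahead value of action a in state s
        return R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a].items())

    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(q(s, a, V) for a in actions)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:  # values have converged (up to tolerance)
            break
    # Greedy policy with respect to the (near-)optimal value function.
    policy = {s: max(actions, key=lambda a: q(s, a, V)) for s in states}
    return V, policy
```

On a two-state example with rewards 1 for moving out of the poor state and 2 for staying rich, the greedy policy and the discounted values follow the fixed-point equations V = max_a q(s, a, V).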
Markov processes. --- Statistical decision. --- Markov processes --- Programming (Mathematics) --- Stochastic control theory --- Finance --- Mathematics --- Physical Sciences & Mathematics --- Mathematical Statistics --- Mathematical models --- Decision problems --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Mathematics. --- Applied mathematics. --- Engineering mathematics. --- Economics, Mathematical. --- Probabilities. --- Probability Theory and Stochastic Processes. --- Quantitative Finance. --- Applications of Mathematics. --- Game theory --- Operations research --- Statistics --- Management science --- Stochastic processes --- Distribution (Probability theory) --- Finance. --- Math --- Science --- Funding --- Funds --- Economics --- Currency question --- Distribution functions --- Frequency distribution --- Characteristic functions --- Probabilities --- Stochastic control theory. --- Mathematical models. --- Economics, Mathematical. --- Engineering --- Engineering analysis --- Mathematical analysis --- Mathematical economics --- Econometrics --- Probability --- Statistical inference --- Combinations --- Chance --- Least squares --- Mathematical statistics --- Risk --- Methodology
Since their inception more than half a century ago, automatic reading systems have evolved substantially and now show impressive performance on machine-printed text. The recognition of handwriting, however, can still be considered an open research problem due to its substantial variation in appearance. With the introduction of Markovian models to the field, a promising modeling and recognition paradigm was established for automatic handwriting recognition. So far, however, no standard procedures for building Markov-model-based recognizers have been established, though trends toward unified approaches can be identified. Markov Models for Handwriting Recognition provides a comprehensive overview of the application of Markov models in the research field of handwriting recognition, covering both the widely used hidden Markov models and the less complex Markov-chain or n-gram models. First, the text introduces the typical architecture of a Markov-model-based handwriting recognition system and familiarizes the reader with the essential theoretical concepts behind Markovian models. Then it gives a thorough review of the solutions proposed in the literature for open problems in applying Markov-model-based approaches to automatic handwriting recognition.
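The core decoding step of a hidden-Markov-model recognizer, the Viterbi algorithm, can be sketched generically as follows; this is a textbook implementation, not the recognizer architecture described in the book:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Viterbi decoding: the most likely hidden state sequence of an
    HMM given an observation sequence -- the core decoding step of
    Markov-model-based recognizers."""
    # V[t][s]: probability of the best state path ending in s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max((V[t - 1][r] * trans_p[r][s] * emit_p[s][obs[t]], r)
                             for r in states)
            V[t][s], back[t][s] = prob, prev
    # Trace the best path backwards from the most likely final state.
    path = [max(states, key=lambda s: V[-1][s])]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))
```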
Markov processes. --- Mathematics. --- Pattern perception -- Mathematical models. --- Engineering & Applied Sciences --- Electrical & Computer Engineering --- Computer Science --- Electrical Engineering --- Applied Physics --- Pattern recognition systems --- Writing --- Mathematical models. --- Identification --- Data processing. --- Chirography --- Handwriting --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Pattern classification systems --- Pattern recognition computers --- Computer science. --- Pattern recognition. --- Computer Science. --- Pattern Recognition. --- Language and languages --- Ciphers --- Penmanship --- Stochastic processes --- Pattern perception --- Computer vision --- Optical pattern recognition. --- Optical data processing --- Perceptrons --- Visual discrimination --- Design perception --- Pattern recognition --- Form perception --- Perception --- Figure-ground perception
Most networks and databases that humans have to deal with contain a large, albeit finite, number of units. Their structure, which maintains the functional consistency of the components, is essentially non-random and calls for a precise quantitative description of the relations between nodes (or data units) and all network components. This book is an introduction, for both graduate students and newcomers to the field, to the theory of graphs and random walks on such graphs. It reviews the methods based on random walks and diffusions for exploring the structure of finite connected graphs and databases (Markov chain analysis). This provides the necessary basis for a consistent discussion of applications as diverse as electric resistance networks, estimation of land prices, urban planning, linguistic databases, music, and gene expression regulatory networks.
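A basic fact underlying the Markov chain analysis of graphs: the simple random walk on a finite connected undirected graph has a stationary distribution proportional to vertex degree. A small sketch (the adjacency-dict layout and names are illustrative):

```python
from fractions import Fraction

def stationary_distribution(adj):
    """For the simple random walk on a finite connected undirected
    graph (jump to a uniformly chosen neighbour), the stationary
    distribution is pi(v) = deg(v) / (sum of all degrees)."""
    total = sum(len(nbrs) for nbrs in adj.values())
    return {v: Fraction(len(nbrs), total) for v, nbrs in adj.items()}

def is_stationary(adj, pi):
    """Exact check that pi P = pi, where P is the walk's transition
    matrix P(u, v) = 1/deg(u) for each neighbour v of u."""
    return all(
        sum(pi[u] * Fraction(1, len(adj[u])) for u in adj if v in adj[u]) == pi[v]
        for v in adj
    )
```

Exact rational arithmetic (`Fraction`) lets the stationarity equation be verified without floating-point tolerance.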
Random walks (Mathematics) --- Diffusion processes. --- Markov processes. --- Charts, diagrams, etc. --- Diagrams, charts, etc. --- Graphs --- Plots (Diagrams) --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Stochastic processes --- Markov processes --- Additive process (Probability theory) --- Random walk process (Mathematics) --- Walks, Random (Mathematics) --- Cell aggregation --- Data structures (Computer science) --- Engineering. --- Applications of Graph Theory and Complex Networks. --- Manifolds and Cell Complexes (incl. Diff.Topology). --- Data Structures and Information Theory. --- Complexity. --- Mathematics. --- Construction --- Industrial arts --- Technology --- Aggregation, Cell --- Cell patterning --- Cell interaction --- Microbial aggregation --- Data structures (Computer science) --- Information structures (Computer science) --- Structures, Data (Computer science) --- Structures, Information (Computer science) --- Electronic data processing --- File organization (Computer science) --- Abstract data types (Computer science) --- Physics. --- Manifolds (Mathematics). --- Complex manifolds. --- Data structures (Computer science). --- Computational complexity. --- Complexity, Computational --- Machine theory --- Analytic spaces --- Manifolds (Mathematics) --- Geometry, Differential --- Topology --- Natural philosophy --- Philosophy, Natural --- Physical sciences --- Dynamics
In this work, we provide a treatment of the relationship between two models that have been widely used in the implementation of autonomous agents: the Belief-Desire-Intention (BDI) model and Markov Decision Processes (MDPs). We start with an informal description of the relationship, identifying the common features of the two approaches and the differences between them. We then hone our understanding of these differences through an empirical analysis of the performance of both models on the TileWorld testbed. This allows us to show that, even though the MDP model displays consistently better behavior than the BDI model for small worlds, this is no longer the case when the world becomes large and the MDP model cannot be solved exactly. Finally, we present a theoretical analysis of the relationship between the two approaches, identifying mappings that allow us to extract a set of intentions from a policy (a solution to an MDP), and to extract a policy from a set of intentions.
Decision making. --- Intelligent agents (Computer software). --- Markov processes. --- Statistical decision. --- Markov processes --- Statistical decision --- Intelligent agents (Computer software) --- Decision making --- Mechanical Engineering --- Engineering & Applied Sciences --- Mathematics --- Physical Sciences & Mathematics --- Mathematical Statistics --- Computer Science --- Mechanical Engineering - General --- Information Technology --- Artificial Intelligence --- Deciding --- Decision (Psychology) --- Decision analysis --- Decision processes --- Making decisions --- Management --- Management decisions --- Agents, Autonomous (Computer software) --- Agents, Cognitive (Computer software) --- Agents, Intelligent (Computer software) --- Assistants, Cognitive (Computer software) --- Assistants, Intelligent software --- Autonomous agents (Computer software) --- Cognitive agents (Computer software) --- Cognitive assistants (Computer software) --- IAs (Computer software) --- Intelligent agent software --- Intelligent software agents --- Intelligent software assistants --- Software agents (Computer software) --- Special agents (Computer software) --- Decision problems --- Analysis, Markov --- Chains, Markov --- Markoff processes --- Markov analysis --- Markov chains --- Markov models --- Models, Markov --- Processes, Markov --- Computer science. --- Artificial intelligence. --- Computer simulation. --- Computer Science. --- Artificial Intelligence (incl. Robotics). --- Simulation and Modeling. --- Computer modeling --- Computer models --- Modeling, Computer --- Models, Computer --- Simulation, Computer --- Electromechanical analogies --- Mathematical models --- Simulation methods --- Model-integrated computing --- AI (Artificial intelligence) --- Artificial thinking --- Electronic brains --- Intellectronics --- Intelligence, Artificial --- Intelligent machines --- Machine intelligence --- Thinking, Artificial --- Bionics --- Cognitive science --- Digital computer simulation --- Electronic data processing --- Logic machines --- Machine theory --- Self-organizing systems --- Fifth generation computers --- Neural computers --- Informatics --- Science --- Choice (Psychology) --- Problem solving --- Artificial intelligence --- Game theory --- Operations research --- Statistics --- Management science --- Stochastic processes --- Computer programs --- Artificial Intelligence.