
Listing 1 - 4 of 4

Book
Sobolev gradients and differential equations
Author:
ISBN: 3642040403 9786613569493 3642040411 128039157X Year: 2009 Publisher: Berlin ; New York : Springer,


Abstract

A Sobolev gradient of a real-valued functional on a Hilbert space is a gradient of that functional taken relative to an underlying Sobolev norm. This book shows how descent methods using such gradients allow a unified treatment of a wide variety of problems in differential equations. For discrete versions of partial differential equations, corresponding Sobolev gradients are seen to be vastly more efficient than ordinary gradients. In fact, descent methods with these gradients generally scale linearly with the number of grid points, in sharp contrast with the use of ordinary gradients. Aside from the first edition of this work, this is the only known account of Sobolev gradients in book form. Most of the applications in this book have emerged since the first edition was published some twelve years ago. What remains of the first edition has been extensively revised. A number of plots of computed results are included, and sample MATLAB code is provided for a simple problem. Those who work through a fair portion of the material have in the past been able to apply the theory to their own problems and to gain an appreciation of a rather comprehensive point of view on the subject of partial differential equations.
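The core idea of the abstract, descent with the gradient taken in a Sobolev metric rather than the ordinary one, can be sketched in a few lines. The following is an illustrative sketch only (it is not the book's MATLAB sample): for the 1-D model problem -u'' = f with zero boundary values, the ordinary gradient of the discrete energy is preconditioned by the discrete H^1 operator (I + A), and the test problem, grid size, and all names are assumptions of this sketch.

```python
import numpy as np

# Illustrative sketch of discrete Sobolev-gradient descent (not the book's code).
# Problem: minimize J(u) = u^T A u / 2 - f^T u, whose minimizer solves
# -u'' = f on (0, 1) with u(0) = u(1) = 0, discretized by finite differences.
n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.sin(np.pi * x)                      # right-hand side

# Tridiagonal discrete negative Laplacian.
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

def grad(u):
    return A @ u - f                       # ordinary (L2) gradient of J

# Sobolev (H^1) gradient: solve (I + A) g_s = grad(u). Plain gradient descent
# would need a step size of order h^2 and thousands of iterations; in the
# Sobolev metric a unit step is effective, independent of the grid.
M = np.eye(n) + A
u = np.zeros(n)
for _ in range(20):
    u = u - np.linalg.solve(M, grad(u))

exact = np.sin(np.pi * x) / np.pi**2       # continuous solution of -u'' = sin(pi x)
err = np.max(np.abs(u - exact))
```

The contraction factor of the preconditioned iteration is bounded by 1/(1 + λ_min(A)), which stays fixed as the grid is refined; this grid-independence is the "linear scaling with the number of grid points" the abstract refers to.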


Book
Nonlinear Conjugate Gradient Methods for Unconstrained Optimization
Author:
ISBN: 3030429504 3030429490 Year: 2020 Publisher: Cham : Springer International Publishing : Imprint: Springer,


Abstract

Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton (truncated Newton) method and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons with the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS-preconditioned methods, and three-term methods. Other conjugate gradient methods, based on clustering the eigenvalues or on minimizing the condition number of the iteration matrix, are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given. The theory behind the conjugate gradient algorithms is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of the methods' properties and convergence and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry, as well as graduate students in mathematics and Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.
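As background for the nonlinear variants surveyed in the abstract, the linear conjugate gradient method (the first topic in the book's list) can be sketched as follows. This is a generic textbook routine, not code from the book; the function name and the random symmetric positive definite test system are assumptions of this sketch.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Linear conjugate gradient for a symmetric positive definite A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x              # residual, equal to the negative gradient
    p = r.copy()               # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)  # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs     # in the linear case the standard betas coincide
        p = r + beta * p       # new direction, A-conjugate to the previous ones
        rs = rs_new
    return x

# Well-conditioned SPD test system (illustrative).
rng = np.random.default_rng(0)
Q = rng.standard_normal((30, 30))
A = Q @ Q.T + 30.0 * np.eye(30)
b = rng.standard_normal(30)
x = conjugate_gradient(A, b)
```

The nonlinear methods the book studies keep this two-recurrence structure but replace the exact step `alpha` by a line search and choose `beta` by formulas such as Fletcher-Reeves or Polak-Ribiere, which is where their differing convergence properties arise.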


Book
Gradient expectations : structure, origins, and synthesis of predictive neural networks
Author:
ISBN: 0262374676 0262374684 0262545616 Year: 2023 Publisher: Cambridge, MA : The MIT Press,


Abstract

An insightful investigation into the mechanisms underlying the predictive functions of neural networks, and their ability to chart a new path for AI. Prediction is a cognitive advantage like few others, inherently linked to our ability to survive and thrive. Our brains are awash in signals that embody prediction. Can we extend this capability more explicitly into synthetic neural networks to improve the function of AI and enhance its place in our world? Gradient Expectations is a bold effort by Keith L. Downing to map the origins and anatomy of natural and artificial neural networks and to explore how, when designed as predictive modules, their components might serve as the basis for the simulated evolution of advanced neural network systems. Downing delves into the known neural architecture of the mammalian brain to illuminate the structure of predictive networks and to determine more precisely how the ability to predict might have evolved from more primitive neural circuits. He then surveys past and present computational neural models that leverage predictive mechanisms with biological plausibility, identifying elements, such as gradients, that natural and artificial networks share. Behind well-founded predictions lie gradients, Downing finds, but of a different scope than those that belong to today's deep learning. Digging into the connections between predictions and gradients, and their manifestation in the brain and in neural networks, is one compelling example of how Downing enriches both our understanding of such relationships and their role in strengthening AI tools. Synthesizing critical research in neuroscience, cognitive science, and connectionism, Gradient Expectations offers unique depth and breadth of perspective on predictive neural-network models, including a grasp of predictive neural circuits that enables the integration of computational models of prediction with evolutionary algorithms.


Book
Conjugate gradient algorithms in nonconvex optimization
Author:
ISBN: 3540856331 3642099254 9786611904555 1281904554 354085634X 9783540856337 Year: 2009 Publisher: New York : Springer,


Abstract

This up-to-date book is on algorithms for large-scale unconstrained and bound-constrained optimization. Optimization techniques are shown from a conjugate gradient algorithm perspective. A large part of the book is devoted to preconditioned conjugate gradient algorithms. In particular, memoryless and limited-memory quasi-Newton algorithms are presented and numerically compared to standard conjugate gradient algorithms. Special attention is paid to the methods of shortest residuals developed by the author. Several effective optimization techniques based on these methods are presented. Because of the emphasis on practical methods, as well as the rigorous mathematical treatment of their convergence analysis, the book is aimed at a wide audience. It can be used by researchers in optimization and by graduate students in operations research, engineering, mathematics, and computer science. Practitioners can benefit from the numerous numerical comparisons of professional optimization codes discussed in the book.
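Since the abstract emphasizes preconditioned conjugate gradient algorithms, here is a minimal sketch of the preconditioned linear CG iteration with a diagonal (Jacobi) preconditioner. The choice of preconditioner and the badly scaled test matrix are illustrative assumptions; the book's own methods (e.g. the shortest-residuals family and quasi-Newton preconditioners) are considerably more sophisticated.

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=None):
    """Preconditioned CG with a diagonal (Jacobi) preconditioner M = diag(A)."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x
    z = M_inv_diag * r         # preconditioned residual: z = M^{-1} r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        beta = rz_new / rz
        p = z + beta * p
        rz = rz_new
    return x

# Badly scaled SPD system where Jacobi scaling clusters the spectrum
# (illustrative test problem).
n = 40
A = np.diag(np.linspace(1.0, 1e4, n)) + 0.1 * np.ones((n, n))
b = np.ones(n)
x = pcg(A, b, 1.0 / np.diag(A))
```

After Jacobi scaling, this particular matrix is a rank-one perturbation of the identity, so the preconditioned iteration converges in a handful of steps, whereas unpreconditioned CG would be limited by the 1e4 spread of the diagonal.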

Keywords

Algorithms. --- Conjugate gradient methods. --- Mathematical optimization. --- Conjugate gradient methods --- Algorithms --- Mathematical optimization --- Mathematics --- Civil & Environmental Engineering --- Physical Sciences & Mathematics --- Engineering & Applied Sciences --- Mathematical Theory --- Operations Research --- Algebra --- Calculus --- Algorism --- Gradient methods, Conjugate --- Mathematics. --- Operations research. --- Decision making. --- Calculus of variations. --- Quality control. --- Reliability. --- Industrial safety. --- Calculus of Variations and Optimal Control; Optimization. --- Operation Research/Decision Theory. --- Quality Control, Reliability, Safety and Risk. --- Arithmetic --- Approximation theory --- Equations --- Iterative methods (Mathematics) --- Foundations --- Numerical solutions --- System safety. --- Operations Research/Decision Theory. --- Safety, System --- Safety of systems --- Systems safety --- Accidents --- Industrial safety --- Systems engineering --- Operational analysis --- Operational research --- Industrial engineering --- Management science --- Research --- System theory --- Optimization (Mathematics) --- Optimization techniques --- Optimization theory --- Systems optimization --- Mathematical analysis --- Maxima and minima --- Operations research --- Simulation methods --- System analysis --- Prevention --- Industrial accidents --- Industries --- Job safety --- Occupational hazards, Prevention of --- Occupational health and safety --- Occupational safety and health --- Prevention of industrial accidents --- Prevention of occupational hazards --- Safety, Industrial --- Safety engineering --- Safety measures --- Safety of workers --- System safety --- Dependability --- Trustworthiness --- Conduct of life --- Factory management --- Reliability (Engineering) --- Sampling (Statistics) --- Standardization --- Quality assurance --- Quality of products --- Deciding --- Decision (Psychology) --- Decision analysis --- Decision processes --- Making decisions --- Management --- Management decisions --- Choice (Psychology) --- Problem solving --- Isoperimetrical problems --- Variations, Calculus of --- Decision making
