Listing 1 - 5 of 5

Book
Information Bottleneck : Theory and Applications in Deep Learning
Authors: ---
Year: 2021 Publisher: Basel, Switzerland: MDPI - Multidisciplinary Digital Publishing Institute


Abstract

The celebrated information bottleneck (IB) principle of Tishby et al. has recently enjoyed renewed attention due to its application in the area of deep learning. This collection investigates the IB principle in this new context. The individual chapters in this collection:

• provide novel insights into the functional properties of the IB;
• discuss the IB principle (and its derivatives) as an objective for training multi-layer machine learning structures such as neural networks and decision trees; and
• offer a new perspective on neural network learning through the lens of the IB framework.

Our collection thus contributes to a better understanding of the IB principle specifically for deep learning and, more generally, of information-theoretic cost functions in machine learning. This paves the way toward explainable artificial intelligence.
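For reference (this formulation is standard in the literature and not quoted from the collection itself), the IB principle casts learning as a trade-off between compression and prediction: given an input X, a target Y, and a representation T, one seeks the encoder p(t | x) minimizing the IB Lagrangian

```latex
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)
```

where I(·;·) denotes mutual information and β ≥ 0 controls how strongly preserving information about Y is weighted against compressing X.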



Book
Information Loss in Deterministic Signal Processing Systems
Authors: ---
ISBN: 3319595334, 3319595326 Year: 2018 Publisher: Cham: Springer International Publishing; Imprint: Springer


Abstract

This book introduces readers to essential tools for the measurement and analysis of information loss in signal processing systems. Employing a new information-theoretic systems theory, the book analyzes various systems in the signal processing engineer's toolbox: polynomials, quantizers, rectifiers, linear filters with and without quantization effects, principal component analysis, multirate systems, etc. The user benefit of signal processing is further highlighted with the concept of relevant information loss. Signal or data processing operates on the physical representation of information so that users can easily access and extract that information. However, a fundamental theorem of information theory, the data processing inequality, states that deterministic processing always involves information loss. The measures of this loss introduced here form the basis of a new information-theoretic systems theory, which complements the currently prevailing approaches based on second-order statistics, such as the mean-squared error or error energy. This theory not only provides a deeper understanding but also extends the design space for the applied engineer with a wide range of methods rooted in information theory, adding to existing methods based on energy or quadratic representations.
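The information loss described in the abstract can be made concrete with a minimal sketch (not taken from the book): for a deterministic map Y = g(X) on a discrete source, H(Y|X) = 0, so the loss H(X|Y) reduces to H(X) − H(Y). The quantizer below drops one bit of a 3-bit uniform source, so exactly 1 bit of information is lost.

```python
from collections import Counter
import math

def entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# X uniform on {0, ..., 7}: a 3-bit source.
xs = list(range(8))

# Deterministic quantizer g(x) = x // 2 discards the least-significant bit.
ys = [x // 2 for x in xs]

h_x = entropy(xs)   # 3.0 bits
h_y = entropy(ys)   # 2.0 bits
loss = h_x - h_y    # H(X|Y) for deterministic g: 1.0 bit lost
print(f"H(X) = {h_x}, H(Y) = {h_y}, loss = {loss} bit")
```

This mirrors the data processing inequality: no deterministic post-processing can increase the information a signal carries, and for many-to-one maps the loss is strictly positive.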


Digital
Information Loss in Deterministic Signal Processing Systems
Authors: ---
ISBN: 9783319595337 Year: 2018 Publisher: Cham: Springer International Publishing


