The celebrated information bottleneck (IB) principle of Tishby et al. has recently enjoyed renewed attention due to its application in the area of deep learning. This collection investigates the IB principle in this new context. The individual chapters in this collection:
• provide novel insights into the functional properties of the IB;
• discuss the IB principle (and its derivatives) as an objective for training multi-layer machine learning structures such as neural networks and decision trees; and
• offer a new perspective on neural network learning through the lens of the IB framework.
Our collection thus contributes to a better understanding of the IB principle specifically for deep learning and, more generally, of information-theoretic cost functions in machine learning. This paves the way toward explainable artificial intelligence.
Information technology industries --- information theory --- variational inference --- machine learning --- learnability --- information bottleneck --- representation learning --- conspicuous subset --- stochastic neural networks --- mutual information --- neural networks --- information --- bottleneck --- compression --- classification --- optimization --- classifier --- decision tree --- ensemble --- deep neural networks --- regularization methods --- information bottleneck principle --- deep networks --- semi-supervised classification --- latent space representation --- hand-crafted priors --- learnable priors --- regularization --- deep learning
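For readers new to the framework, the objective behind these keywords can be stated compactly. In its standard Lagrangian form (notation chosen here for illustration, not quoted from any chapter), the IB principle seeks a stochastic representation T of an input X that is maximally compressed while staying predictive of a target Y:

\[ \min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y), \qquad \beta \ge 0, \]

where I(·;·) denotes mutual information and β trades compression of X against preservation of information about Y. Variational bounds on the two mutual-information terms are what turn this objective into a practical training loss for stochastic neural networks.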
This book introduces readers to essential tools for the measurement and analysis of information loss in signal processing systems. Employing a new information-theoretic systems theory, the book analyzes various systems in the signal processing engineer's toolbox: polynomials, quantizers, rectifiers, linear filters with and without quantization effects, principal component analysis, multirate systems, etc. The user benefit of signal processing is further highlighted with the concept of relevant information loss. Signal or data processing operates on the physical representation of information so that users can easily access and extract that information. However, a fundamental theorem of information theory, the data processing inequality, states that deterministic processing can never create information and in general destroys it. The information-loss measures developed in this book quantify this effect and form the basis of a new information-theoretic systems theory, which complements the currently prevailing approaches based on second-order statistics, such as the mean-squared error or error energy. This theory not only provides a deeper understanding but also extends the design space for the applied engineer with a wide range of methods rooted in information theory, adding to existing methods based on energy or quadratic representations.
Engineering --- Computer science --- System theory --- Complex systems --- Computational complexity --- Signal processing --- Image processing --- Speech processing systems --- Statistical physics --- Dynamical systems --- Graph theory --- Information theory --- Mathematics
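To make the blurb's central claim concrete, the data processing inequality and one standard way of quantifying the loss of a deterministic system can be written as follows (a minimal sketch in notation chosen here; the book's own definitions may be more refined):

\[ X \to Y \to Z \ \text{Markov} \;\Longrightarrow\; I(X;Z) \le I(X;Y), \qquad L(X \to Y) := H(X \mid Y) = H(X) - H(Y) \ \text{for deterministic } Y = g(X),\ X \text{ discrete}. \]

For example, a one-bit quantizer applied to a uniform four-level source gives H(X) = 2 bits and H(Y) = 1 bit, so exactly one bit of information is lost.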