The Data Compression Conference (DCC) is an international forum for current work concerning data compression and related applications. The conference addresses not only compression methods for specific types of data (text, images, video, audio, medical and scientific data, graphics, web content, etc.), but also the use of techniques from information theory and data compression in networking, communications, and storage applications involving large datasets (including image and information mining, retrieval, archiving, backup, communications, and human-computer interfaces). Both theoretical and experimental work is of interest.
This standard defines an IP-based media protocol over both broadcasting networks and broadband networks, including data modeling, data transmission methods, signaling messages, and media presentation mechanisms.
Lossless data compression is a facet of source coding and a well-studied problem of information theory. Its goal is to find a shortest possible code that can be unambiguously recovered. Here, we focus on rigorous analysis of code redundancy for known sources. The redundancy rate problem determines by how much the actual code length exceeds the optimal code length. We present precise analyses of three types of lossless data compression schemes, namely fixed-to-variable (FV) length codes, variable-to-fixed (VF) length codes, and variable-to-variable (VV) length codes. In particular, we investigate the average redundancy of Shannon, Huffman, Tunstall, Khodak, and Boncelet codes. These codes have succinct representations as trees, either coding or parsing trees, and we analyze here some of their parameters (e.g., the average path from the root to a leaf). Such trees are precisely analyzed by analytic methods, also known as analytic combinatorics, in which complex analysis plays a decisive role. These tools include generating functions, the Mellin transform, Fourier series, the saddle point method, analytic poissonization and depoissonization, Tauberian theorems, and singularity analysis. The term analytic information theory has been coined to describe problems of information theory studied by analytic tools. This approach lies at the crossroads of information theory, analysis of algorithms, and combinatorics.
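To make the redundancy rate problem concrete, the following minimal sketch builds a Huffman code over a small source and compares its average code length with the source entropy; the difference is the average redundancy. The symbol probabilities are illustrative (dyadic, so the redundancy comes out to zero), not taken from the text.

```python
import heapq
from math import log2

def huffman_code_lengths(probs):
    """Return Huffman code lengths for the given symbol probabilities
    (a minimal sketch of the classic bottom-up merging algorithm)."""
    # Heap entries: (probability, tie-breaker, symbols under this node).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # two least-probable nodes
        p2, _, s2 = heapq.heappop(heap)
        # Every symbol under a merged node gains one bit of depth.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

# Illustrative dyadic source: probabilities are exact powers of 1/2.
probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * log2(p) for p in probs)
redundancy = avg_len - entropy  # zero for a dyadic source
```

For non-dyadic sources the redundancy is strictly positive, and its precise behavior (e.g., its oscillations as a function of the source parameters) is exactly the kind of question the analytic methods above address.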