The main focus of these lectures is basic extremal problems and inequalities, two sides of the same coin. Additionally, they prepare the reader well for approaches and methods that are useful and applicable in a broader mathematical context. Highlights of the book include a solution to the famous 4m-conjecture of Erdős, Ko, and Rado (1938), one of the oldest problems in combinatorial extremal theory; an answer to a question of Erdős (1962) in combinatorial number theory, "What is the maximal cardinality of a set of numbers smaller than n with no k+1 of its members pairwise relatively prime?"; and the discovery that the AD-inequality implies more general and sharper number-theoretic inequalities than, for instance, Behrend's inequality. Several concepts and problems in the book arise in response to, or by rephrasing, questions from information theory, computer science, and statistical physics. This interdisciplinary character creates an atmosphere rich in incentives for new discoveries and lends Ars Combinatoria a special status in mathematics. At the end of each chapter, problems are presented in addition to exercises, and sometimes conjectures that can open a reader's eyes to new interconnections.
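For concreteness, the Erdős (1962) question quoted above can be written as an extremal function; the notation f(n, k) below is introduced here only for illustration and is not quoted from the book:

\[
f(n,k) \;=\; \max\bigl\{\, |A| \;:\; A \subseteq \{1,\dots,n\},\ A \text{ contains no } k+1 \text{ pairwise relatively prime elements} \,\bigr\}.
\]

A natural lower-bound construction (again only an illustrative aside) is the set of integers in \{1,\dots,n\} divisible by at least one of the first k primes: among any k+1 of its members, two are divisible by the same prime by the pigeonhole principle, so the members cannot be pairwise relatively prime.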
Combinatorial analysis. --- Combinatorics --- Algebra --- Mathematical analysis --- Distribution (Probability theory) --- Information theory. --- Combinatorics. --- Computational complexity. --- Number theory. --- Discrete Mathematics. --- Probability Theory and Stochastic Processes. --- Theory of Computation. --- Discrete Mathematics in Computer Science. --- Number Theory. --- Number study --- Numbers, Theory of --- Complexity, Computational --- Electronic data processing --- Machine theory --- Communication theory --- Communication --- Cybernetics --- Distribution functions --- Frequency distribution --- Characteristic functions --- Probabilities --- Discrete mathematics. --- Probabilities. --- Computers. --- Computer science—Mathematics. --- Automatic computers --- Automatic data processors --- Computer hardware --- Computing machines (Computers) --- Electronic brains --- Electronic calculating-machines --- Electronic computers --- Hardware, Computer --- Computer systems --- Calculators --- Cyberspace --- Probability --- Statistical inference --- Combinations --- Mathematics --- Chance --- Least squares --- Mathematical statistics --- Risk --- Discrete mathematical structures --- Mathematical structures, Discrete --- Structures, Discrete mathematical --- Numerical analysis
Information theory. --- Combinatorial analysis. --- Ahlswede, Rudolf, --- Combinatorial analysis --- Information theory --- Computer Science --- Algebra --- Mathematics --- Engineering & Applied Sciences --- Physical Sciences & Mathematics --- Communication theory --- Combinatorics --- Computer science. --- Computer communication systems. --- Data structures (Computer science). --- Algorithms. --- Numerical analysis. --- Computer science --- Computer Science. --- Computer Communication Networks. --- Algorithm Analysis and Problem Complexity. --- Discrete Mathematics in Computer Science. --- Numeric Computing. --- Data Structures, Cryptology and Information Theory. --- Mathematics. --- Computer mathematics --- Discrete mathematics --- Electronic data processing --- Mathematical analysis --- Algorism --- Arithmetic --- Information structures (Computer science) --- Structures, Data (Computer science) --- Structures, Information (Computer science) --- File organization (Computer science) --- Abstract data types (Computer science) --- Communication systems, Computer --- Computer communication systems --- Data networks, Computer --- ECNs (Electronic communication networks) --- Electronic communication networks --- Networks, Computer --- Teleprocessing networks --- Data transmission systems --- Digital communications --- Electronic systems --- Information networks --- Telecommunication --- Cyberinfrastructure --- Network computers --- Informatics --- Science --- Foundations --- Distributed processing --- Communication --- Cybernetics --- Computer software. --- Computational complexity. --- Electronic data processing. --- Data structures (Computer science) --- Data Structures and Information Theory. --- Complexity, Computational --- Machine theory --- Software, Computer --- Computer systems --- ADP (Data processing) --- Automatic data processing --- Data processing --- EDP (Data processing) --- IDP (Data processing) --- Integrated data processing --- Computers --- Office practice --- Automation --- Computer science—Mathematics. --- Computer networks. --- Discrete mathematics. --- Numerical Analysis. --- Discrete mathematical structures --- Mathematical structures, Discrete --- Structures, Discrete mathematical --- Numerical analysis
Information theory. --- Communication theory --- Communication --- Cybernetics --- Entropy (Information theory) --- Speech processing --- Knowledge representation (Information theory) --- Coding theory --- Switching theory --- Mathematical linguistics --- Mathematical theory of communication --- Machine translation --- Data transmission --- Semantic networks (Information theory) --- Information economics --- Computer science
Number theory --- Operational research. Game theory --- Discrete mathematics --- Computer science --- Stochastic analysis --- Probability theory
The calculation of channel capacities was one of Rudolf Ahlswede's specialties and is the main topic of this second volume of his Lectures on Information Theory. Here we find a detailed account of some very classical material from the early days of Information Theory, including developments from the USA, Russia, and Hungary, as well as the German school centered around his supervisor Konrad Jacobs, which Ahlswede was probably in a unique position to describe. These lectures attempt a rigorous justification of the foundations of Information Theory. This is the second of several volumes documenting Rudolf Ahlswede's lectures on Information Theory. Each volume includes comments from an invited well-known expert. In the supplement to the present volume, Gerhard Kramer contributes his insights. Classical information processing concerns the main tasks of gaining knowledge and the storage, transmission and hiding of data. The first task is the prime goal of Statistics. For the transmission and hiding of data, Shannon developed an impressive mathematical theory called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels. The lectures presented in this work are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.
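As a concrete instance of a channel-capacity calculation (a minimal sketch, not taken from the lectures; the function names and the choice of a binary symmetric channel are assumptions made here for illustration), the capacity of a binary symmetric channel with crossover probability p is 1 - h(p), where h is the binary entropy function:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy h(p) in bits, with h(0) = h(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

if __name__ == "__main__":
    for p in (0.0, 0.05, 0.11, 0.5):
        print(f"p = {p:<4}  C = {bsc_capacity(p):.4f} bits per channel use")
```

For more general discrete memoryless channels the capacity is the maximum of the mutual information over all input distributions, which no longer has a simple closed form and is typically computed numerically.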
Mathematics. --- Information and Communication, Circuits. --- Engineering & Applied Sciences --- Mathematics --- Physical Sciences & Mathematics --- Computer Science --- Mathematical Theory --- Information theory. --- Artificial intelligence. --- AI (Artificial intelligence) --- Artificial thinking --- Electronic brains --- Intellectronics --- Intelligence, Artificial --- Intelligent machines --- Machine intelligence --- Thinking, Artificial --- Communication theory --- Communication --- Cybernetics --- Bionics --- Cognitive science --- Digital computer simulation --- Electronic data processing --- Logic machines --- Machine theory --- Self-organizing systems --- Simulation methods --- Fifth generation computers --- Neural computers --- Math --- Science
Devoted to information security, this volume begins with a short course on cryptography, mainly based on lectures given by Rudolf Ahlswede at the University of Bielefeld in the mid-1990s. It was the second of his cycle of lectures on information theory, which opened with an introductory course on basic coding theorems, as covered in Volume 1 of this series. In this third volume, Shannon's historical work on secrecy systems is detailed, followed by an introduction to an information-theoretic model of wiretap channels, and such important concepts as homophonic coding and authentication. Once the theoretical arguments have been presented, comprehensive technical details of AES are given. Furthermore, a short introduction to the history of public-key cryptology, RSA and El Gamal cryptosystems is provided, followed by a look at the basic theory of elliptic curves, and algorithms for efficient addition in elliptic curves. Lastly, the important topic of "oblivious transfer" is discussed, which is strongly connected to the privacy problem in communication. Today, the importance of this problem is rapidly increasing, and further research and practical realizations are greatly anticipated. This is the third of several volumes serving as the collected documentation of Rudolf Ahlswede's lectures on information theory. Each volume includes comments from an invited well-known expert. In the supplement to the present volume, Rüdiger Reischuk contributes his insights. Classical information processing concerns the main tasks of gaining knowledge and the storage, transmission and hiding of data. The first task is the prime goal of Statistics. For the transmission and hiding of data, Shannon developed an impressive mathematical theory called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels. The lectures presented in this work are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.
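To illustrate the elliptic-curve material in concrete terms (a minimal sketch, not taken from the lectures; the curve parameters, the sample point, and the function name are assumptions chosen for the example), the textbook affine addition law on a short Weierstrass curve y^2 = x^3 + a*x + b over a prime field looks as follows. Efficient implementations normally work in projective coordinates to avoid the modular inversions used here.

```python
INF = None  # the point at infinity, acting as the group identity

def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + a*x + b over F_p (Python 3.8+)."""
    if P is INF:
        return Q
    if Q is INF:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return INF                                        # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

if __name__ == "__main__":
    a, b, p = 2, 3, 97                  # toy curve, far too small for real use
    P = (3, 6)
    assert (P[1] ** 2 - (P[0] ** 3 + a * P[0] + b)) % p == 0  # P lies on the curve
    Q = ec_add(P, P, a, p)              # doubling: 2P
    R = ec_add(P, Q, a, p)              # addition: 3P
    print("2P =", Q, " 3P =", R)
```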
Computer Science --- Mathematical Theory --- Mathematics --- Engineering & Applied Sciences --- Physical Sciences & Mathematics --- Information theory. --- Data encryption (Computer science) --- Communication theory --- Data encoding (Computer science) --- Encryption of data (Computer science) --- Communication --- Cybernetics --- Computer security --- Cryptography --- Mathematics. --- Information and Communication, Circuits. --- Math --- Science
The fourth volume of Rudolf Ahlswede's lectures on Information Theory is focused on Combinatorics. Ahlswede was originally motivated to study combinatorial aspects of Information Theory via zero-error codes: in this case the structure of the coding problems usually changes drastically from probabilistic to combinatorial. The best example is Shannon's zero-error capacity, where independent sets in graphs have to be examined. The extension to multiple access channels leads to the Zarankiewicz problem. A code can be regarded combinatorially as a hypergraph, and many coding theorems can be obtained by appropriate colourings or coverings of the underlying hypergraphs. Several such colouring and covering techniques and their applications are introduced in this book. Furthermore, codes produced by permutations are presented, along with one of Ahlswede's favourite research fields: extremal problems in Combinatorics. Whereas the first part of the book concentrates on combinatorial methods in order to analyse classical codes such as prefix codes or codes in the Hamming metric, the second is devoted to combinatorial models in Information Theory. Here the code concept already relies on a rather combinatorial structure, as in several concrete models of multiple access channels or more refined distortions. An analytical tool coming into play, especially during the analysis of perfect codes, is the use of orthogonal polynomials. Classical information processing concerns the main tasks of gaining knowledge and the storage, transmission and hiding of data. The first task is the prime goal of Statistics. For the transmission and hiding of data, Shannon developed an impressive mathematical theory called Information Theory, which he based on probabilistic models. The theory largely involves the concept of codes with small error probabilities in spite of noise in the transmission, which is modeled by channels. The lectures presented in this work are suitable for graduate students in Mathematics, and also for those working in Theoretical Computer Science, Physics, and Electrical Engineering with a background in basic Mathematics. The lectures can be used as the basis for courses or to supplement courses in many ways. Ph.D. students will also find research problems, often with conjectures, that offer potential subjects for a thesis. More advanced researchers may find questions which form the basis of entire research programs.
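The zero-error viewpoint mentioned above can be made concrete with a small sketch (not taken from the book; the confusability graph of a pentagon-type channel and the brute-force search below are illustrative assumptions): a zero-error code of block length one is exactly an independent set in the confusability graph, and the binary logarithm of its size is an achievable zero-error rate. Longer codes correspond to independent sets in strong products of the graph with itself.

```python
import math
from itertools import combinations

def max_independent_set(num_vertices, edges):
    """Brute-force maximum independent set; only suitable for tiny graphs."""
    adj = {frozenset(e) for e in edges}
    for k in range(num_vertices, 0, -1):
        for subset in combinations(range(num_vertices), k):
            if all(frozenset((u, v)) not in adj
                   for u, v in combinations(subset, 2)):
                return subset           # first hit is at the largest feasible size
    return ()

if __name__ == "__main__":
    # Confusability graph of a pentagon-type channel: inputs 0..4, where
    # input i can be confused with input i+1 (mod 5) at the receiver.
    pentagon = [(i, (i + 1) % 5) for i in range(5)]
    S = max_independent_set(5, pentagon)
    print("independent set:", S,
          " zero-error rate >=", math.log2(len(S)), "bits per channel use")
```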
Combinatorial optimization --- Data processing. --- Optimization, Combinatorial --- Mathematics. --- Information theory. --- Combinatorics. --- Information and Communication, Circuits. --- Combinatorial analysis --- Mathematical optimization --- Math --- Science --- Combinatorics --- Algebra --- Mathematical analysis --- Communication theory --- Communication --- Cybernetics