From cell phones to Web portals, advances in information and communications technology have thrust society into an information age that is far-reaching, fast-moving, increasingly complex, and yet essential to modern life. Now, renowned scholar and author David Luenberger has produced Information Science, a text that distills and explains the most important concepts and insights at the core of this ongoing revolution. The book represents the material used in a widely acclaimed course offered at Stanford University.

Drawing concepts from each of the constituent subfields that collectively comprise information science, Luenberger builds his book around the five "E's" of information: Entropy, Economics, Encryption, Extraction, and Emission. Each area directly impacts modern information products, services, and technology: everything from word processors to digital cash, database systems to decision making, marketing strategy to spread-spectrum communication. To study these principles is to learn how English text, music, and pictures can be compressed, how it is possible to construct a digital signature that cannot simply be copied, how beautiful photographs can be sent from distant planets with a tiny battery, how communication networks expand, and how producers of information products can make a profit under difficult market conditions.

The book contains vivid examples, illustrations, exercises, and points of historic interest, all of which bring to life the analytic methods presented:

- Presents a unified approach to the field of information science
- Emphasizes basic principles
- Includes a wide range of examples and applications
- Helps students develop important new skills
- Suggests exercises with solutions in an instructor's manual
Information science. --- Information theory. --- Addition. --- Algorithm. --- Alice and Bob. --- Amplitude. --- Approximation. --- Bandwidth (signal processing). --- Bibliography. --- Binary code. --- Binary number. --- Binary search tree. --- Binary tree. --- Bit. --- Block code. --- Bubble sort. --- Caesar cipher. --- Calculation. --- Channel capacity. --- Cipher. --- Ciphertext. --- Comma code. --- Commodity. --- Common knowledge (logic). --- Competition. --- Computation. --- Computer. --- Conditional entropy. --- Conditional probability. --- Consideration. --- Consumer. --- Cryptanalysis. --- Cryptogram. --- Cryptography. --- Customer. --- Data mining. --- Data structure. --- Database. --- Demand curve. --- Digital signature. --- Discounts and allowances. --- Economic surplus. --- Encryption. --- Estimation. --- Expected value. --- Fourier series. --- Fourier transform. --- Frequency analysis. --- Functional dependency. --- Heapsort. --- Huffman coding. --- Hyperplane. --- Information retrieval. --- Insertion sort. --- Instance (computer science). --- Integer. --- Inverted index. --- Key size. --- Letter frequency. --- Logarithm. --- Marginal cost. --- Measurement. --- Modulation. --- Notation. --- Nyquist–Shannon sampling theorem. --- One-time pad. --- Parity bit. --- Percentage. --- Pricing. --- Probability. --- Public-key cryptography. --- Quantity. --- Quicksort. --- Radio wave. --- Random variable. --- Ranking (information retrieval). --- Requirement. --- Result. --- Run-length encoding. --- Shift register. --- Sine wave. --- Sorting algorithm. --- Special case. --- Spectral density. --- Spreadsheet. --- Standard deviation. --- Subset. --- Substitution cipher. --- Summation. --- Technology. --- Theorem. --- Theory. --- Time complexity. --- Transmitter. --- Transposition cipher. --- Tree (data structure). --- Tuple. --- Uncertainty. --- Value (economics). --- Word (computer architecture).
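Among the topics catalogued above is Huffman coding, the classic entropy-motivated compression technique that assigns shorter bit strings to more frequent symbols. As a minimal illustrative sketch (the function name and structure here are this listing's own, not drawn from the book's text), a Huffman code table can be built with a priority queue:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table for the characters of `text`.

    Returns a dict mapping each character to its bit string.
    """
    freq = Counter(text)
    # Each heap entry is (frequency, tiebreaker, tree); a tree is either
    # a single character or a (left, right) pair of subtrees.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:
        # Degenerate case: a lone symbol still needs a 1-bit code.
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        # Repeatedly merge the two least frequent subtrees.
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        # Left branches append "0", right branches append "1".
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

For example, on the input `"aaaabbc"` the frequent symbol `a` receives a 1-bit code while the rarer `b` and `c` receive 2-bit codes, which is the length-versus-frequency trade-off the book analyzes under Entropy.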