Objectives |
To expose students to basic concepts in information theory and to the
performance characteristics of an ideal communication system.
To expose students to the fundamentals of coding and its applications.
|
References/Textbooks |
C.E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, Vol. 27, July and Oct. 1948.
S. Haykin, Communication Systems, 4th ed., John Wiley & Sons, 2001.
S. Verdu and S.W. McLaughlin, Information Theory: 50 Years of Discovery, IEEE Press and John Wiley & Sons.
S. Haykin, Digital Communications, John Wiley & Sons, 1988.
S. Haykin, Communication Systems, 3rd ed., John Wiley & Sons.
H. Taub and D. Schilling, Principles of Communication Systems, McGraw-Hill.
W.W. Peterson and E.J. Weldon, Error-Correcting Codes, MIT Press, 1972.
N. Abramson, Information Theory and Coding, McGraw-Hill, 1963.
R.E. Ziemer and W.H. Tranter, Principles of Communications: Systems, Modulation, and Noise, 4th ed., John Wiley & Sons, 1994.
J.G. Proakis and M. Salehi, Communication Systems Engineering, Prentice Hall, 1994 (ISBN 0-13-158932-6).
B.P. Lathi, Modern Digital and Analog Communication Systems, Oxford University Press, 1998 (ISBN 0-19-511009-9).
J.D. Gibson, Principles of Digital and Analog Communications, Macmillan, 1993 (ISBN 0-02-341860-5).
L.W. Couch, Digital and Analog Communication Systems, Prentice Hall, 1997.
R.B. Wells, Applied Coding and Information Theory for Engineers, Prentice Hall, 1999.
T.M. Cover and J.A. Thomas, Elements of Information Theory, John Wiley & Sons, 1991.
J.C.A. van der Lubbe, Information Theory, Cambridge University Press, 1997.
C. Schlegel, Trellis Coding, IEEE Press, 1997 (ISBN 0-7803-1052-7).
|
Contents |
Chapter 1 Information Sources and Source Coding
- Logarithmic measure of information; self-information and average
information. Entropy, information rate, discrete sources, extensions of
a discrete source, Shannon's source coding theorem. Markov sources.
Joint and conditional entropy.
- Source coding theorem and algorithms: the Kraft inequality, prefix
codes, Huffman codes, Lempel-Ziv codes; rate distortion theory. Scalar
and vector quantization, waveform coding. (A short worked sketch follows
this chapter's topics.)
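
A minimal Python sketch (illustrative only, not course material) of two
topics above: the entropy of a discrete memoryless source and a Huffman
code built for it. The four-symbol alphabet and its probabilities are
hypothetical.

import heapq
import math

# Hypothetical four-symbol source; dyadic probabilities chosen so the
# Huffman code meets the entropy bound exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Entropy of a discrete memoryless source: H = -sum p log2 p (bits/symbol).
H = -sum(p * math.log2(p) for p in probs.values())

# Huffman coding: repeatedly merge the two least probable subtrees,
# prefixing "0" to one side's codewords and "1" to the other's.
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)  # tie-breaker so tuples never compare the dicts
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + c for s, c in c1.items()}
    merged.update({s: "1" + c for s, c in c2.items()})
    heapq.heappush(heap, (p1 + p2, tie, merged))
    tie += 1
codes = heap[0][2]

avg_len = sum(probs[s] * len(codes[s]) for s in probs)
print(f"H = {H:.3f} bits/symbol, average codeword length = {avg_len:.3f}")

Here both numbers come out to 1.750 bits/symbol; for non-dyadic
probabilities the Huffman average length lies within one bit of the entropy.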
|
Chapter 2 Channel Capacity and Coding
- Discrete channels; a priori and a posteriori entropies; equivocation;
mutual information; noiseless and deterministic channels; channel
capacity; Shannon's channel coding theorem; the bandwidth vs. S/N
trade-off.
- The channel capacity theorem. Continuous information sources; maximum
relative entropy. (A short capacity-calculation sketch follows.)
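
A minimal sketch, assuming two standard results covered in this chapter:
the capacity of a binary symmetric channel, C = 1 - H(p), and the
Shannon-Hartley capacity of a band-limited AWGN channel,
C = B log2(1 + S/N). The example numbers are hypothetical.

import math

def h2(p: float) -> float:
    # Binary entropy function in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    # Binary symmetric channel with crossover probability p: C = 1 - H(p).
    return 1.0 - h2(p)

def awgn_capacity(bandwidth_hz: float, snr: float) -> float:
    # Shannon-Hartley theorem: C = B log2(1 + S/N), S/N in linear units.
    return bandwidth_hz * math.log2(1.0 + snr)

print(bsc_capacity(0.11))           # about 0.5 bit per channel use
print(awgn_capacity(3000, 1000.0))  # about 29.9 kbit/s (3 kHz, 30 dB)

The second call also illustrates the bandwidth vs. S/N trade-off: capacity
grows linearly in B but only logarithmically in S/N.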
|
Chapter 3 Linear Block and Cyclic Error-Correction Coding
- Model of a digital communication system employing coding. Algebraic
coding theory. Definitions of terms: redundancy, code efficiency,
systematic codes, Hamming distance, Hamming weight, the Hamming bound.
- Types of codes: parity-check codes, Hamming codes, BCH codes,
maximum-length (pseudo-random) codes, Reed-Solomon codes, concatenated
codes. Linear block codes: generator and parity-check matrices, syndrome
decoding. Cyclic codes: generation and detection.
- Coding for reliable communication: coding gain, bandwidth expansion
ratio. Comparison of coded and uncoded systems. (A Hamming-code sketch
follows this chapter's topics.)
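
A minimal sketch (not course code) of a systematic (7,4) Hamming code:
encoding with a generator matrix G = [I | P] and single-error correction
by syndrome decoding with the parity-check matrix H = [P^T | I]. The
particular P used is one common choice; all arithmetic is over GF(2).

import numpy as np

P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix, 4 x 7
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix, 3 x 7

def encode(msg):
    return (msg @ G) % 2

def decode(received):
    syndrome = (H @ received) % 2
    if syndrome.any():
        # For a single error, the syndrome equals the column of H at
        # the error position; find it and flip that bit.
        for pos in range(7):
            if np.array_equal(H[:, pos], syndrome):
                received = received.copy()
                received[pos] ^= 1
                break
    return received[:4]  # systematic code: message bits come first

msg = np.array([1, 0, 1, 1])
cw = encode(msg)
cw[2] ^= 1            # introduce a single channel error
print(decode(cw))     # recovers [1 0 1 1]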
|
Chapter 4 Convolutional Codes
- Burst-error detecting and correcting codes. Convolutional codes:
time-domain and frequency-domain approaches; code tree, trellis, and
state diagram. Decoding of convolutional codes: the Viterbi algorithm,
sequential decoding.
- Transfer function and distance properties of convolutional codes.
Bounds on the bit error rate. Coding gain. (An encoder and
Viterbi-decoder sketch follows.)
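
A minimal sketch (assumed parameters, not from the syllabus) of a
rate-1/2, constraint-length-3 convolutional encoder with generators 7
and 5 (octal) and hard-decision Viterbi decoding over its four-state
trellis.

G = [0b111, 0b101]  # generator polynomials (7, 5 in octal)

def encode(bits):
    state = 0
    out = []
    for b in bits:
        reg = (b << 2) | state  # shift register: [u(t), u(t-1), u(t-2)]
        out += [bin(reg & g).count("1") % 2 for g in G]
        state = reg >> 1
    return out

def viterbi(received):
    # Path metric (Hamming distance) and survivor path per state.
    metrics = {0: 0}
    paths = {0: []}
    for t in range(len(received) // 2):
        r = received[2 * t: 2 * t + 2]
        new_metrics, new_paths = {}, {}
        for state, metric in metrics.items():
            for b in (0, 1):
                reg = (b << 2) | state
                expected = [bin(reg & g).count("1") % 2 for g in G]
                nxt = reg >> 1
                m = metric + sum(x != y for x, y in zip(expected, r))
                if nxt not in new_metrics or m < new_metrics[nxt]:
                    new_metrics[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[min(metrics, key=metrics.get)]

msg = [1, 0, 1, 1, 0, 0]  # includes two tail zeros to flush the encoder
coded = encode(msg)
coded[3] ^= 1             # introduce a single channel error
print(viterbi(coded))     # recovers [1, 0, 1, 1, 0, 0]

This code has free distance 5, so the trellis search corrects the single
error; the survivor-path bookkeeping is the core of Viterbi's algorithm.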
|
Chapter 5 Applications of Coding
Coding for bandwidth-constrained channels: combined coding and
modulation, trellis-coded modulation (TCM), decoding of TCM codes. (A
set-partitioning sketch follows this chapter's topics.)
Coding for the white Gaussian noise channel. Coding for compound-error
channels; coding for error control in data storage.
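
A minimal sketch (illustrative assumption, not course material) of the
set-partitioning idea behind trellis-coded modulation: an 8-PSK
constellation is split into subsets whose minimum intra-subset Euclidean
distance grows at each level, so coded bits can select a subset while
uncoded bits choose a point within it.

import cmath
import itertools

# Unit-energy 8-PSK constellation points.
points = [cmath.exp(2j * cmath.pi * k / 8) for k in range(8)]

def min_distance(subset):
    # Smallest Euclidean distance between any two points in the subset.
    return min(abs(points[a] - points[b])
               for a, b in itertools.combinations(subset, 2))

# Level 0: full constellation; level 1: split on one bit; level 2: split
# again, leaving four antipodal pairs.
level0 = [list(range(8))]
level1 = [[k for k in range(8) if k % 2 == b] for b in (0, 1)]
level2 = [[k for k in range(8) if k % 4 == b] for b in range(4)]

for name, partition in [("level 0", level0), ("level 1", level1),
                        ("level 2", level2)]:
    print(name, [round(min_distance(s), 3) for s in partition])

# The minimum distance grows from 2 sin(pi/8) ~ 0.765 to sqrt(2) to 2.0,
# which is the distance gain a TCM scheme exploits.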
|
Laboratory |
|