Syllabus
1.2 Timeline of information theory
1.3 Information, Message and Signal
2.1 Self-information & conditional self-information
2.2 Jensen's inequality
2.3 Entropy
2.4 Joint entropy & conditional entropy
2.5 Relative entropy & Mutual information
2.6 Relationship between entropy & Mutual information
2.7 Chain rules for entropy, relative entropy & mutual information
2.8 Data processing inequality
3.1 The Asymptotic Equipartition Property
3.2 Consequences of the AEP: Data compression
3.3 High probability sets and the typical set
4.1 Markov chains
4.2 Entropy Rate
4.3 Example
4.4 Hidden Markov models
5.1 Examples of codes
5.2 Kraft inequality
5.3 Optimal codes
5.4 Bounds on the optimal code length
5.5 Huffman codes
5.6 Optimality of Huffman codes
5.7 Shannon codes
5.8 Shannon-Fano-Elias coding
6.1 History of Communication Systems
6.2 Communication Systems
6.3 Channel Capacity
6.4 Properties of Channel Capacity
6.5 Communication Systems Revisited
6.6 Definitions
6.7 Channel Coding Theorem
6.8 Demonstration of Channel Coding
6.9 Examples
7.1 Definitions
7.2 The AEP for continuous random variables
7.3 Relation of differential entropy to discrete entropy
7.4 Joint and conditional differential entropy
7.5 Relative entropy and mutual information
7.6 Properties of differential entropy, relative entropy and mutual information
8.1 The Gaussian Channel: definitions
8.2 Converse to the Coding Theorem for Gaussian Channels
8.3 Band-limited Channels
8.4 Parallel Gaussian Channels
8.5 Channels with Colored Gaussian Noise
8.6 Gaussian Channels with Feedback
9.1 Maximum Entropy Distributions
9.2 Examples
9.3 Anomalous Maximum Entropy
9.4 Spectrum Estimation
9.5 Entropy Rates of a Gaussian Process
9.6 Burg's Maximum Entropy Theorem
10.1 Distortion Definitions
10.2 Rate distortion function
10.3 Rate distortion theorem and the converse
10.4 Calculation of the rate distortion function
11.1 Gaussian Multiple User Channel
11.2 Jointly Typical Sequences
11.3 The Multiple Access Channel
11.4 The Broadcast Channel
11.5 The Relay Channel