School / Prep
ENSEIRB-MATMECA
Internal code
EI5IS103
Description
Definitions and basic properties of the amount of information provided by the realization of an event and of the entropy (in Shannon's sense) of a random variable, conditional entropy of a random variable, mutual information of two random variables.
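A minimal illustration of these definitions: the Python sketch below computes entropies and a mutual information from a small joint distribution (the distribution itself is only an example, not taken from the course).

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint law given as {(x, y): p}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Example: a fair bit observed through a channel that flips it 10% of the time.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(mutual_information(joint))  # about 0.531 bits
```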
Coding theory, unique decodability and ambiguity, instantaneous (prefix) codes, the Sardinas–Patterson algorithm, the Kraft–McMillan necessary and sufficient condition for the existence of a uniquely decodable code with words of given lengths.
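As a sketch of these two tools (with example codes chosen purely for illustration), the following Python implements the Sardinas–Patterson test and the Kraft–McMillan inequality check:

```python
def dangling_suffixes(a, b):
    """Nonempty suffixes w such that some word of a, extended by w, is a word of b."""
    return {y[len(x):] for x in a for y in b if y.startswith(x) and len(y) > len(x)}

def is_uniquely_decodable(code):
    """Sardinas-Patterson test: uniquely decodable iff no dangling-suffix
    set ever contains a codeword."""
    code = set(code)
    s = dangling_suffixes(code, code)
    seen = set()
    while s and not (s & code):
        if s <= seen:            # no new suffixes: the process has stabilised
            return True
        seen |= s
        s = dangling_suffixes(code, s) | dangling_suffixes(s, code)
    return not s                 # empty: decodable; otherwise a codeword appeared

def kraft_mcmillan_holds(lengths, q=2):
    """sum q**(-l) <= 1: necessary and sufficient for a uniquely decodable
    q-ary code with the given word lengths to exist."""
    return sum(q ** -l for l in lengths) <= 1

print(is_uniquely_decodable({"0", "01", "11"}))  # True (a suffix code)
print(is_uniquely_decodable({"0", "01", "10"}))  # False ("010" has two parsings)
print(kraft_mcmillan_holds([1, 2, 2]))           # True
```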
Coding optimality (for a random variable), links between the entropy of a random variable and the average length of the associated code words (Shannon's source coding theorems), Huffman algorithm.
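A compact sketch of the Huffman construction in Python (the source distribution below is an invented example); the resulting average code length lies between the entropy and the entropy plus one bit:

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code from a {symbol: probability} dict."""
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)       # two least probable subtrees
        p2, i, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, i, merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(sum(p * len(code[s]) for s, p in probs.items()))  # 1.75 bits = entropy here
```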
Transmission of information through memoryless noisy channels, some important channels, channel capacity, computation of the capacity in simple cases, the decoding problem, uniform-error-bound decoding schemes, Shannon's fundamental theorem on the possibility of correct transmission with probability arbitrarily close to 1 at any rate below capacity (stated without proof).
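For one of the simple cases, the binary symmetric channel, the capacity has the closed form C = 1 - H2(p); a short Python sketch:

```python
import math

def binary_entropy(p):
    """H2(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p,
    in bits per channel use; it is achieved by a uniform input distribution."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.1))  # about 0.531: rates below this are achievable with
                          # arbitrarily small error probability (Shannon)
print(bsc_capacity(0.5))  # 0.0: the output is independent of the input
```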
Teaching hours
- CM (Lectures): 16h
- TI (Individual work): 12h
Mandatory prerequisites
Discrete probability
Syllabus
1. General information theory
2. Coding theory
3. Optimal codes, Huffman's algorithm
4. Introduction to noisy channels, capacity
Bibliography
Authorized documents
Assessment of knowledge
Initial assessment / Main session - Tests
Type of assessment | Type of test | Duration (in minutes) | Number of tests | Test coefficient | Eliminatory mark in the test | Remarks |
---|---|---|---|---|---|---|
Full continuous assessment | Continuous assessment | 1 | | | | |