Information theory

  • School / Prep

    ENSEIRB-MATMECA

Internal code

EI5IS103

Description


Definitions and basic properties of the amount of information provided by the realization of an event and of the entropy (in Shannon's sense) of a random variable, simple conditional entropy of a random variable, mutual information of two random variables.
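As a reminder, the standard Shannon definitions behind these notions (usual notation, not taken from the course materials) are:

```latex
% Self-information of an outcome x with probability p(x)
I(x) = -\log_2 p(x)

% Entropy (in Shannon's sense) of a discrete random variable X
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Conditional entropy of X given Y
H(X \mid Y) = -\sum_{x,y} p(x,y) \log_2 p(x \mid y)

% Mutual information of two random variables X and Y
I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)
```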
Coding theory: decipherability and ambiguity, instantaneity of codes, the Sardinas-Patterson algorithm, the Kraft-McMillan necessary and sufficient condition for the existence of a decipherable code with words of given lengths.
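As an illustration of the two tools named above, here is a minimal Python sketch of the Sardinas-Patterson decipherability test and of the Kraft-McMillan sum; the function names and the examples are ours, not part of the course:

```python
def dangling_suffixes(A, B):
    """Suffixes s such that a = b + s for some a in A, b in B (a != b)."""
    return {a[len(b):] for a in A for b in B if a != b and a.startswith(b)}

def is_uniquely_decodable(code):
    """Sardinas-Patterson test: True iff no dangling-suffix set ever contains a codeword."""
    code = set(code)
    s = dangling_suffixes(code, code)          # S_1
    seen = set()
    while s and not (s & code):
        if frozenset(s) in seen:               # the sets cycle -> decipherable
            return True
        seen.add(frozenset(s))
        s = dangling_suffixes(code, s) | dangling_suffixes(s, code)
    return not (s & code)

def kraft_sum(lengths, q=2):
    """Kraft-McMillan sum; <= 1 is necessary and sufficient for a decipherable
    q-ary code with these word lengths to exist."""
    return sum(q ** (-l) for l in lengths)

print(is_uniquely_decodable({"0", "01", "11"}))   # True  (suffix code)
print(is_uniquely_decodable({"0", "1", "01"}))    # False ("01" = "0" + "1")
print(kraft_sum([1, 2, 2]))                       # 1.0 -> such a code exists
```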
Optimal coding of a random variable, links between the entropy of a random variable and the average length of the associated code words (Shannon's theorems), Huffman's algorithm.
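A minimal Python sketch of Huffman coding, showing the relation H(X) <= L < H(X) + 1 between entropy and average code length that Shannon's theorems establish (the source and its probabilities are an arbitrary example, not course data):

```python
import heapq
from math import log2

def huffman_code(probs):
    """Binary Huffman code for a dict {symbol: probability}; returns {symbol: codeword}."""
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    code = {s: "" for s in probs}
    heapq.heapify(heap)
    counter = len(heap)                          # tie-breaker so tuples stay comparable
    while len(heap) > 1:
        p0, _, syms0 = heapq.heappop(heap)       # two least probable groups
        p1, _, syms1 = heapq.heappop(heap)
        for s in syms0:
            code[s] = "0" + code[s]
        for s in syms1:
            code[s] = "1" + code[s]
        heapq.heappush(heap, (p0 + p1, counter, syms0 + syms1))
        counter += 1
    return code

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
H = -sum(p * log2(p) for p in probs.values())          # entropy of the source
L = sum(p * len(code[s]) for s, p in probs.items())    # average codeword length
print(code, f"H = {H:.3f}", f"L = {L:.3f}")            # dyadic probabilities: H = L = 1.75
```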
Transmission of information through noisy memoryless channels, some important channels, channel capacity, calculation of capacity in simple cases, the decoding problem, uniform-error-bound decoding schemes, Shannon's fundamental theorem: correct transmission is possible with probability arbitrarily close to 1 at any rate below capacity (stated without proof).
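The binary symmetric channel is one of the "simple cases" where capacity has a closed form, C = 1 - H2(p); a short Python illustration (our own example, not course material):

```python
from math import log2

def h2(p):
    """Binary entropy function H2(p), in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p:
    C = 1 - H2(p) bits per channel use."""
    return 1.0 - h2(p)

for p in (0.0, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.3f} bit/use")
# Shannon's theorem: any rate R < C is achievable with error probability as small as desired.
```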

Teaching hours

  • CM (lectures): 16h
  • TI (individual work): 12h

Mandatory prerequisites

Discrete probability

Syllabus

1. General information theory
2. Coding theory
3. Optimal codes, Huffman's algorithm
4. Introduction to noisy channels, capacity

Bibliography

Authorized documents

Assessment of knowledge

Initial assessment / Main session - Tests

Type of assessment          | Type of test       | Duration (min) | Number of tests | Test coefficient | Eliminatory mark | Remarks
Integral continuous control | Continuous control |                |                 | 1                |                  |