FU02: Information Theory and Data Compression (Q3/2020)

  • Instructors

    • Professor Anh Pham (email: pham@u-aizu.ac.jp)

      • Office: 306C, office hours: by appointment.

    • S/A Professor: Guowei Lyu (not teaching this year)

  • Teaching Assistants

    • Tuan Bui (d8202103), office: 305E, office hours: by appointment.

    • Hoang Le (d8202101), office: 305E, office hours: by appointment.

    • Linh Hoang (m5232108), office: 305E, office hours: by appointment.

  • Class Meeting

    • Lectures: Mon-1/2 and Thu-1/2 at M9

    • Lab exercise: Thu-3/4 at std1 and std2

  • Course Syllabus

  • Links


Course Objectives

This course, Information Theory and Data Compression, provides the following knowledge and basic skills. Transmitting information efficiently and accurately is one of the important technical challenges in the modern digital society. Information theory, rooted in mathematical formulation, provides a theoretical solution to this problem. Its ideas make it possible to construct efficient codes for information communication and error correction by using probability and statistical theorems. Information theory plays an important role in fields such as image data compression, cryptology, network communication, and the evaluation of information quantity.



Lecture and Exercise Plan

Lecture Plan

  • Lecture 1:  Course introduction, Intro. to Information Theory

  • Lecture 2:  Errors and Error Detecting Codes

  • Lecture 3 – 4: Error Correcting Codes - Repetition and Hamming codes

  • Lecture 5 – 6: Data compression & Huffman Code (Prefix-free, Trees & Codes, Kraft Inequality, Trees with Prob., Huffman Code)

  • Lecture 7 – 8: Probability & Inference

  • Lecture 9: Entropy & Source Coding Theorem (see the numerical sketch after this list)

  • Lecture 10: Channel & Mutual information

  • Lecture 11: Channel Capacity

  • Lecture 12: Channel Coding Theorem

  • Lecture 13 – 14: Information Transmission over a Noisy Channel
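As a small numerical taste of the lecture topics above, the sketch below (in Python; the source distribution and the 10% flip rate are made-up examples, not course data) computes the Shannon entropy of a toy four-symbol source (Lecture 9) and the capacity C = 1 - H2(f) of a binary symmetric channel (Lecture 11).

    # Toy illustration (not course material): source entropy and BSC capacity.
    import math

    def entropy(probs):
        """Shannon entropy in bits: H(X) = -sum_i p_i * log2(p_i), with 0*log(0) = 0."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def bsc_capacity(f):
        """Capacity of a binary symmetric channel with flip probability f: C = 1 - H2(f)."""
        return 1.0 - entropy([f, 1.0 - f])

    if __name__ == "__main__":
        # Entropy of an example four-symbol source.
        print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol
        # Capacity of a BSC with a 10% flip probability.
        print(bsc_capacity(0.1))                    # about 0.531 bits per channel use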

Exercise Plan

  • Lab 1 (1 session): Matlab tutorial
  • Lab 2 (1 session): Image transmission over a binary symmetric channel (BSC)
  • Lab 3 (1 session): Implementation of a repetition code (R3) encoder/decoder (see the sketch after this list)
  • Lab 4 (1 session): Implementation of a Hamming code (7,4) encoder/decoder
  • Lab 5 (2 sessions): Implementation of source coding (data compression) using the Huffman code
  • Lab 6 (1 session): TBD

For the detailed schedule of lectures and exercises, check the up-to-date course syllabus.
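As a rough preview of the Lab 2 and Lab 3 ideas, the sketch below (written in Python for a self-contained illustration, although the labs themselves use Matlab) passes random bits through a binary symmetric channel and compares the bit-error rate of uncoded transmission with that of the R3 repetition code under majority-vote decoding; the flip probability and message length are illustrative values, not lab parameters.

    # Toy illustration (not lab code): R3 repetition coding over a binary symmetric channel.
    import random

    def bsc(bits, f, rng):
        """Binary symmetric channel: flip each bit independently with probability f."""
        return [b ^ (rng.random() < f) for b in bits]

    def r3_encode(bits):
        """R3 repetition code: transmit every source bit three times."""
        return [b for b in bits for _ in range(3)]

    def r3_decode(bits):
        """Majority-vote decoding of consecutive triples."""
        return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

    if __name__ == "__main__":
        rng = random.Random(0)
        f = 0.1                                  # illustrative flip probability
        src = [rng.randint(0, 1) for _ in range(10000)]

        uncoded = bsc(src, f, rng)               # transmit the raw bits
        coded = r3_decode(bsc(r3_encode(src), f, rng))

        def ber(a, b):
            return sum(x != y for x, y in zip(a, b)) / len(a)

        print("uncoded bit-error rate:", ber(src, uncoded))   # about f = 0.1
        print("R3 bit-error rate:     ", ber(src, coded))     # about 3f^2 - 2f^3 = 0.028

With f = 0.1, the uncoded error rate stays near f while the R3 rate drops to about 0.028, illustrating the error-rate reduction that repetition coding buys at the cost of a code rate of 1/3.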


Textbooks

  • Information Theory, Inference, and Learning Algorithms (English), David J. C. MacKay, Cambridge University Press, ISBN: 9780521642989. Free, downloadable from the course Gdrive.

  • A Student's Guide to Coding and Information Theory, Stefan M. Moser and Po-Ning Chen, Cambridge University Press.

    • Paperback or Kindle; the price is about 3,000 yen.


Grading Policy

  • A: Quiz: 10%

  • L: Lab assignments: 40% (15/10/15 for Lab 1/2/3)

  • E: Final exam: 50%


Anh T. Pham, 2019–2020.