Prerequisites: |
Single-variable calculus, and a solid background in linear algebra. Some familiarity with modular arithmetic might help, but is not required. We hope to eventually reach some depth in understanding finite fields, their structure and matrices/linear algebra over them. For this reason, the course has a slightly higher mathematical level than Math 5248. |
Instructor: | Victor Reiner (You can call me "Vic"). |
Office: Vincent Hall 256 | Telephone (with voice mail): (612) 625-6682 | E-mail: reiner@math.umn.edu |
|
Classes: |
Monday, Wednesday 3:35-5:00pm in Ford Hall B10 |
Office hours: | Mon, Tue, Wed at 11:15am, also Wed at 3pm, and by appointment. |
Course content: |
This is an introductory course on the mathematics of codes for communication: codes designed to achieve compression of information and error detection/correction. We intend to cover most of the text by Garrett listed below. The topics covered are similar to past offerings of the course taught by Prof. Paul Garrett, the author of our text; a useful resource is his lecture transparencies and homework solutions from his crypto page. This is not a course in cryptography, even though that subject is sometimes included in the title of the course.
|
Text: |
The Mathematics of Coding Theory: Information, Compression, Error Correction and
Finite Fields
by Paul Garrett, Prentice Hall, 2004. A (non-required) supplemental text, which has been used for part of this course in the past, is Introduction to coding and information theory by Steven Roman, Springer-Verlag, 1997; I may draw a small amount of material from it. Here is Richard Ehrenborg's parlor trick using the binary Hamming [7,4,3] code that was demonstrated on the first day of class. |
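The parlor trick relies on the fact that the Hamming [7,4,3] code corrects any single bit error. As a minimal illustrative sketch (mine, not from the course materials), one can decode using the classical observation that the XOR of the positions of the 1-bits in a valid codeword is zero:

```python
def syndrome(word):
    # XOR together the 1-based positions that hold a 1 bit;
    # for a Hamming [7,4,3] codeword this is 0, and after a single
    # bit flip it equals the (1-based) position of the flipped bit
    s = 0
    for pos, bit in enumerate(word, start=1):
        if bit:
            s ^= pos
    return s

def correct(word):
    # repair at most one flipped bit, returning a corrected copy
    pos = syndrome(word)
    word = word[:]
    if pos:
        word[pos - 1] ^= 1
    return word

codeword = [1, 0, 1, 0, 1, 0, 1]      # 1-bits at positions 1,3,5,7; 1^3^5^7 == 0
received = codeword[:]
received[4] ^= 1                      # corrupt position 5 in transit
print(correct(received) == codeword)  # True: the error is found and fixed
```

This single-error correction is exactly what makes a "guess my number, one lie allowed" trick possible: the syndrome pinpoints the lie.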
Homework and exams: | There will be 6 homework assignments, due generally every other week; see the schedule below. Late homework will not be accepted. Early homework is fine and can be left in my mailbox in the School of Math mailroom near Vincent Hall 105. Collaboration is encouraged as long as everyone collaborating understands the solution thoroughly, and you write up the solution in your own words, with a note at the top of the homework indicating with whom you've collaborated. Homework solutions should be well explained; the grader will be told not to give credit for an unsupported answer. |
Grading: |
Homework = 50% of grade. Each of 2 midterms = 15% of grade. Final exam = 20% of grade. Complaints about the grading should be brought to me. |
Policy on incompletes: | Incompletes will be given only in exceptional circumstances, where the student has completed almost the entire course with a passing grade, but something unexpected happens to prevent completion of the course. Incompletes will never be made up by taking the course again later. You must talk to me before the final exam if you think an incomplete may be warranted. |
Other expectations | This is a 4-credit course, so I would guess that the average student should spend about 8 hours per week outside of class to get a decent grade. Part of this time each week would be well-spent making a first pass through the material in the book that we anticipate to cover in class that week, so that you can bring your questions/confusions to class and ask about them. |
Assignment or Exam | Due date | Problems, mainly from Garrett's book |
---|---|---|
Homework 1 | Wed Feb. 1 |
From Garrett's text: 1.33; 2.02, 2.04, 2.05; 3.02, 3.05, 3.06 (3.07 optional). Not from text: A. Consider these three collections C1, C2, C3 of codewords: C1 = {0, 10, 110, 1110, 1111}, C2 = {0, 10, 110, 1110, 1101}, C3 = {0, 01, 011, 0111, 1111}. Indicate for each (with explanation) whether or not it is (a) uniquely decipherable, (b) instantaneous. B. Does there exist a binary code which is instantaneous and has code words with lengths (1, 2, 3, 3)? If not, prove it. If so, construct one. C. State and prove the precise conditions under which a random variable X on a finite probability space has entropy H(X) equal to zero. |
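For intuition on problems A and B (an illustrative sketch of my own, not from the text): a code is instantaneous exactly when it is prefix-free, and the Kraft inequality tells you which length sequences an instantaneous binary code can have.

```python
def is_prefix_free(words):
    # instantaneous <=> no codeword is a proper prefix of another
    return not any(a != b and b.startswith(a) for a in words for b in words)

C1 = ["0", "10", "110", "1110", "1111"]
C3 = ["0", "01", "011", "0111", "1111"]
print(is_prefix_free(C1))  # True
print(is_prefix_free(C3))  # False: "0" is a prefix of "01"

# Kraft inequality: an instantaneous binary code with codeword lengths
# l_1, ..., l_n exists iff 2**(-l_1) + ... + 2**(-l_n) <= 1
print(sum(2.0 ** -l for l in (1, 2, 3, 3)))  # 1.0, so the inequality holds
```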
|
Homework 2 | Wed Feb. 15 |
From Garrett's text: 4.01, 4.02, 4.04, 4.06 (4.07 removed from HW), 4.11, 4.12. Not from text: Let p be a probability between 0 and 1. Explain why a noisy channel with input and output alphabets both {0,1} and the following probabilities is called a useless channel: P(0 received | 0 sent) = p, P(0 received | 1 sent) = p, P(1 received | 0 sent) = 1-p, P(1 received | 1 sent) = 1-p. |
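One way to see why "useless" is deserved: the output distribution is the same no matter which symbol is sent, so the mutual information between input and output is zero. A quick numerical check (an illustrative sketch, not part of the assignment):

```python
import math

def mutual_information(p_input, channel):
    # channel[x][y] = P(Y = y | X = x); returns I(X; Y) in bits
    p_joint = {(x, y): px * channel[x][y]
               for x, px in p_input.items() for y in channel[x]}
    p_out = {}
    for (x, y), pxy in p_joint.items():
        p_out[y] = p_out.get(y, 0.0) + pxy
    return sum(pxy * math.log2(pxy / (p_input[x] * p_out[y]))
               for (x, y), pxy in p_joint.items() if pxy > 0)

p = 0.3  # any probability strictly between 0 and 1 behaves the same way
useless = {0: {0: p, 1: 1 - p}, 1: {0: p, 1: 1 - p}}
print(mutual_information({0: 0.5, 1: 0.5}, useless))  # 0.0: Y reveals nothing about X
```

By contrast, a noiseless channel (identity transition matrix) with a uniform input carries the full 1 bit per symbol.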
|
Midterm exam 1 | Wed. Feb. 22 | Midterm exam 1 in PostScript, PDF. |
Homework 3 | Wed Mar. 8 |
From Garrett's text: 5.01, 5.02, 5.03, 5.04, 5.05, 5.08; 6.01, 6.02, 6.05, 6.20, 6.22, 6.38, 6.48, 6.50, 6.53 |
|
Homework 4 | Wed Mar. 29 |
From Garrett's text: 6.30, 6.31, 6.57, 6.81; 8.17; 9.11, 9.12; 10.04, 10.08, 10.09, 10.11, 10.13; 11.11; 12.06, 12.10, 12.12, 12.14, 12.15 |
|
Midterm exam 2 | Wed. Apr. 5 | Midterm exam 2 in PostScript, PDF. |
Homework 5 | Wed Apr. 19 |
From Garrett's text: 12.01, 12.04, 12.17, 12.19, 12.20; 13.02, 13.05, 13.07, 13.09, 13.10; 14.01, 14.04 |
|
Homework 6 | Wed Apr. 26 (note: due only one week after Homework 5!) |
From Garrett's text: 11.02, 11.05, 11.07; 15.03, 15.13; 16.05 |
Final exam | Wed. May 3 | Final exam in PostScript, PDF. |