MSc:AnalyticalGeometryAndLinearAlgebraI

From IU

Analytical Geometry and Linear Algebra

  • Course name: Information Theory
  • Course number: 224

Course Characteristics

Key concepts of the class

  • The subject and importance of Linear Algebra and Algebraic Geometry.
  • Linear vector spaces.
  • Cartesian coordinates and the 3D linear vector space.
  • The dot product and the cross product of vectors, and their geometric meaning.
  • Linear transformations of coordinates and vectors. Matrices, matrix–vector and matrix–matrix operations.
  • 3D coordinate transformations in physics and mechanics.
  • Solving systems of linear equations. Determinants. The solution spaces.
  • Equations for Planes and Lines in the 3D space. Geometrical problems on

What is the purpose of this course?

The main purpose of this course is two-fold:

  • to provide the students with a general understanding of the central role Information Theory plays in Computer Science, and of its important influence on Mathematics (in particular, Mathematical Statistics), Physics and Quantitative Finance;
  • to provide the students with hands-on experience in the application of Information Theory methods in areas such as Data Compression and Cryptography; the latter is particularly important for those students who do not systematically study Cryptography as part of any other Computer Science module.


Course objectives based on Bloom’s taxonomy

- What should a student remember at the end of the course?

By the end of the course, the students should be able to:

  • Remember the notion and the fundamental role of the Entropy in Mathematics (Probability Theory), Computer Science, Physics and other quantitative disciplines.
  • Know the Foundations of Coding Theory: Symbols, messages, information content of natural languages, estimation of information transmission time via radio frequencies.
  • Possess the notion, and understand the importance, of Prefix-free codes and the Kraft-McMillan inequality.
  • Know the principles of the classical Lossless Data Compression algorithms: Shannon-Fano, Huffman and Lempel-Ziv.
  • Know the principles and applications of Lossy Data Compression Algorithms, in particular for images, audio and video streams.
  • Know the principles and applications of Error Detection and Correction Codes.
  • Possess the fundamental notions of Cryptography and Cryptanalysis.

- What should a student be able to understand at the end of the course?

By the end of the course, the students should be able to:

  • Understand the Kelly criterion and apply it to Risk Management problems under uncertainty, in particular in Quantitative Finance.
  • Understand the fundamental connections between Entropy, Expected Code Length and the Data Compression Rate.
  • Understand the importance, principles and implementations of the Fast Fourier Transform algorithms, in particular for lossy data compression, and apply them in various problems of Applied Mathematics and Software Engineering. Be aware of similar modern algorithms such as Wavelet Transform.
  • Understand the role of Error Detection and Error Correction Codes in computer hardware, communications and other areas. Understand the design and implementation of classical linear Error Correction Codes. Understand and apply the Hashing Functions as non-linear Error Detection Codes.
  • Understand the foundations of the Pre-Shared (Symmetric) Key cryptography: Feistel framework, DES algorithm and its analogues (Russian GOST).
  • Understand the hierarchy of Algebraic Structures, in particular Groups, Rings and Fields, and their connections to Cryptography and to the AES algorithm in particular.

- What should a student be able to apply at the end of the course?

By the end of the course, the students should be able to:

  • Apply the Maximum Entropy principle to estimate the unknown probability distributions in Probability Theory and related disciplines.
  • Apply the method of Lagrangean Multipliers in various optimization problems of Applied Mathematics, including both analytical and numerical implementations of this method.
  • Apply the modern industrial-strength data compression algorithms in various Software Development problems.
  • Develop a working knowledge of the Extended GCD Algorithm and algebraic operations in Finite Fields (Zp and Galois Fields); apply them in software development problems.

Course evaluation

Course grade breakdown (proposed points)
Assignments and Labs: 30
Interim performance assessment: 20
Final Exam: 50

Grades range

Course grading range (proposed range)
A. Excellent: 85–100
B. Good: 65–84
C. Satisfactory: 50–64
D. Poor: 0–49

Resources and reference material

  • D. Applebaum, Probability and Information: An Integrated Approach, 2nd ed., Cambridge, 2007.
  • D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge, 2005.
  • T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Wiley, 2006.
  • W. Stallings, Cryptography and Network Security: Principles and Practice, 7th ed., Pearson, 2017.

Course Sections

The main sections of the course are as follows:

Course Sections
Section Section Title
1 Introduction to Information Theory. Entropy.
2 The Maximum Entropy Principle, Lagrangean Optimization and the Kelly Criterion
3 Foundations of Coding Theory
4 Lossless Data Compression Methods
5 Lossy Data Compression Algorithms and Fast Fourier Transform
6 Introduction to Error-Detecting and Error-Correcting Codes (EDC/ECC)
7 Foundations of Symmetric (Pre-Shared) and Public-Key Cryptography

Section 1

Section title:

Introduction to Information Theory. Entropy.

Topics covered in this section:

  • What Information Theory is about and why it is important.
  • The fundamental notion of Entropy and its connections to Mathematics, Physics, Economics and Social Sciences.
  • Discrete probability distributions and the mathematical definition of the Entropy.

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: No
Midterm evaluation: Yes
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: Yes
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. Why do we need to study Information Theory in the first place?
  2. What is entropy in the informal sense?
  3. Provide the mathematical definition of the entropy of a discrete random variable.
  4. Explain the connections between the mathematical and physical definitions of entropy.

Typical questions for seminar classes (labs) within this section

  1. Compute the entropy for a given discrete distribution.
  2. Compute the limits of the entropy when one of the probabilities tends to 0.
  3. Develop a program which computes, for a given file, the entropy of the data stored in it.
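Lab question 3 can be sketched in Python as follows; the function name and the file-reading snippet are illustrative, not a prescribed solution:

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy, in bits per byte, of the empirical byte distribution."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

# For a file: byte_entropy(open("some_file.bin", "rb").read())
# A uniform distribution over all 256 byte values gives the maximum, 8 bits/byte:
print(byte_entropy(bytes(range(256))))  # 8.0
```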

Test questions for final assessment in this section

  1. Compute the maximum and minimum entropy for a distribution with not-fully-known probabilities.
  2. Under which conditions, in general, is the maximum entropy achieved, and what does this mean from the “physical” point of view?

Section 2

Section title:

The Maximum Entropy Principle, Lagrangean Optimization and the Kelly Criterion.

Topics covered in this section:

  • Foundations of the Maximum Entropy Principle: its idea, purpose, rationale and implementation.
  • The Method of Lagrangean Multipliers, and how to use it in conjunction with the Maximum Entropy Principle.
  • The Jensen inequality.
  • The Kelly Criterion in Financial Asset Management: Another example of a non-trivial optimization setup.

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: Yes
Homework and group projects: No
Midterm evaluation: Yes
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: Yes
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. What is the objective and rationale of the Maximum Entropy Principle?
  2. Why is the Maximum Entropy Principle closely connected with the method of Lagrangean Multipliers?
  3. What is the meaning and the applications of the Jensen inequality?
  4. What is the objective, and the main idea, of the Kelly Criterion?

Typical questions for seminar classes (labs) within this section

  1. Formulate and elaborate the Maximum Entropy Principle for a given problem of finding a probability distribution.
  2. Solve a given constrained optimization problem (the students must figure out the method of Lagrangean Multipliers).
  3. Elaborate a numerical implementation of the Lagrangean Optimization Method in cases where there is no closed-form analytical solution.
  4. Provide the definitions of convex and concave functions in the differentiable and non-differentiable cases.
  5. Why is maximization of the expected investment outcome not necessarily the best idea in Financial Asset Management?
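Seminar question 3 (a numerical Lagrangean implementation) admits a compact sketch: for a maximum-entropy distribution with a prescribed mean, the stationarity condition gives the Gibbs form p_i ∝ exp(−λx_i), and the multiplier λ can be found by bisection. Function names and the multiplier bracket are illustrative assumptions:

```python
import math

def max_entropy_dist(values, target_mean):
    """Maximum-entropy distribution over `values` with a prescribed mean.

    Lagrangean stationarity gives p_i ∝ exp(-lam * x_i); the multiplier lam
    is found by bisection, since the resulting mean decreases in lam.
    """
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        return sum(x * wi for x, wi in zip(values, w)) / sum(w)

    lo, hi = -50.0, 50.0            # assumed bracket for the multiplier
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# When the target mean equals the plain average, lam = 0: the answer is uniform.
p = max_entropy_dist([0, 1, 2, 3], 1.5)
```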

Test questions for final assessment in this section

  1. Solve a given problem in estimating an unknown discrete probability distribution (Hint: apply the Maximum Entropy Principle).
  2. Using the Jensen inequality, estimate the expected value of a given function of a random variable.
  3. Construct a money-management strategy using the Kelly Criterion.
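A money-management strategy per question 3 starts from the Kelly fraction for a binary bet; this sketch assumes the standard formulation with net odds b:

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly-optimal fraction of the bankroll for a binary bet.

    p: probability of winning; b: net odds (gain b per unit staked on a win).
    f* = (b*p - (1 - p)) / b; a non-positive value means the bet has no edge.
    """
    return (b * p - (1.0 - p)) / b

# A 60% chance at even odds (b = 1) gives f* ≈ 0.2: stake 20% of the bankroll.
f = kelly_fraction(0.6, 1.0)
```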

Section 3

Section title:

Foundations of Coding Theory.

Topics covered in this section:

  • Symbols and messages, message transmission over communication channels, encoding and decoding basics.
  • Information content of natural languages.
  • Estimation of information transmission time (via radio frequencies).
  • Uniquely-decodable codes and Prefix (postfix)-free codes.
  • Kraft-McMillan inequality.

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: No
Midterm evaluation: Yes
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: No
Discussions: No


Typical questions for ongoing performance evaluation within this section

  1. What are Symbols and Messages from the Information Theory point of view?
  2. Explain the probabilistic setup of transmitting messages.
  3. Explain how the transmission time of binary messages can be estimated from the radio-wave frequency.
  4. Explain the notion of prefix-free (resp. postfix-free) codes, and why they are important.
  5. Explain the meaning of the Kraft-McMillan inequality.

Typical questions for seminar classes (labs) within this section

  1. Verify that a given code is prefix-free.
  2. For a binary code with given message lengths, verify whether it can be made prefix-free.
  3. Construct a prefix-free binary code for a given set of messages.
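Seminar questions 1 and 2 can be automated with two small helpers (a sketch; names are illustrative):

```python
def is_prefix_free(codewords):
    """True iff no codeword is a prefix of another (checks sorted neighbours)."""
    words = sorted(codewords)
    return all(not nxt.startswith(cur) for cur, nxt in zip(words, words[1:]))

def kraft_sum(lengths, r=2):
    """Kraft-McMillan sum: a prefix-free r-ary code with these codeword
    lengths exists iff the sum is at most 1."""
    return sum(r ** (-l) for l in lengths)

# The canonical code {0, 10, 110, 111} is prefix-free and meets the bound with equality.
codes = ["0", "10", "110", "111"]
```

Sorting works because any prefix of a codeword sorts immediately before some word extending it, so checking adjacent pairs suffices.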

Test questions for final assessment in this section

Similar to seminar questions outlined above.

Section 4

Section title:

Lossless Data Compression Methods.

Topics covered in this section:

  • Connection between Entropy, Expected Code Length and the Data Compression Rate. The Shannon Theorem and the Gibbs Inequality.
  • Huffman, Shannon-Fano, Arithmetic and Block codes.
  • Lempel-Ziv compression algorithm (LZ78).

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: Yes
Homework and group projects: No
Midterm evaluation: Yes
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: No
Discussions: No


Typical questions for ongoing performance evaluation within this section

  1. Explain the connection between the source entropy and the expected codeword length.
  2. Explain the principles of Huffman, Shannon-Fano, Arithmetic and Block codes.
  3. Formulate the LZ78 algorithm.

Typical questions for seminar classes (labs) within this section

  1. Estimate the expected codeword length for a given information source.
  2. For a given message source, construct the Huffman encoding.
  3. In a similar setup, construct the Shannon-Fano encoding.
  4. Implement / elaborate the LZ78 algorithm.
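Question 2 (Huffman encoding) in a compact sketch, using a heap of partial trees; the dictionary-based tree representation is one implementation convenience among several:

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bitstring) from a {symbol: weight}
    dict; assumes at least two symbols."""
    # Heap of (total weight, tie-breaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, lo = heapq.heappop(heap)  # two lightest subtrees
        w2, _, hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo.items()}
        merged.update({s: "1" + c for s, c in hi.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# Dyadic weights reproduce the entropy exactly: lengths 1, 2, 3, 3 bits.
code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
```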

Test questions for final assessment in this section

Similar to seminar (lab) questions above, but include elaboration of data compression algorithms on paper rather than computer implementations.

Section 5

Section title:

Lossy Data Compression Algorithms and Fast Fourier Transform.

Topics covered in this section:

  • When lossy compression is applicable: Images, Audio and Video streams.
  • Example: compression of 2D images.
  • The Fourier Transform and its applications.
  • The Fast Fourier Transform algorithms. The Cooley-Tukey FFT algorithm.

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: Yes
Homework and group projects: No
Midterm evaluation: No
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: No
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. When would lossy compression algorithms be applicable?
  2. Explain the principles of lossy image, audio and video compression.
  3. How can the Fourier Transform be used for (lossy) data compression?
  4. Explain the rationale behind the Fast Fourier Transform algorithms.

Typical questions for seminar classes (labs) within this section

  1. Implement the Cooley-Tukey FFT algorithm in software.
  2. Apply the FFT method to compression and decompression of image files.
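Question 1 can be sketched with the textbook recursive radix-2 Cooley-Tukey algorithm (the input length is assumed to be a power of two):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even = fft(x[0::2])   # DFT of even-indexed samples
    odd = fft(x[1::2])    # DFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

# A unit impulse has a flat spectrum: fft([1, 0, 0, 0]) is four ones.
```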

Test questions for final assessment in this section

  1. Provide the formulas of the direct and inverse Fourier Transform in one and multiple dimensions.
  2. Elaborate the Cooley-Tukey FFT algorithm.

Section 6

Section title:

Introduction to Error-Detecting and Error-Correcting Codes (EDC/ECC).

Topics covered in this section:

  • The motivation behind EDC/ECC.
  • Examples of hardware and software implementations of EDC/ECC.
  • Hamming Codes.
  • Hashing Functions as non-linear EDC.

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: No
Midterm evaluation: No
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: No
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. Provide a list of hardware implementations of ECC/EDC in commodity off-the-shelf computers and communication devices.
  2. Describe the principles of Hamming codes construction.

Typical questions for seminar classes (labs) within this section

  1. Specify the encoding and decoding algorithms of the Hamming code (7,4) in the matrix form, including the error detection and correction operations. How many errors can it detect and correct?
  2. Specify the desirable properties of a hashing function.
  3. Provide a list of commonly-used “strong” hashing functions.
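Question 1, in matrix form: a sketch of the systematic Hamming (7,4) code over GF(2). The particular generator and parity-check matrices below are one standard choice among several equivalent ones:

```python
# Systematic Hamming (7,4): generator G = [I | P], parity check H = [P^T | I].
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(data):
    """Encode 4 data bits into a 7-bit codeword: c = d.G (mod 2)."""
    return [sum(d * g for d, g in zip(data, col)) % 2 for col in zip(*G)]

def correct(word):
    """Compute the syndrome s = H.c (mod 2); a nonzero syndrome equals the
    column of H at the error position, so flip that bit."""
    s = tuple(sum(h * c for h, c in zip(row, word)) % 2 for row in H)
    if any(s):
        word[[tuple(col) for col in zip(*H)].index(s)] ^= 1
    return word
```

The code corrects any single-bit error and detects up to two.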

Test questions for final assessment in this section

  • Specify the encoding and decoding algorithms of the general Hamming code of order r. What are the admissible values of the code parameters n and k? How many errors can it detect and correct?
  • Suppose that we are building a hardware data integrity layer for an SSD (solid state drive) of 1 terabyte capacity, which must be able to preserve the data integrity even if an entire disk block (2 kbytes) is lost anywhere in SSD. How much data redundancy would this mechanism require? Which ECC/EDC would you use to that end?

Section 7

Section title:

Foundations of Symmetric (Pre-Shared) and Public-Key Cryptography.

Topics covered in this section:

  • Introduction to Cryptography and Cryptanalysis. A historical perspective: code books and Enigma.
  • Foundations of the Pre-Shared (Symmetric) Key cryptography. The Feistel framework.
  • The DES algorithm. The S-Boxes as a non-linear core of DES. Weaknesses of DES. Extensions and analogues of DES (3DES, Russian GOST).
  • A recap of Higher Algebra: Groups, Rings and Fields.
  • Finite Fields: Zp and Galois Fields GF(p^n).
  • Galois Fields in Cryptography: the modern AES algorithm.
  • Introduction to public-key cryptography: the RSA framework.
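The Galois-field arithmetic underlying AES can be illustrated with multiplication in GF(2^8) modulo the AES polynomial x^8 + x^4 + x^3 + x + 1; this is a sketch of the field operation only, not of AES itself:

```python
def gf256_mul(a: int, b: int) -> int:
    """Peasant multiplication in GF(2^8), reducing by the AES polynomial 0x11B."""
    result = 0
    while b:
        if b & 1:
            result ^= a      # add (XOR) the current multiple of a
        b >>= 1
        a <<= 1
        if a & 0x100:        # degree-8 overflow: reduce mod x^8+x^4+x^3+x+1
            a ^= 0x11B
    return result

# 0x53 and 0xCA are multiplicative inverses in the AES field:
assert gf256_mul(0x53, 0xCA) == 0x01
```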

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: Yes
Homework and group projects: No
Midterm evaluation: No
Testing (written or computer based): No
Reports: No
Essays: No
Oral polls: No
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. Explain the difference between the terms “encoding” and “encryption”.
  2. Explain the typical cryptanalysis techniques from the historical perspective.
  3. What are the desirable properties of an encryption algorithm from the Entropy point of view?
  4. Is the non-linearity property essential for an encryption algorithm? Why or why not? How is this issue addressed in DES?

Typical questions for seminar classes (labs) within this section

  1. Describe the DES algorithm in detail.
  2. Explain the weaknesses of DES, and how they can be rectified.
  3. Provide the definitions and typical examples of Groups, Rings and Fields.
  4. For which orders q does a finite field of order q exist?

Test questions for final assessment in this section

  1. Construct the addition and multiplication tables for a given field.
  2. Construct the addition and multiplication tables for a given Galois field.
  3. Explain the details of the AES encryption algorithm.
  4. Explain the principles of the RSA public-key cryptography framework.
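Question 4, combined with the Extended GCD Algorithm listed in the course objectives, fits in a short sketch; the tiny primes are for illustration only and are completely insecure:

```python
def ext_gcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = ext_gcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a mod m via the Extended GCD."""
    g, x, _ = ext_gcd(a, m)
    if g != 1:
        raise ValueError("a is not invertible modulo m")
    return x % m

# Toy RSA (insecure, illustration only): p = 61, q = 53.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17                       # public exponent, coprime with phi
d = modinv(e, phi)           # private exponent
cipher = pow(42, e, n)       # encrypt the message m = 42
message = pow(cipher, d, n)  # decrypt: recovers 42
```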