Revision as of 13:00, 25 March 2022 by R.sirgalina

Analytical Geometry & Linear Algebra – II

  • Course name: Analytical Geometry & Linear Algebra – II
  • Course number: XYZ

Course Characteristics

Key concepts of the class

  • fundamental principles of linear algebra,
  • linear algebra objects and their representation in vector-matrix form

What is the purpose of this course?

This course covers matrix theory and linear algebra, emphasizing topics useful in other disciplines. Linear algebra is a branch of mathematics that studies systems of linear equations and the properties of matrices. The concepts of linear algebra are extremely useful in physics, data sciences, and robotics. Due to its broad range of applications, linear algebra is one of the most widely used subjects in mathematics.

Course objectives based on Bloom’s taxonomy

- What should a student remember at the end of the course?

By the end of the course, the students should be able to

  • List the basic notions of linear algebra
  • State the key principles involved in solving linear equation systems and the properties of matrices
  • Recall the steps of linear regression analysis
  • Recall the idea of the Fast Fourier Transform
  • Recall how eigenvalues and eigenvectors are used for matrix diagonalization and singular value decomposition

- What should a student be able to understand at the end of the course?

By the end of the course, the students should be able to

  • Explain the key principles involved in solving linear equation systems and the properties of matrices
  • Become familiar with the four fundamental subspaces
  • Explain linear regression analysis
  • Explain the Fast Fourier Transform
  • Explain how to find eigenvalues and eigenvectors for matrix diagonalization and singular value decomposition

- What should a student be able to apply at the end of the course?

By the end of the course, the students should be able to

  • Solve linear equation systems using the vector-matrix approach
  • Perform linear regression analysis
  • Apply the Fast Fourier Transform
  • Find eigenvalues and eigenvectors for matrix diagonalization and singular value decomposition

Course evaluation

Course grade breakdown
Type Points
Labs/seminar classes 20
Interim performance assessment 30
Exams 50

Grades range

Course grading range
Grade Low High
A 85 100
B 65 84
C 50 64
D 0 49

Resources and reference material

  • Gilbert Strang. Linear Algebra and Its Applications, 4th Edition, Brooks Cole, 2006. ISBN: 9780030105678
  • Gilbert Strang. Introduction to Linear Algebra, 4th Edition, Wellesley, MA: Wellesley-Cambridge Press, 2009. ISBN: 9780980232714

Course Sections

The main sections of the course and the approximate distribution of hours between them are as follows:

Section 1

Section title

Linear equation system solving by using the vector-matrix approach

Topics covered in this section

  • The geometry of linear equations. Elimination with matrices.
  • Matrix operations, including inverses, and LU factorization.
  • Transposes and permutations. Vector spaces and subspaces.
  • The null space: solving Ax = 0. Reduced row echelon form. Matrix rank.
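
The elimination and rank topics above can be sketched in a few lines. This is a minimal pure-Python illustration, not the course's reference code, and the example matrix is made up (its second row is twice the first):

```python
# Sketch: Gaussian elimination to row echelon form and matrix rank.
# Matrices are plain lists of rows; no external libraries are used.

def row_echelon_rank(A, eps=1e-12):
    """Reduce a copy of A to row echelon form and return (echelon_form, rank)."""
    M = [row[:] for row in A]          # work on a copy
    rows, cols = len(M), len(M[0])
    rank = 0
    for col in range(cols):
        # find a pivot row at or below the current rank row
        pivot = next((r for r in range(rank, rows) if abs(M[r][col]) > eps), None)
        if pivot is None:
            continue                   # no pivot in this column
        M[rank], M[pivot] = M[pivot], M[rank]
        # eliminate entries below the pivot
        for r in range(rank + 1, rows):
            factor = M[r][col] / M[rank][col]
            for c in range(col, cols):
                M[r][c] -= factor * M[rank][c]
        rank += 1
    return M, rank

A = [[1.0, 2.0, 3.0],
     [2.0, 4.0, 6.0],
     [1.0, 0.0, 1.0]]
E, r = row_echelon_rank(A)
print(r)  # prints 2: one row is redundant, so only two pivots appear
```

The count of pivots found during elimination is exactly the rank of the matrix.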

What forms of evaluation were used to test students’ performance in this section?

Form Yes/No
Development of individual parts of software product code Yes
Homework and group projects Yes
Midterm evaluation Yes
Testing (written or computer based) Yes
Reports No
Essays No
Oral polls No
Discussions Yes

Typical questions for ongoing performance evaluation within this section

  1. How do you perform Gaussian elimination?
  2. How do you perform matrix multiplication?
  3. How do you perform LU factorization?
  4. How do you find the complete solution of an arbitrary linear equation system Ax = b?

Typical questions for seminar classes (labs) within this section

  1. Find the solution of the given linear equation system using Gaussian elimination.
  2. Perform LU factorization of the given matrix.
  3. Factor the given symmetric matrix into LDL^T, where D is the diagonal pivot matrix.
  4. Find the inverse of the given matrix.

Tasks for midterm assessment within this section

Test questions for final assessment in this section

  1. Determine which of the given vectors are linearly independent (exclude the dependent ones). Check whether the given vector is a linear combination of these vectors and, if so, find the coefficients.
  2. Find the inverse of the given upper-triangular matrix. Then use it to solve Ax = b for the given right-hand side b.
  3. Find the complete solution of the system Ax = b for the given matrix A and vector b. Provide an example of a vector b that makes this system unsolvable.

Section 2

Section title

Linear regression analysis and A = QR decomposition.

Topics covered in this section

  • Independence, basis and dimension. The four fundamental subspaces.
  • Orthogonal vectors and subspaces. Projections onto subspaces.
  • Projection matrices. Least squares approximations. Gram-Schmidt and A = QR.

What forms of evaluation were used to test students’ performance in this section?

Form Yes/No
Development of individual parts of software product code Yes
Homework and group projects Yes
Midterm evaluation Yes
Testing (written or computer based) Yes
Reports No
Essays No
Oral polls No
Discussions Yes

Typical questions for ongoing performance evaluation within this section

  1. What is linear independence of vectors?
  2. What are the four fundamental subspaces of a matrix?
  3. How are orthogonal vectors and subspaces defined?
  4. How is the orthogonal complement of a subspace defined?
  5. How do you find the projection of a vector onto a subspace?
  6. How do you perform linear regression for the given measurements?
  7. How do you find an orthonormal basis for the subspace spanned by the given vectors?

Typical questions for seminar classes (labs) within this section

  1. Check the linear independence of the given vectors.
  2. Find the four fundamental subspaces of the given matrix.
  3. Check the orthogonality of the given subspaces.
  4. Find the orthogonal complement of the given subspace.
  5. Find the projection of the given vector onto the given subspace.
  6. Perform linear regression for the given measurements.
  7. Find an orthonormal basis for the subspace spanned by the given vectors.

Tasks for midterm assessment within this section

Test questions for final assessment in this section

  1. Find the dimensions of the four fundamental subspaces associated with the given matrix, depending on its parameters.
  2. Find a vector orthogonal to the row space of the given matrix, a vector orthogonal to its column space, and a vector orthogonal to its null space.
  3. Find the best straight-line fit to the given measurements.
  4. Find the projection of the given vector onto the given subspace.
  5. Find an orthonormal basis for the subspace spanned by the given vectors. Then express the matrix of these vectors in the form A = QR.
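
The straight-line fit in these questions comes down to the normal equations AᵀA x = Aᵀb. A minimal pure-Python sketch for y ≈ c + d·t, with made-up data points chosen to lie exactly on a line:

```python
# Sketch: least-squares straight-line fit by solving the 2x2 normal
# equations directly, for A with columns [1, t_i].

def fit_line(ts, ys):
    """Return (c, d) minimizing the sum of (c + d*t - y)^2."""
    n = len(ts)
    s_t = sum(ts); s_tt = sum(t * t for t in ts)          # entries of A^T A
    s_y = sum(ys); s_ty = sum(t * y for t, y in zip(ts, ys))  # entries of A^T b
    det = n * s_tt - s_t * s_t
    c = (s_tt * s_y - s_t * s_ty) / det
    d = (n * s_ty - s_t * s_y) / det
    return c, d

# Points lying exactly on y = 1 + 2t recover c = 1, d = 2.
c, d = fit_line([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
```

When the measurements do not lie on a single line, the same formulas return the line whose squared vertical errors are smallest, which is the projection of b onto the column space of A.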

Section 3

Section title

Fast Fourier Transform. Matrix Diagonalization.

Topics covered in this section

  • Complex Numbers. Hermitian and Unitary Matrices.
  • Fourier Series. The Fast Fourier Transform.
  • Eigenvalues and eigenvectors. Matrix diagonalization.
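
The Fast Fourier Transform topic can be illustrated by the recursive radix-2 Cooley–Tukey scheme. This is a teaching sketch, not an optimized implementation, and it assumes the input length is a power of two:

```python
# Sketch: recursive radix-2 FFT. The length-n DFT splits into the DFTs
# of the even- and odd-indexed halves, combined with twiddle factors.
import cmath

def fft(x):
    """Return the DFT of x (len(x) must be a power of two)."""
    n = len(x)
    if n == 1:
        return x[:]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor w^k
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# DFT of a constant signal: all energy lands in the zero-frequency bin.
X = fft([1, 1, 1, 1])
# X = [4, 0, 0, 0] (up to rounding)
```

The split halves the problem size at every level, which is what brings the cost down from n² to n·log n operations.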

What forms of evaluation were used to test students’ performance in this section?

Form Yes/No
Development of individual parts of software product code Yes
Homework and group projects Yes
Midterm evaluation Yes
Testing (written or computer based) Yes
Reports No
Essays No
Oral polls No
Discussions Yes

Typical questions for ongoing performance evaluation within this section

  1. Give the definition of a Hermitian matrix.
  2. Give the definition of a unitary matrix.
  3. How do you find the matrix of the Fourier transform?
  4. When can the fast Fourier transform be applied?
  5. How do you find the eigenvalues and eigenvectors of a matrix?
  6. How do you diagonalize a square matrix?

Typical questions for seminar classes (labs) within this section

  1. Check whether the given matrix is Hermitian.
  2. Check whether the given matrix is unitary.
  3. Find the matrix of the given Fourier transform.
  4. Find the eigenvalues and eigenvectors of the given matrix.
  5. Find the diagonalized form of the given matrix.
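
For 2×2 matrices, the eigenvalue exercises reduce to the characteristic polynomial λ² − (trace)λ + det = 0. A minimal sketch assuming real, distinct eigenvalues; the symmetric example matrix is made up:

```python
# Sketch: eigenvalues of a 2x2 matrix via the quadratic formula applied
# to the characteristic polynomial.
import math

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]], assuming they are real."""
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)   # real for symmetric matrices
    return (tr + disc) / 2, (tr - disc) / 2

# The symmetric matrix [[2, 1], [1, 2]] has eigenvalues 3 and 1.
lam1, lam2 = eig2(2.0, 1.0, 1.0, 2.0)
```

For larger matrices this closed form no longer exists, which is why the course moves to diagonalization and, later, iterative numerical methods.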

Tasks for midterm assessment within this section

Test questions for final assessment in this section

  1. Find the eigenvector of the given circulant matrix corresponding to the eigenvalue equal to the sum of the entries of its first row.
  2. Diagonalize the given matrix.
  3. Suppose the given matrix has a full set of orthonormal eigenvectors; prove the stated identity.
  4. Find all eigenvalues and eigenvectors of the cyclic permutation matrix.

Section 4

Section title

Symmetric, positive definite and similar matrices. Singular value decomposition.

Topics covered in this section

  • Linear differential equations.
  • Symmetric matrices. Positive definite matrices.
  • Similar matrices. Left and right inverses, pseudoinverse. Singular value decomposition (SVD).
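
One standard test for the positive definite matrices named above is Sylvester's criterion: a symmetric matrix is positive definite iff all its leading principal minors are positive. A pure-Python sketch, with determinants by cofactor expansion (fine for the small matrices used in class):

```python
# Sketch: positive definiteness via Sylvester's criterion.

def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def is_positive_definite(S):
    """Check that every leading principal minor of the symmetric S is positive."""
    return all(det([row[:k] for row in S[:k]]) > 0 for k in range(1, len(S) + 1))

print(is_positive_definite([[2.0, -1.0], [-1.0, 2.0]]))  # True: minors are 2 and 3
```

Cofactor expansion costs n! operations, so this is strictly a blackboard-scale check; elimination-based pivots give the same answer efficiently.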

What forms of evaluation were used to test students’ performance in this section?

Form Yes/No
Development of individual parts of software product code Yes
Homework and group projects Yes
Midterm evaluation Yes
Testing (written or computer based) Yes
Reports No
Essays No
Oral polls No
Discussions Yes

Typical questions for ongoing performance evaluation within this section

  1. How do you solve linear differential equations?
  2. Give the definition of a symmetric matrix.
  3. Give the definition of a positive definite matrix.
  4. Give the definition of similar matrices.
  5. How do you find the left and right inverse matrices and the pseudoinverse matrix?
  6. How do you compute the singular value decomposition of a matrix?

Typical questions for seminar classes (labs) within this section

  1. Find the solution of the given linear differential equation.
  2. Give the definition of a symmetric matrix.
  3. Check the given matrix for positive definiteness.
  4. Check the given matrices for similarity.
  5. For the given matrix, find the left and right inverse matrices and the pseudoinverse matrix.
  6. Compute the singular value decomposition of the given matrix.
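
When a matrix has independent columns, the pseudoinverse in these exercises reduces to A⁺ = (AᵀA)⁻¹Aᵀ. A pure-Python sketch for the two-column case, so that AᵀA is just a 2×2 Gram matrix; the diagonal example matrix is made up:

```python
# Sketch: pseudoinverse A+ = (A^T A)^-1 A^T for a matrix with two
# independent columns. Matrices are lists of rows.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

def pinv_two_cols(A):
    """Pseudoinverse of an m-by-2 matrix with independent columns."""
    At = [list(r) for r in zip(*A)]          # A^T, shape 2 x m
    G = matmul(At, A)                        # Gram matrix A^T A, 2 x 2
    d = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    Ginv = [[ G[1][1] / d, -G[0][1] / d],
            [-G[1][0] / d,  G[0][0] / d]]
    return matmul(Ginv, At)                  # (A^T A)^-1 A^T

# For an invertible square matrix the pseudoinverse is just the inverse.
P = pinv_two_cols([[1.0, 0.0], [0.0, 2.0]])
# P = [[1, 0], [0, 0.5]]
```

For rank-deficient matrices this formula breaks down (AᵀA is singular), and the SVD-based construction covered in this section is the general definition.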

Tasks for midterm assessment within this section

Test questions for final assessment in this section

  1. Find e^{At} for the given matrix A.
  2. Write down the first-order equation system for the given differential equation and solve it with the given initial conditions (including y''(0) = 6). Will the solution of this system be stable?
  3. For which values of the parameters is the given quadratic form positive definite?
  4. Find the SVD and the pseudoinverse of the given matrix.