Optimization

  • Course name: Optimization
  • Course number: R-01

Course characteristics

Key concepts of the class

  • Optimization of a cost function
  • Algorithms to find solution of linear and nonlinear optimization problems

What is the purpose of this course?

The main purpose of this course is to make the student aware of the basic notions of mathematical programming and of its importance in the area of engineering.

Course Objectives Based on Bloom’s Taxonomy

What should a student remember at the end of the course?

By the end of the course, the students should be able to

  • explain the goal of an optimization problem
  • recall the importance of convergence analysis for optimization algorithms
  • draft solution code in Python/MATLAB.

What should a student be able to understand at the end of the course?

By the end of the course, the students should be able to

  • formulate a simple optimization problem
  • select the appropriate solution algorithm
  • find the solution.

What should a student be able to apply at the end of the course?

By the end of the course, the students should be able to apply:

  • the simplex method
  • algorithms to solve nonlinear optimization problems.

Course evaluation

The course has two major forms of evaluation:

Course grade breakdown
Component Points
Labs/seminar classes (weekly evaluations) 0
Interim performance assessment (class participation) 100/0
Exams 0/100

A student can pass the course either by properly solving three assignments or by taking the final exam, which consists of three questions on the same topics as the assignments.

Grades range

Course grading range
A. Excellent 86-100
B. Good 71-85
C. Satisfactory 56-70
D. Poor 0-55

Resources and reference material

  • Textbook: C.H. Papadimitriou, K. Steiglitz, Combinatorial Optimization, Dover, New York, 1982.
  • Textbook: D. Bertsekas, Nonlinear Programming, Athena Scientific, 1999.

Course Sections

The main sections of the course and the approximate distribution of hours between them are as follows:

Course Sections
Section Section Title Teaching Hours
1 Linear programming 15
2 Nonlinear programming 15


Section 1

Section title: Linear programming

Topics covered in this section

  • simplex method to solve real linear programs
  • cutting-plane and branch-and-bound methods to solve integer linear programs (see the illustrative sketch after this list).
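
The topics above lend themselves to small numerical experiments. The following is a minimal, illustrative sketch, not part of the official course material; it assumes SciPy (version 1.9 or later for milp) is available. It formulates a tiny linear program, solves its continuous relaxation with scipy.optimize.linprog, and then solves the integer version with scipy.optimize.milp, whose HiGHS backend relies on branch-and-bound.

  # Illustrative sketch only (assumes SciPy >= 1.9): a toy LP and its integer variant.
  import numpy as np
  from scipy.optimize import linprog, milp, LinearConstraint, Bounds

  # maximize x + y  subject to  2x + 3y <= 12,  3x + 2y <= 12,  x, y >= 0
  # (linprog and milp minimize, so the objective is negated)
  c = np.array([-1.0, -1.0])
  A_ub = np.array([[2.0, 3.0], [3.0, 2.0]])
  b_ub = np.array([12.0, 12.0])

  # Continuous relaxation: the optimum is the vertex (2.4, 2.4) with value 4.8.
  lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
  print("LP  optimum:", lp.x, "objective:", -lp.fun)

  # Integer program: branch-and-bound (via HiGHS) finds an integer optimum with value 4.
  ilp = milp(c=c,
             constraints=LinearConstraint(A_ub, -np.inf, b_ub),
             integrality=[1, 1],
             bounds=Bounds(0, np.inf))
  print("ILP optimum:", ilp.x, "objective:", -ilp.fun)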


What forms of evaluation were used to test students’ performance in this section?

Form of evaluation Yes/No
Development of individual parts of software product code No
Homework and group projects No
Midterm evaluation Yes
Testing (written or computer based) Yes
Reports No
Essays No
Oral polls No
Discussions No

Typical questions for ongoing performance evaluation within this section

  1. How are a convex set and a convex function defined?
  2. What is the difference between a polyhedron and a polytope?
  3. Why does a linear program always include constraints?

Test questions for final assessment in this section

  1. Why does the simplex method have to be initialized with a basic feasible solution?
  2. How can one test for the absence of solutions to a linear program?
  3. How can one detect that a linear program is unbounded? (questions 2 and 3 are illustrated in the sketch after this list)
  4. How can the computational complexity of an optimization algorithm be defined?
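
As an illustration of questions 2 and 3, the short sketch below (an assumption about tooling, not course-issued code) checks the status code returned by scipy.optimize.linprog: the solver reports an infeasible problem and an unbounded one through result.status and result.message.

  # Illustrative sketch only: detecting infeasibility and unboundedness with SciPy.
  from scipy.optimize import linprog

  # Infeasible: minimize x subject to x <= -1 together with the default bound x >= 0.
  infeasible = linprog(c=[1.0], A_ub=[[1.0]], b_ub=[-1.0])
  print(infeasible.status, infeasible.message)   # status 2: problem appears infeasible

  # Unbounded: minimize -x with x >= 0 and no other constraint, so the objective -> -inf.
  unbounded = linprog(c=[-1.0], bounds=[(0, None)])
  print(unbounded.status, unbounded.message)     # status 3: problem appears unbounded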


Section 2

Section title: Nonlinear programming

Topics covered in this section

  • methods for unconstrained optimization
  • linear and nonlinear least-squares problems
  • methods for constrained optimization (see the illustrative sketch after this list)
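
The sketch below (illustrative only, with made-up data and a hypothetical exponential model) touches each of the three topics with SciPy: an unconstrained minimization, a nonlinear least-squares fit, and a constrained problem solved with SLSQP.

  # Illustrative sketch only: unconstrained, least-squares and constrained examples.
  import numpy as np
  from scipy.optimize import minimize, least_squares

  # 1) Unconstrained: minimize the Rosenbrock function with BFGS.
  rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
  unconstrained = minimize(rosen, x0=[-1.0, 2.0], method="BFGS")

  # 2) Nonlinear least squares: fit y ~ a*exp(b*t) to synthetic noisy data points.
  t = np.linspace(0, 1, 20)
  y = 2.0 * np.exp(1.5 * t) + 0.05 * np.random.default_rng(0).standard_normal(20)
  residuals = lambda p: p[0] * np.exp(p[1] * t) - y
  fit = least_squares(residuals, x0=[1.0, 1.0])

  # 3) Constrained: minimize (x-1)^2 + (y-2)^2 subject to x + y <= 2
  #    (SLSQP expects inequality constraints in the form g(x) >= 0).
  constrained = minimize(lambda v: (v[0] - 1)**2 + (v[1] - 2)**2,
                         x0=[0.0, 0.0],
                         method="SLSQP",
                         constraints=[{"type": "ineq", "fun": lambda v: 2 - v[0] - v[1]}])

  print(unconstrained.x, fit.x, constrained.x)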


What forms of evaluation were used to test students’ performance in this section?

Form of evaluation Yes/No
Development of individual parts of software product code No
Homework and group projects No
Midterm evaluation Yes
Testing (written or computer based) Yes
Reports No
Essays No
Oral polls No
Discussions No


Typical questions for ongoing performance evaluation within this section

  1. What are the necessary and sufficient optimality conditions for a generic minimization/maximization problem?
  2. What is the goal of a descent algorithm?
  3. What does it mean to fit a set of experimental data points? (questions 2 and 3 are illustrated in the sketch after this list)
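
Questions 2 and 3 can be connected in a few lines of code. The sketch below (illustrative only, with made-up measurements) runs a steepest-descent loop that fits a line y = a*t + b to experimental points by repeatedly stepping against the gradient of the least-squares cost.

  # Illustrative sketch only: steepest descent applied to a least-squares fit.
  import numpy as np

  t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
  y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])      # noisy measurements of y ~ 2t + 1

  def cost(p):                                  # least-squares cost J(a, b)
      a, b = p
      return 0.5 * np.sum((a * t + b - y) ** 2)

  def grad(p):                                  # gradient of J with respect to (a, b)
      a, b = p
      r = a * t + b - y
      return np.array([np.sum(r * t), np.sum(r)])

  p = np.zeros(2)
  step = 0.01                                   # fixed step size, small enough to converge here
  for _ in range(5000):                         # descend until the gradient is (nearly) zero
      p = p - step * grad(p)

  print("fitted a, b:", p, "final cost:", cost(p))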

Test questions for final assessment in this section

  1. How can one compute the Lagrange multipliers of a constrained optimization problem?
  2. What are the convergence conditions of the steepest descent method?
  3. What are the convergence conditions of Newton’s method?
  4. How can one “penalize” a constraint? (see the sketch after this list)
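
Question 4 can be illustrated with the quadratic penalty idea. The sketch below is an assumption-laden illustration, not a prescribed course solution: the equality constraint x + y = 1 is replaced by a penalty term whose weight mu grows, and each penalized subproblem is solved without constraints, warm-starting from the previous solution.

  # Illustrative sketch only: a quadratic penalty method for an equality constraint.
  import numpy as np
  from scipy.optimize import minimize

  f = lambda v: v[0]**2 + 2 * v[1]**2           # objective
  h = lambda v: v[0] + v[1] - 1                 # equality constraint h(v) = 0

  x = np.array([0.0, 0.0])
  for mu in [1.0, 10.0, 100.0, 1000.0]:         # increasing penalty weight
      penalized = lambda v, mu=mu: f(v) + 0.5 * mu * h(v) ** 2
      x = minimize(penalized, x, method="BFGS").x

  print("approximate constrained minimizer:", x)   # approaches the true solution (2/3, 1/3)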