BSc:NonlinearOptimization

From IU
Revision as of 13:48, 30 July 2021

Nonlinear Optimization

  • Course name: Nonlinear Optimization
  • Course number:

Course Characteristics

Key concepts of the class

  • Methods of nonlinear optimization
  • Solution of optimization problems

What is the purpose of this course?

The main purpose of this course is:

  • The formation of professional competencies in accordance with the Federal State Educational Standard
  • Fundamentalization of education
  • Training of specialists with the knowledge and skills to formulate optimization problems in mechatronics and robotics meaningfully, formalize them, and solve them.

Course objectives based on Bloom’s taxonomy

- What should a student remember at the end of the course?

By the end of the course, the students should be able to remember and recognize:

  • The role and place of optimization methods in the development of modern mechatronics and robotics,
  • Terminology of optimization problems,
  • Classification of optimization tasks,
  • Models and methods of optimization theory, and the tasks they solve effectively,
  • Concepts and principles of theories related to solving mathematical programming problems,
  • Optimization software

- What should a student be able to understand at the end of the course?

By the end of the course, the students should be able to describe and explain:

  • Formalized description of optimization problems for constructing mathematical models,
  • Optimization methods and theory for solving problems of mechatronic and robotic systems (meaningful statement, choice of solution method, implementation),
  • Results of solving problems of mathematical optimization,
  • Instrumental (software) tools for analytical and numerical solution of optimization problems

- What should a student be able to apply at the end of the course?

By the end of the course, the students should be able to apply:

  • Skills to formalize optimization tasks,
  • Information handling technologies for solving finite-dimensional optimization problems,
  • Skills in using numerical methods to find the optimal solution for successful problem solving,
  • The main types of information systems and general-purpose applications for solving practical optimization problems with their help,
  • Algorithms for solving problems of mathematical optimization

Course evaluation

Course grade breakdown
Proposed points
Labs/seminar classes 20
Interim performance assessment 30
Exams 50

The course grades are given according to the following rules: Homework assignments (4) = 20 pts, Quizzes (4) = 40 pts, Term project = 40 pts

Grades range

Course grading range
Proposed range
A. Excellent 90-100
B. Good 75-89
C. Satisfactory 60-74
D. Poor 0-59


Resources and reference material

  • Bertsekas, D. Nonlinear Programming, 3rd edition. Athena Scientific, 2016. ISBN 1886529051.
  • Nocedal, J. and Wright, S. Numerical Optimization. Springer Science and Business Media, 2006.
  • Bertsekas, D. Nonlinear Programming, 2nd edition. Belmont, MA: Athena Scientific, 1999. ISBN 1886529000.

Course Sections

The main sections of the course and the approximate hour distribution among them are as follows:

Course Sections
Section Section Title Teaching Hours
1 Unconstrained Optimization 6
2 Optimization Over a Convex Set 6
3 Constrained Optimization and Lagrange Multipliers 4
4 Duality, Convex Programming and Dual Methods 6

Section 1

Section title:

Unconstrained Optimization

Topics covered in this section:

  • Optimality Conditions
  • Gradient Methods
  • Convergence Analysis of Gradient Methods
  • Rate of Convergence
  • Newton and Gauss-Newton Methods
  • Additional Methods

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: Yes
Midterm evaluation: No
Testing (written or computer based): Yes
Reports: No
Essays: No
Oral polls: No
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. What are the optimality conditions?
  2. Describe the idea of gradient methods.
  3. What is the rate of convergence?
  4. Explain Newton’s optimization method.
  5. What are least-squares problems?

Typical questions for seminar classes (labs) within this section

  1. Implement Newton’s method (e.g., using Matlab).
  2. Implement the gradient method.
  3. Implement the Gauss-Newton method.
  4. Implement a derivative-free method.
  5. Implement the conjugate direction method.
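As a starting point for exercises 1 and 2, the gradient method and Newton's method can be sketched in Python as follows. This is a minimal sketch on a hypothetical quadratic test function; the function, fixed stepsize, and tolerances are illustrative choices, not part of the course materials:

```python
import numpy as np

# Hypothetical quadratic test function f(x) = (x1 - 1)^2 + 10 (x2 + 2)^2,
# whose unique minimizer is x* = (1, -2).
def f(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

def grad(x):
    return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])

def hess(x):
    return np.array([[2.0, 0.0], [0.0, 20.0]])

def gradient_descent(x0, step=0.05, tol=1e-8, max_iter=10_000):
    # Gradient method with a fixed stepsize: x_{k+1} = x_k - step * grad f(x_k),
    # stopped when the gradient norm falls below tol.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

def newton(x0, tol=1e-10, max_iter=100):
    # Newton's method: x_{k+1} = x_k - H(x_k)^{-1} grad f(x_k),
    # computed via a linear solve rather than an explicit inverse.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)
    return x
```

On a quadratic objective, Newton's method converges in a single step, while the fixed-step gradient method converges linearly; comparing the two on the same function is a useful lab exercise.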

Test questions for final assessment in this section

  1. What is Newton’s optimization method?
  2. What are the gradient methods?
  3. Explain the quasi-Newton optimization method.

Section 2

Section title:

Optimization Over a Convex Set

Topics covered in this section:

  • Optimality Conditions
  • Feasible Direction Methods
  • Alternatives to Gradient Projection
  • Two-Metric Projection Methods
  • Manifold Suboptimization Methods

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: Yes
Midterm evaluation: No
Testing (written or computer based): Yes
Reports: No
Essays: No
Oral polls: No
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. What are the optimality conditions?
  2. Describe descent directions and stepsize rules.
  3. Describe feasible directions and stepsize rules based on projection.
  4. Explain the convergence analysis of feasible direction methods.
  5. What is a convex set?

Typical questions for seminar classes (labs) within this section

  1. Implement the conditional gradient method.
  2. Implement the gradient projection method.
  3. Implement the two-metric projection method.
  4. Implement the manifold suboptimization method.
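A minimal sketch of the gradient projection method from exercise 2, assuming a box constraint set (for which the Euclidean projection reduces to componentwise clipping) and an illustrative quadratic objective; the constraint set, stepsize, and target point are hypothetical choices:

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box {x : lo <= x_i <= hi} is clipping.
    return np.clip(x, lo, hi)

def projected_gradient(grad, x0, lo, hi, step=0.1, tol=1e-8, max_iter=10_000):
    # Gradient projection: x_{k+1} = P_C(x_k - step * grad f(x_k)),
    # stopped when successive iterates stop moving.
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        x_new = project_box(x - step * grad(x), lo, hi)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative objective f(x) = ||x - c||^2 with c = (-3, 4); over the box
# [0, 1]^2 the constrained minimizer is the projection of c, i.e. (0, 1).
c = np.array([-3.0, 4.0])
grad_f = lambda x: 2.0 * (x - c)
```

Because the box projection is cheap, this method keeps every iterate feasible at essentially the cost of an unconstrained gradient step.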

Test questions for final assessment in this section

  1. Explain optimality conditions.
  2. Explain feasible direction methods.
  3. What are the alternatives to gradient projection?
  4. Explain two-metric projection methods.
  5. Describe manifold suboptimization methods.

Section 3

Section title:

Constrained Optimization and Lagrange Multipliers

Topics covered in this section:

  • Necessary conditions for equality constraints
  • Sufficient conditions and sensitivity analysis
  • Inequality constraints
  • Linear constraints and duality
  • Barrier and interior point methods
  • Penalty and augmented Lagrangian methods
  • Lagrangian and Primal-Dual Interior Point Methods

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: No
Midterm evaluation: No
Testing (written or computer based): Yes
Reports: No
Essays: No
Oral polls: Yes
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. The penalty approach
  2. The elimination approach
  3. The Lagrangian function
  4. The augmented Lagrangian approach
  5. Karush-Kuhn-Tucker optimality conditions
  6. Sufficiency conditions and Lagrangian minimization

Typical questions for seminar classes (labs) within this section

  1. Implement barrier and interior point methods.
  2. Implement the quadratic penalty function method.
  3. Implement the method of multipliers.
  4. Implement a Newton-like method for equality constraints.
  5. Implement a primal-dual interior point method.
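The quadratic penalty function method from exercise 2 can be sketched as follows. This is a hypothetical equality-constrained example (minimize x1² + x2² subject to x1 + x2 = 1, whose solution is (0.5, 0.5)); the penalty growth factor, stepsize rule, and iteration counts are illustrative choices:

```python
import numpy as np

# Hypothetical problem: min f(x) = x1^2 + x2^2  s.t.  h(x) = x1 + x2 - 1 = 0.
def f_grad(x):
    return 2.0 * x

def h(x):
    return x[0] + x[1] - 1.0

def h_grad(x):
    return np.array([1.0, 1.0])

def penalty_method(x0, mu=1.0, growth=10.0, outer=8, inner=2000, tol=1e-10):
    # Quadratic penalty: minimize P(x) = f(x) + (mu/2) h(x)^2 for a fixed mu,
    # then increase mu and re-solve, warm-starting from the previous iterate.
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        lr = 1.0 / (2.0 + 2.0 * mu)  # safe stepsize for this quadratic penalty
        for _ in range(inner):
            g = f_grad(x) + mu * h(x) * h_grad(x)
            if np.linalg.norm(g) < tol:
                break
            x = x - lr * g
        mu *= growth  # tighten the constraint for the next outer iteration
    return x
```

Note how the constraint is only satisfied in the limit: for each finite mu the penalty minimizer violates h(x) = 0 slightly, which is the motivation for the augmented Lagrangian (method of multipliers) also listed above.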

Test questions for final assessment in this section

  1. Describe sufficiency conditions and Lagrangian minimization.
  2. What are the Karush-Kuhn-Tucker optimality conditions?
  3. Explain penalty and augmented Lagrangian methods.
  4. Describe barrier and interior point methods.

Section 4

Section title:

Duality, convex programming and dual methods

Topics covered in this section:

  • The dual problem
  • Convex cost - linear constraints
  • Convex cost - convex constraints
  • Conjugate functions and Fenchel duality
  • Dual ascent methods for differentiable dual problems
  • Nondifferentiable optimization methods

What forms of evaluation were used to test students’ performance in this section?

Development of individual parts of software product code: No
Homework and group projects: Yes
Midterm evaluation: No
Testing (written or computer based): Yes
Reports: No
Essays: No
Oral polls: Yes
Discussions: Yes


Typical questions for ongoing performance evaluation within this section

  1. Lagrange multipliers
  2. Characterization of primal and dual optimal solutions
  3. Network optimization
  4. Coordinate ascent for quadratic programming.
  5. Subgradient methods

Typical questions for seminar classes (labs) within this section

  1. Implement the subgradient method.
  2. Implement approximate and incremental subgradient methods.
  3. Implement the cutting plane method.
  4. Implement ascent and approximate ascent methods.
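The subgradient method from exercise 1 can be sketched as follows, assuming a hypothetical l1-norm objective and a diminishing stepsize rule (both illustrative choices, not prescribed by the course):

```python
import numpy as np

# Hypothetical nondifferentiable objective f(x) = ||x - c||_1,
# whose minimizer is x* = c.
c = np.array([2.0, -1.0])

def f(x):
    return float(np.sum(np.abs(x - c)))

def subgrad(x):
    # One element of the subdifferential of the l1 distance to c
    # (np.sign returns 0 where a coordinate is exactly at its target).
    return np.sign(x - c)

def subgradient_method(x0, n_iter=5000):
    # x_{k+1} = x_k - alpha_k * g_k with diminishing stepsize alpha_k = 1/k.
    # f(x_k) is not monotone along subgradient iterates, so the best point
    # found so far is tracked and returned.
    x = np.asarray(x0, dtype=float)
    best, best_val = x.copy(), f(x)
    for k in range(1, n_iter + 1):
        x = x - (1.0 / k) * subgrad(x)
        if f(x) < best_val:
            best, best_val = x.copy(), f(x)
    return best
```

Tracking the best iterate is the standard remedy for the non-monotone behavior of subgradient steps; with a diminishing, non-summable stepsize the best value converges to the optimum.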

Test questions for final assessment in this section

  1. State the weak duality theorem.
  2. Explain separable problems and their geometry.
  3. Explain Lagrangian relaxation.
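For reference, the weak duality theorem asked about above can be stated as follows (notation as in Bertsekas, Nonlinear Programming). For the primal problem of minimizing f(x) subject to h(x) = 0, g(x) ≤ 0, x ∈ X, with dual function

```latex
q(\lambda, \mu) = \inf_{x \in X} \left( f(x) + \lambda^{\top} h(x) + \mu^{\top} g(x) \right),
```

weak duality asserts that

```latex
q(\lambda, \mu) \le f^{*} \quad \text{for all } \lambda \text{ and all } \mu \ge 0,
```

where f* is the optimal primal value; that is, every dual value is a lower bound on the primal optimum, without any convexity assumptions.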