BSc:NonlinearOptimization
Nonlinear Optimization
- Course name: Nonlinear Optimization
- Course number:
Course Characteristics
Key concepts of the class
- Methods of nonlinear optimization
- Solution of optimization problems
What is the purpose of this course?
The main purposes of this course are:
- The formation of professional competencies in accordance with the Federal State Educational Standard
- Fundamentalization of education
- Training of a specialist with the knowledge and skills to handle optimization problems in mechatronics and robotics: stating the problem meaningfully, formalizing it, and solving it.
Course objectives based on Bloom’s taxonomy
- What should a student remember at the end of the course?
By the end of the course, the students should be able to remember and recognize:
- The role and place of optimization methods in the development of modern mechatronics and robotics,
- Terminology of optimization problems,
- Classification of optimization tasks,
- Models and methods of optimization theory and the classes of tasks they solve effectively,
- Concepts and principles of theories related to solving mathematical programming problems,
- Optimization software
- What should a student be able to understand at the end of the course?
By the end of the course, the students should be able to describe and explain:
- Formalized description of optimization problems for constructing mathematical models,
- Optimization methods and theory for solving problems of mechatronic and robotic systems (meaningful statement, choice of solution method, implementation),
- Results of solving problems of mathematical optimization,
- Instrumental (software) tools for analytical and numerical solution of optimization problems
- What should a student be able to apply at the end of the course?
By the end of the course, the students should be able to apply:
- Skills to formalize optimization tasks,
- Information handling technologies for solving finite-dimensional optimization problems,
- Skills in using numerical methods to find optimal solutions,
- The main types of information systems and general-purpose applications for solving practical optimization problems,
- Algorithms for solving problems of mathematical optimization
Course evaluation
Type of assessment | Proposed points
---|---
Labs/seminar classes | 20
Interim performance assessment | 30
Exams | 50
The course grades are given according to the following rules: Homework assignments (4) = 20 pts, Quizzes (4) = 40 pts, Term project = 40 pts
Grades range
Grade | Proposed range
---|---
A. Excellent | 90-100
B. Good | 75-89
C. Satisfactory | 60-74
D. Poor | 0-59
Resources and reference material
- Dimitri Bertsekas. Nonlinear Programming: 3rd Edition. Athena Scientific, 2016. ISBN: 1886529051
- Jorge Nocedal and Stephen J. Wright. Numerical Optimization. Springer, 2006.
- Dimitri Bertsekas. Nonlinear Programming: 2nd Edition. Belmont, MA: Athena Scientific, 1999. ISBN: 1886529000.
Course Sections
The main sections of the course and the approximate distribution of hours between them are as follows:
Section | Section Title | Teaching Hours |
---|---|---|
1 | Unconstrained Optimization | 6 |
2 | Optimization Over a Convex Set | 6 |
3 | Constrained Optimization and Lagrange Multipliers | 4 |
4 | Duality, Convex Programming and Dual Methods | 6 |
Section 1
Section title:
Unconstrained Optimization
Topics covered in this section:
- Optimality Conditions
- Gradient Methods
- Convergence Analysis of Gradient Methods
- Rate of Convergence
- Newton and Gauss-Newton Methods
- Additional Methods
What forms of evaluation were used to test students’ performance in this section?
Form of evaluation | Yes/No
---|---
Development of individual parts of software product code | No
Homework and group projects | Yes
Midterm evaluation | No
Testing (written or computer based) | Yes
Reports | No
Essays | No
Oral polls | No
Discussions | Yes
Typical questions for ongoing performance evaluation within this section
- What are the optimality conditions? (A compact statement follows this list.)
- Describe the idea of gradient methods.
- What is the rate of convergence?
- Explain Newton’s optimization method.
- What are least squares problems?
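
As a reference for the optimality-condition questions above, the standard first- and second-order conditions for unconstrained local minima can be summarized as follows (standard textbook notation; f is assumed twice continuously differentiable):

```latex
% Unconstrained problem: minimize f(x) over x in R^n.
% First-order necessary condition at a local minimum x*:
\nabla f(x^*) = 0
% Second-order necessary condition:
\nabla^2 f(x^*) \succeq 0 \quad (\text{positive semidefinite})
% Second-order sufficient condition for a strict local minimum:
\nabla f(x^*) = 0, \qquad \nabla^2 f(x^*) \succ 0 \quad (\text{positive definite})
```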
Typical questions for seminar classes (labs) within this section
- Implement Newton’s method (e.g., using Matlab).
- Implement the gradient method (a starting-point sketch in Python follows this list).
- Implement the Gauss-Newton method.
- Implement a derivative-free method.
- Implement the conjugate direction method.
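
For the gradient and Newton tasks, a minimal starting-point sketch is given below. It is not a reference solution: the labs suggest Matlab, while Python is used here only for illustration, and the function names (grad_descent, newton) and the quadratic test problem are assumptions made for this example.

```python
import numpy as np

def grad_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    """Fixed-stepsize gradient method: x_{k+1} = x_k - alpha * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * g
    return x

def newton(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton's method: x_{k+1} = x_k - hess(x_k)^{-1} grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)
    return x

# Illustrative test problem: f(x) = 0.5 * x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
hess = lambda x: A
print(grad_descent(grad, [0.0, 0.0]))  # both should approach [0.2, 0.4]
print(newton(grad, hess, [0.0, 0.0]))
```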
Test questions for final assessment in this section
- What is Newton’s optimization method?
- What are the gradient methods?
- Explain the quasi-Newton optimization method.
Section 2
Section title:
Optimization Over a Convex Set
Topics covered in this section:
- Optimality Conditions
- Feasible Direction Methods
- Alternatives to Gradient Projection
- Two-Metric Projection Methods
- Manifold Suboptimization Methods
What forms of evaluation were used to test students’ performance in this section?
Form of evaluation | Yes/No
---|---
Development of individual parts of software product code | No
Homework and group projects | Yes
Midterm evaluation | No
Testing (written or computer based) | Yes
Reports | No
Essays | No
Oral polls | No
Discussions | Yes
Typical questions for ongoing performance evaluation within this section
- What are the optimality conditions for minimization over a convex set? (A compact statement follows this list.)
- Descent directions and stepsize rules.
- Feasible directions and stepsize rules based on projection.
- Convergence analysis.
- What is a convex set?
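
As a reference for the questions above, the basic optimality condition over a convex set, and the gradient projection iteration built on it, can be stated as follows (standard notation; P_X denotes the Euclidean projection onto X):

```latex
% Problem: minimize f(x) over a closed convex set X, with f continuously differentiable.
% First-order necessary condition at a local minimum x* (also sufficient if f is convex):
\nabla f(x^*)^\top (x - x^*) \ge 0 \quad \text{for all } x \in X
% Gradient projection iteration with stepsize \alpha_k > 0:
x_{k+1} = P_X\bigl(x_k - \alpha_k \nabla f(x_k)\bigr)
```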
Typical questions for seminar classes (labs) within this section
- Implement the conditional gradient method.
- Implement the gradient projection method (a minimal sketch follows this list).
- Implement the two-metric projection method.
- Implement the manifold suboptimization method.
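
For the gradient projection task, a minimal sketch is given below for the special case of box constraints, where the projection reduces to componentwise clipping. It is not a reference solution; the names and the test problem are illustrative assumptions.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi} (componentwise clipping)."""
    return np.clip(x, lo, hi)

def gradient_projection(grad, x0, lo, hi, alpha=0.1, tol=1e-8, max_iter=1000):
    """Gradient projection iteration x_{k+1} = P_X(x_k - alpha * grad(x_k)) with fixed stepsize."""
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        x_new = project_box(x - alpha * grad(x), lo, hi)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative test: minimize ||x - c||^2 over the box [0, 1]^2 with c outside the box.
c = np.array([1.5, -0.5])
grad = lambda x: 2.0 * (x - c)
print(gradient_projection(grad, [0.5, 0.5], 0.0, 1.0))  # should approach [1.0, 0.0]
```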
Test questions for final assessment in this section
- Explain optimality conditions
- Explain feasible direction methods
- What are alternatives to gradient projection?
- Explain two-metric projection methods.
- Describe manifold suboptimization methods.
Section 3
Section title:
Constrained Optimization and Lagrange Multipliers
Topics covered in this section:
- Necessary conditions for equality constraints
- Sufficient conditions and sensitivity analysis
- Inequality constraints
- Linear constraints and duality
- Barrier and interior point methods
- Penalty and augmented Lagrangian methods
- Lagrangian and Primal-Dual Interior Point Methods
What forms of evaluation were used to test students’ performance in this section?
Form of evaluation | Yes/No
---|---
Development of individual parts of software product code | No
Homework and group projects | No
Midterm evaluation | No
Testing (written or computer based) | Yes
Reports | No
Essays | No
Oral polls | Yes
Discussions | Yes
Typical questions for ongoing performance evaluation within this section
- The penalty approach
- The elimination approach
- The Lagrangian function
- The augmented Lagrangian approach
- Karush-Kuhn-Tucker optimality conditions (a compact statement follows this list)
- Sufficiency conditions and Lagrangian minimization
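
As a reference for the questions above, the Karush-Kuhn-Tucker conditions for the problem of minimizing f(x) subject to h(x) = 0 and g(x) <= 0 can be written as follows (standard notation; a constraint qualification is assumed):

```latex
% Stationarity of the Lagrangian L(x, \lambda, \mu) = f(x) + \lambda^\top h(x) + \mu^\top g(x):
\nabla f(x^*) + \sum_i \lambda_i^* \nabla h_i(x^*) + \sum_j \mu_j^* \nabla g_j(x^*) = 0
% Primal feasibility:
h(x^*) = 0, \qquad g(x^*) \le 0
% Dual feasibility and complementary slackness:
\mu^* \ge 0, \qquad \mu_j^* \, g_j(x^*) = 0 \ \text{for all } j
```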
Typical questions for seminar classes (labs) within this section
- Implement barrier and interior point methods
- Implement the quadratic penalty function method (a minimal sketch follows this list)
- Implement the method of multipliers
- Implement a Newton-like method for equality constraints
- Implement the primal-dual interior point method
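
For the quadratic penalty task, a minimal sketch is given below. It is not a reference solution: it assumes SciPy's general-purpose minimize routine (BFGS) for the inner unconstrained subproblems, and the function names and test problem are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, h, x0, c0=1.0, c_growth=10.0, outer_iters=6):
    """Minimize f(x) + (c/2) * ||h(x)||^2 for an increasing penalty parameter c,
    warm-starting each inner unconstrained minimization at the previous solution."""
    x = np.asarray(x0, dtype=float)
    c = c0
    for _ in range(outer_iters):
        penalized = lambda z, c=c: f(z) + 0.5 * c * np.sum(h(z) ** 2)
        x = minimize(penalized, x, method="BFGS").x  # inner solve via SciPy's BFGS
        c *= c_growth
    return x

# Illustrative test: minimize ||x||^2 subject to x1 + x2 - 1 = 0 (solution (0.5, 0.5)).
f = lambda x: x @ x
h = lambda x: np.array([x[0] + x[1] - 1.0])
print(quadratic_penalty(f, h, np.array([0.0, 0.0])))  # should approach [0.5, 0.5]
```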
Test questions for final assessment in this section
- Describe sufficiency conditions and Lagrangian minimization
- What are the Karush-Kuhn-Tucker optimality conditions?
- Explain penalty and augmented Lagrangian methods
- Describe barrier and interior point methods
Section 4
Section title:
Duality, Convex Programming and Dual Methods
Topics covered in this section:
- The dual problem
- Convex cost - linear constraints
- Convex cost - convex constraints
- Conjugate functions and Fenchel duality
- Dual ascent methods for differentiable dual problems
- Nondifferentiable optimization methods
What forms of evaluation were used to test students’ performance in this section?
Form of evaluation | Yes/No
---|---
Development of individual parts of software product code | No
Homework and group projects | Yes
Midterm evaluation | No
Testing (written or computer based) | Yes
Reports | No
Essays | No
Oral polls | Yes
Discussions | Yes
Typical questions for ongoing performance evaluation within this section
- Lagrange multipliers
- Characterization of primal and dual optimal solutions (the dual problem and weak duality are summarized after this list)
- Network optimization
- Coordinate ascent for quadratic programming.
- Subgradient methods
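
As a reference for the questions above, the dual function and the weak duality relation for minimizing f(x) subject to h(x) = 0 and g(x) <= 0 can be summarized as follows (standard notation):

```latex
% Dual function: infimum of the Lagrangian over the primal variable.
q(\lambda, \mu) = \inf_{x} \bigl\{ f(x) + \lambda^\top h(x) + \mu^\top g(x) \bigr\}
% Dual problem:
\max_{\lambda,\; \mu \ge 0} \; q(\lambda, \mu)
% Weak duality: the optimal dual value never exceeds the optimal primal value.
q^* \le f^*
```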
Typical questions for seminar classes (labs) within this section
- Implement the subgradient method (a minimal sketch follows this list)
- Implement approximate and incremental subgradient methods
- Implement the cutting plane method.
- Implement ascent and approximate ascent methods.
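
For the subgradient method task, a minimal sketch with a diminishing stepsize is given below. It is not a reference solution; the names and the l1-norm test problem are illustrative assumptions.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, max_iter=5000):
    """Subgradient iteration x_{k+1} = x_k - alpha_k * g_k with g_k in the
    subdifferential and diminishing stepsize alpha_k = 1/(k+1). Because this is
    not a descent method, the best iterate seen so far is returned."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(max_iter):
        x = x - (1.0 / (k + 1)) * subgrad(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x

# Illustrative test: f(x) = ||x - c||_1; sign(x - c) is a subgradient; the minimizer is c.
c = np.array([1.0, -2.0])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)
print(subgradient_method(f, subgrad, np.zeros(2)))  # should approach [1.0, -2.0]
```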
Test questions for final assessment in this section
- The weak duality theorem.
- Separable problems and their geometry
- Lagrangian relaxation