Course Title and Code
Optimal Control (EE5521)
Course Credit
3-0-0-3 (Lecture-Tutorial-Practical-Credit)
Course Category
ERC
Target Programme
MS
Prerequisite
Knowledge of state space representation
Consent of Teacher Required
Required
Learning Outcomes
At the end of the course, students should be able to formulate optimal control problems, apply the variational approach, state and solve the linear quadratic regulator (LQR) problem, and give an overview of robust control methods.
Teaching Methodology
Classroom lectures
Assessment Methods
Written examination, Continuous assessment
Course Content
S. No. | Topics | Lecture Hours |
---|---|---|
1 | Review of parametric optimization, Concept of a functional, Variational problems and performance indices, Euler-Lagrange equation to find the extremal of a functional, Transversality condition, Application of the variational approach to control problems (a worked statement of the Euler-Lagrange condition is sketched after this table). | 10 |
2 | A brief review of stability, Lyapunov stability and LaSalle’s Invariance Principle. Linear quadratic regulator (LQR) problem for finite and infinite horizons, Optimal solution of the LQR problem, Different techniques for solution of the algebraic Riccati equation (an illustrative Riccati-based LQR computation appears after this table), Stability and robustness properties of LQR design, Optimal control with constraints on the input, Optimal saturating controllers, Hamilton-Jacobi-Bellman equation, Dynamic programming and the principle of optimality, Time-optimal control problem, Pontryagin’s minimum principle. | 22 |
3 | Concept of system and signal norms, Small-gain theorem, Physical interpretation of the H-infinity and H-2 norms, Computation of the H-infinity norm (a bisection-based computation is sketched after this table), Statement of the H-infinity control problem, H-infinity control problem: synthesis. Discussion on the stability margin and performance of H-infinity-based controlled systems. | 7 |
4 | Extended topics: Brief overview of optimization problems in control, Linear quadratic Gaussian (LQG) control, Connections between optimal control and game theory. | 3 |
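
To make the unit 1 material concrete, the following is a minimal statement of the fixed-endpoint variational problem and the Euler-Lagrange necessary condition, written in standard calculus-of-variations notation (it is not tied to any particular textbook listed below).

```latex
% Fixed-endpoint problem: find x(t) that extremizes the functional J,
% with the boundary values x(t_0) and x(t_f) given.
\[
  J(x) = \int_{t_0}^{t_f} L\bigl(t, x(t), \dot{x}(t)\bigr)\, dt
\]
% A necessary condition for x(t) to be an extremal is the Euler-Lagrange equation:
\[
  \frac{\partial L}{\partial x} - \frac{d}{dt}\,\frac{\partial L}{\partial \dot{x}} = 0
\]
% When the terminal state is free, the fixed boundary condition at t_f is replaced
% by the transversality condition  \partial L / \partial \dot{x} = 0  at  t = t_f.
```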
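
For unit 2, the sketch below shows one common way to compute the infinite-horizon LQR gain from the algebraic Riccati equation, assuming NumPy and SciPy are available; the double-integrator plant and the weights Q and R are illustrative choices, not part of the course material.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative plant (not from the course): double integrator  x_dot = A x + B u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Quadratic cost  J = integral( x' Q x + u' R u ) dt  with illustrative weights
Q = np.diag([1.0, 1.0])
R = np.array([[1.0]])

# Solve the algebraic Riccati equation  A'P + P A - P B R^{-1} B' P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Optimal state feedback  u = -K x  with  K = R^{-1} B' P
K = np.linalg.solve(R, B.T @ P)

# The closed-loop matrix A - B K should be Hurwitz (eigenvalues in the open left half-plane)
print("K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```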
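
For unit 3, the sketch below illustrates one standard bisection approach to computing the H-infinity norm of a stable, strictly proper system (D = 0): a candidate level gamma exceeds the norm exactly when the associated Hamiltonian matrix has no imaginary-axis eigenvalues. The helper name hinf_norm_bisection, the tolerances, and the example system are illustrative assumptions; dedicated algorithms (e.g., Bruinsma-Steinbuch) would be used in practice.

```python
import numpy as np

def hinf_norm_bisection(A, B, C, tol=1e-6):
    """Approximate the H-infinity norm of G(s) = C (sI - A)^{-1} B, with D = 0
    and A Hurwitz, by bisection on gamma: ||G||_inf < gamma iff the Hamiltonian
        H(gamma) = [[A, B B^T / gamma^2], [-C^T C, -A^T]]
    has no eigenvalues on the imaginary axis."""
    def has_imaginary_eig(gamma):
        H = np.block([[A, (B @ B.T) / gamma**2],
                      [-(C.T @ C), -A.T]])
        eigs = np.linalg.eigvals(H)
        return np.any(np.abs(eigs.real) < 1e-9)

    lo, hi = 0.0, 1.0
    while has_imaginary_eig(hi):      # grow the upper bound until it exceeds the norm
        hi *= 2.0
    while hi - lo > tol:              # bisect between the lower and upper bounds
        mid = 0.5 * (lo + hi)
        if has_imaginary_eig(mid):
            lo = mid                  # mid is below the norm
        else:
            hi = mid                  # mid is above the norm
    return hi

# Illustrative stable system with G(s) = (s + 5) / ((s + 1)(s + 3))
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])
B = np.array([[1.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
print("||G||_inf ~", hinf_norm_bisection(A, B, C))
```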
Textbooks
- D.S. Naidu, "Optimal Control Systems," 1st Edition, CRC Press, 2002, ISBN: 978-0849308925.
- Donald E. Kirk, "Optimal Control Theory: An Introduction," Dover Publications, 2004, ISBN: 978-0486434841.
- Dimitri P. Bertsekas, "Dynamic Programming and Optimal Control," 4th Edition, Athena Scientific, 2017, ISBN: 978-1886529434.
Reference Books
- Arturo Locatelli, "Optimal Control: An Introduction," Birkhäuser, 2001, ISBN: 978-3764364083.
- Daniel Liberzon, "Calculus of Variations and Optimal Control Theory: A Concise Introduction," Princeton University Press, 2011, ISBN: 978-0691151878.
- Mark Kot, "A First Course in the Calculus of Variations," Vol. 72, American Mathematical Society, 2014, ISBN: 978-1470414955.
- Mike Mesterton-Gibbons, "A Primer on the Calculus of Variations and Optimal Control Theory," Vol. 50, American Mathematical Society, 2009, ISBN: 978-0821847725.