Optimal Control (EE 60565)
University of Notre Dame
Fall 2015 - Enroll in EE 60565, Section 01
Instructor: M.D. Lemmon - Office Hours: T/Th 3-5 - Fitzpatrick 264
Description: Optimal control is concerned with
control laws that maximize a specified measure of
a dynamical system's performance. This course is a rigorous introduction to
the classical theory of optimal control. Topics covered in
this course include the optimization of static functions, the calculus of
variations, Pontryagin's maximum principle, dynamic programming, linear-quadratic
optimal control, and non-cooperative differential games, with
applications to control theory.
Topics:
- Review of Mathematical Programming
Convex Analysis, Constrained and Unconstrained Problems, KKT Conditions
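As a warm-up for the KKT material, a minimal numerical sketch (the problem below is an illustrative choice, not taken from the course notes): minimize f(x) = x1^2 + x2^2 subject to x1 + x2 = 1. Stationarity of the Lagrangian gives x* = (1/2, 1/2) with multiplier lam = -1, which a brute-force search along the constraint confirms.

```python
# Illustrative equality-constrained problem (not from the course notes):
#   minimize f(x) = x1^2 + x2^2   subject to   g(x) = x1 + x2 - 1 = 0.
# KKT stationarity: grad f(x*) + lam * grad g(x*) = 0, i.e.
#   2*x1 + lam = 0,  2*x2 + lam = 0,  together with x1 + x2 = 1.

def f(x1, x2):
    return x1 ** 2 + x2 ** 2

def solve_on_constraint(steps=20001):
    # Brute-force the constrained minimum by parameterizing
    # x1 = t, x2 = 1 - t and sweeping t over [-1, 2].
    best_t, best_val = None, float("inf")
    for i in range(steps):
        t = -1.0 + 3.0 * i / (steps - 1)
        val = f(t, 1.0 - t)
        if val < best_val:
            best_t, best_val = t, val
    return best_t

t_star = solve_on_constraint()
x_star = (t_star, 1.0 - t_star)
# grad f = (2*x1, 2*x2) must be parallel to grad g = (1, 1) at the optimum,
# so the multiplier is lam = -2*x1 there.
lam = -2.0 * x_star[0]
print(x_star, lam)   # approximately (0.5, 0.5) and -1.0
```

The search recovers the stationary point of the Lagrangian, confirming that the KKT conditions pick out the constrained minimizer here.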
- Calculus of Variations and Optimal Control
Basic CoV Problem, Euler-Lagrange Equations, CoV and Optimal Control, Numerical Methods
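The basic CoV problem can be previewed with a textbook-style sketch (the functional below is my choice, not necessarily the course's): minimize the integral of xdot(t)^2 over [0, 1] with x(0) = 0, x(1) = 1. The Euler-Lagrange equation d/dt (dL/dxdot) - dL/dx = 0 gives xddot = 0, so the extremal is the straight line x*(t) = t, and any admissible perturbation should cost more.

```python
import math

def cost(path, n=1000):
    # Forward-difference approximation of J[x] = integral of xdot(t)^2 dt
    # over [0, 1].
    dt = 1.0 / n
    total = 0.0
    for i in range(n):
        xdot = (path((i + 1) * dt) - path(i * dt)) / dt
        total += xdot ** 2 * dt
    return total

def straight(t):
    # Euler-Lagrange solution x*(t) = t (xddot = 0, boundary conditions met).
    return t

def perturbed(t):
    # Admissible comparison path: same endpoints, sinusoidal variation added.
    return t + 0.1 * math.sin(math.pi * t)

print(cost(straight), cost(perturbed))  # the straight line costs less
```

Numerically, J[straight] = 1 while the perturbed path's cost is strictly larger, as the Euler-Lagrange necessary condition predicts.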
- Maximum Principle
Statement of MP, Proof of MP, Bang-bang Control
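Bang-bang control can be previewed with the standard minimum-time double integrator (the specific instance below is illustrative): steer xddot = u, |u| <= 1, from (x, v) = (1, 0) to the origin. The maximum principle gives u*(t) = -sign(p2(t)); since the costate p2 is affine in t, the optimal control switches sign at most once, i.e. full braking followed by full acceleration.

```python
import math

# Illustrative minimum-time problem (my choice of instance): double
# integrator xddot = u with |u| <= 1, from (x, v) = (1, 0) to (0, 0).
# The bang-bang law brakes with u = -1 until the switching time, then
# applies u = +1; for initial state (x0, 0) the switch occurs at
# t1 = sqrt(x0) and the origin is reached at T = 2*sqrt(x0).

def arc(x, v, u, t):
    """State after time t under constant control u (exact integration)."""
    return x + v * t + 0.5 * u * t * t, v + u * t

x0 = 1.0
t1 = math.sqrt(x0)                 # switching time
x1, v1 = arc(x0, 0.0, -1.0, t1)    # first bang arc: u = -1
x2, v2 = arc(x1, v1, +1.0, t1)     # second bang arc: u = +1
print((x2, v2), 2 * t1)            # final state (0, 0) at time 2
```

Integrating each constant-control arc in closed form avoids the chattering that a naive discretized switching law would exhibit near the switching curve.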
- Dynamic Programming
Principle of Optimality, Bellman Equation, Nonsmooth Solutions, LQR, Stochastic Control
- Game Theory and Optimal Control
Matrix Games, Differential Games, HJI Equation, H-infinity Control, Pareto-Efficient Solutions
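The matrix-game material can be previewed with a 2x2 zero-sum example (the payoff matrix is my choice): for A = [[2, 0], [1, 3]], with the row player maximizing and the column player minimizing, there is no pure saddle point, so each player mixes to make the opponent indifferent between pure strategies.

```python
# Illustrative 2x2 zero-sum matrix game (payoffs are my choice):
# row player maximizes, column player minimizes A[i][j].
A = [[2.0, 0.0], [1.0, 3.0]]

# Row mix (p, 1-p) equalizes the two column payoffs:
#   p*A[0][0] + (1-p)*A[1][0] = p*A[0][1] + (1-p)*A[1][1].
p = (A[1][1] - A[1][0]) / (A[0][0] - A[1][0] - A[0][1] + A[1][1])

# Column mix (q, 1-q) equalizes the two row payoffs:
#   q*A[0][0] + (1-q)*A[0][1] = q*A[1][0] + (1-q)*A[1][1].
q = (A[1][1] - A[0][1]) / (A[0][0] - A[0][1] - A[1][0] + A[1][1])

# Value of the game at the mixed saddle point.
value = p * (q * A[0][0] + (1 - q) * A[0][1]) + \
        (1 - p) * (q * A[1][0] + (1 - q) * A[1][1])
print(p, q, value)   # mixes (0.5, 0.75) and value 1.5
```

For this matrix the maximin over pure strategies (1) differs from the minimax (2), so the saddle point only exists in mixed strategies, with value 3/2.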
References:
- Basar, Tamer, and Geert Jan Olsder. Dynamic Noncooperative Game Theory. 2nd ed. London: Academic Press, 1995.
- Bazaraa, Mokhtar S., Hanif D. Sherali, and C. M. Shetty. Nonlinear Programming: Theory and Algorithms. 2nd ed. John Wiley, 1993.
- Dockner, Engelbert, et al. Differential Games in Economics and Management Science. Cambridge University Press, 2000.
- Dorato, Peter, Vito Cerone, and Chaouki Abdallah. Linear-Quadratic Control: An Introduction. Simon & Schuster, 1994.
- Fleming, Wendell, and Raymond Rishel. Deterministic and Stochastic Optimal Control. Springer, 1975.
- Rudin, Walter. Principles of Mathematical Analysis. 3rd ed. McGraw-Hill, 1976.