Optimal Control (EE 60565)

University of Notre Dame

Fall 2015 - Enroll in EE60565 section 01
Location: TBD
Office Hours: M.D. Lemmon - T/Th 3-5 - Fitzpatrick 264

Course Vault

Description: Optimal control is concerned with control laws that maximize a specified measure of a dynamical system's performance. This course is a rigorous introduction to the classical theory of optimal control. Topics include optimization of static functions, the calculus of variations, Pontryagin's maximum principle, dynamic programming, linear-quadratic optimal control, and non-cooperative differential games, with applications to control theory.
  • Review of Mathematical Programming
    Convex Analysis, Constrained and Unconstrained Problems, KKT Conditions
  • Calculus of Variations and Optimal Control
    Basic CoV Problem, Euler-Lagrange Equations, CoV and Optimal Control, Numerical Methods
  • Maximum Principle
    Statement of MP, Proof of MP, Bang-bang Control
  • Dynamic Programming
    Principle of Optimality, Bellman Equation, Nonsmooth solutions, LQR, Stochastic Control
  • Game Theory and Optimal Control
    Matrix Games, Differential Games, HJI Equation, H-infinity Control, Pareto-Efficient Solutions
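As a preview of the dynamic-programming and LQR material, the following is a minimal sketch in Python of a scalar discrete-time LQR problem: iterating the Riccati recursion to a fixed point yields the optimal state-feedback gain. The parameter values (a, b, q, r) are illustrative choices, not taken from the course materials.

```python
def lqr_gain(a, b, q, r, tol=1e-12, max_iter=10_000):
    """Iterate the scalar discrete-time Riccati recursion
        P <- q + a^2 P - (a b P)^2 / (r + b^2 P)
    to a fixed point, then return the optimal feedback gain K
    (so u_k = -K x_k) and the cost-to-go coefficient P."""
    p = q
    for _ in range(max_iter):
        p_next = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
        if abs(p_next - p) < tol:
            p = p_next
            break
        p = p_next
    k = a * b * p / (r + b * b * p)
    return k, p

# Illustrative system: x_{k+1} = x_k + u_k with cost sum(x^2 + u^2).
# Here the Riccati fixed point is the golden ratio, P = (1 + sqrt(5))/2.
k, p = lqr_gain(a=1.0, b=1.0, q=1.0, r=1.0)
print(k, p)  # K ~ 0.618, P ~ 1.618; closed loop x_{k+1} = 0.382 x_k
```

The closed-loop dynamics x_{k+1} = (a - b K) x_k are stable since |a - b K| < 1, which is exactly what the LQR guarantee from the dynamic-programming unit predicts for a stabilizable, detectable problem.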

Grading: Recitations 50% - Term Paper 50%
Instructor: Michael Lemmon, Dept. of Electrical Engineering, University of Notre Dame, Fitzpatrick 264, lemmon at nd.edu
Textbook: D. Liberzon, Calculus of Variations and Optimal Control Theory: A Concise Introduction, Princeton University Press, 2012.
Additional References:
  1. Basar, Tamer, et al. Dynamic Noncooperative Game Theory. Vol. 200. London: Academic Press, 1995.
  2. Bazaraa, Sherali, and Shetty. Nonlinear Programming: Theory and Algorithms. 2nd ed. John Wiley, 1993.
  3. Dockner, Engelbert, et al. Differential Games in Economics and Management Science. Cambridge University Press, 2000.
  4. Dorato, Peter, Vito Cerone, and Chaouki Abdallah. Linear-Quadratic Control: An Introduction. Simon & Schuster, 1994.
  5. Fleming, Wendell, and Raymond Rishel. Deterministic and Stochastic Optimal Control. Springer, 1975.
  6. Rudin, Walter. Principles of Mathematical Analysis. 3rd ed. McGraw-Hill, 1976.