# Exam summary: Dynamic Programming and Optimal Control

In the autumn semester of 2018, I took the course Dynamic Programming and Optimal Control. The summary I brought with me to the exam is available here, both as a PDF and as LaTeX source. Here's an overview of the topics the course covered:

- Introduction to Dynamic Programming
  - Problem statement
  - Open-loop and closed-loop control
  - Discrete-state and finite-state problems

- The Dynamic Programming Algorithm (DPA)
  - The standard problem formulation
  - The principle of optimality
  - The Dynamic Programming Algorithm
  - Is expected value a good choice for the cost?

- The Dynamic Programming Algorithm (cont’d)
  - Converting non-standard problems to standard form

- Infinite horizon problems
  - The Stochastic Shortest Path (SSP) problem
  - Theorem: SSP and the Bellman Equation (BE)
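For context, the Bellman equation referred to here is, for an SSP problem with states $i = 1, \dots, n$, termination state $t$, controls $u \in U(i)$, transition probabilities $p_{ij}(u)$, and expected stage cost $g(i, u)$ (a standard formulation; the summary's own notation may differ slightly):

```latex
J^*(i) = \min_{u \in U(i)} \left[ g(i, u) + \sum_{j=1}^{n} p_{ij}(u) \, J^*(j) \right],
\qquad i = 1, \dots, n, \qquad J^*(t) = 0.
```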

- Solving the Bellman Equation
  - Value Iteration (VI)
  - Policy Iteration (PI)
  - Analogy and comparison between VI and PI
  - Variants of VI and PI
  - Connections to Linear Algebra
  - Linear Programming (LP)
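To give a flavour of the first of these methods, here is a minimal sketch of Value Iteration on a discounted problem. The states, actions, transition probabilities, and stage costs below are made-up toy data for illustration, not anything from the summary:

```python
# Value Iteration sketch for a small discounted MDP (toy data).
import numpy as np

n_states, n_actions = 3, 2
gamma = 0.9  # discount factor

# P[a][i][j] = probability of moving from state i to state j under action a
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.5, 0.0, 0.5]],
])
# g[i][a] = stage cost of taking action a in state i
g = np.array([[2.0, 0.5], [1.0, 3.0], [0.5, 1.0]])

V = np.zeros(n_states)
for _ in range(1000):
    # Apply the Bellman operator: minimize expected cost-to-go over actions.
    Q = g + gamma * np.einsum('aij,j->ia', P, V)
    V_new = Q.min(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-9:
        break  # converged to (numerically) a fixed point of the BE
    V = V_new

policy = Q.argmin(axis=1)  # greedy policy with respect to the converged V
```

Because the Bellman operator is a contraction for discounted problems, the loop converges to the unique fixed point regardless of the initial guess for `V`.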

- Discounted Problems
- Shortest Path problems and Deterministic Finite State systems
  - The Shortest Path (SP) problem
  - The Deterministic Finite State (DFS) problem
  - Equivalence of SP and DFS
  - Hidden Markov Models (HMM) and the Viterbi algorithm
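The Viterbi algorithm is itself a shortest-path-style DP over the HMM trellis. A minimal sketch, using a made-up two-state, three-symbol HMM (all the numbers below are toy data):

```python
# Viterbi sketch: most likely hidden state sequence for a toy HMM.
import numpy as np

A = np.array([[0.7, 0.3], [0.4, 0.6]])            # transition probabilities
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])  # emission probabilities
pi = np.array([0.6, 0.4])                         # initial distribution
obs = [0, 1, 2]                                   # observed symbol sequence

# Work in log-space to avoid underflow. delta[t, i] is the log-probability
# of the best state path ending in state i at time t.
logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
T, N = len(obs), len(pi)
delta = np.zeros((T, N))
psi = np.zeros((T, N), dtype=int)  # backpointers

delta[0] = logpi + logB[:, obs[0]]
for t in range(1, T):
    scores = delta[t - 1][:, None] + logA  # shape (from_state, to_state)
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + logB[:, obs[t]]

# Backtrack along the stored pointers to recover the best path.
path = [int(delta[-1].argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(psi[t][path[-1]]))
path.reverse()
```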

- Shortest Path algorithms
  - Label-correcting methods (LCA)
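A minimal sketch of the label-correcting idea on a toy graph (node names and edge costs below are made up). Using a priority queue for the OPEN list gives the best-first (Dijkstra-like) member of the family; a FIFO queue would give a Bellman-Ford-like variant:

```python
# Label-correcting shortest path, best-first variant (toy graph).
import heapq

graph = {
    's': [('a', 1.0), ('b', 4.0)],
    'a': [('b', 2.0), ('t', 6.0)],
    'b': [('t', 1.0)],
    't': [],
}

def shortest_path_cost(graph, source, target):
    # labels[v] = cheapest cost-to-arrive found so far (initially infinity)
    labels = {v: float('inf') for v in graph}
    labels[source] = 0.0
    open_list = [(0.0, source)]  # OPEN list as a priority queue
    while open_list:
        d, u = heapq.heappop(open_list)
        if d > labels[u]:
            continue  # stale entry, a better label was found meanwhile
        for v, cost in graph[u]:
            # Correct the label of v if the path through u is cheaper.
            if d + cost < labels[v]:
                labels[v] = d + cost
                heapq.heappush(open_list, (labels[v], v))
    return labels[target]
```

For this graph, `shortest_path_cost(graph, 's', 't')` follows s → a → b → t at total cost 4.0.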

- Deterministic continuous-time optimal control and the Hamilton-Jacobi-Bellman equation
  - The Hamilton-Jacobi-Bellman (HJB) equation
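For context, the HJB equation for a system $\dot{x}(t) = f(x(t), u(t))$ with stage cost $g$ and terminal cost $h$ reads, in one standard formulation (the summary's notation may differ slightly):

```latex
0 = \min_{u \in U} \left[ g(x, u)
  + \frac{\partial J^*}{\partial t}(t, x)
  + \frac{\partial J^*}{\partial x}(t, x)^{\top} f(x, u) \right],
\qquad J^*(T, x) = h(x).
```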

- Pontryagin’s minimum principle
  - Notation
  - The minimum principle
  - Fixed terminal state
  - Free initial state
  - Free terminal time
  - Time-varying system and cost
  - Singular problems
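For context, the minimum principle in its basic form (standard notation; the summary's symbols may differ): defining the Hamiltonian $H(x, u, p) = g(x, u) + p^{\top} f(x, u)$, an optimal trajectory $x^*$ with costate $p$ satisfies

```latex
\dot{p}(t) = -\frac{\partial H}{\partial x}\big(x^*(t), u^*(t), p(t)\big),
\qquad
u^*(t) = \arg\min_{u \in U} H\big(x^*(t), u, p(t)\big),
```

with boundary conditions determined by which of the cases above (fixed or free terminal state, free terminal time, and so on) applies.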