# CMU 16-745: Optimal Control
# Continuous-Time Dynamics
The most general form for a smooth system:

$$\dot{x} = f(x, u)$$

- $x \in \mathbb{R}^n$ is the state
- $u \in \mathbb{R}^m$ is the input
- $f$ represents the dynamics
For a mechanical system:

$$x = \begin{bmatrix} q \\ v \end{bmatrix}$$

- $q$: configuration (not always a vector)
- $v$: velocity
# Example: Pendulum
- Equation of motion: $ml^2\ddot{\theta} + mgl\sin\theta = \tau$ (where $\tau = u$)
- State variables: $q = \theta$, $v = \dot{\theta}$
- State vector:

$$x = \begin{bmatrix} \theta \\ \dot{\theta} \end{bmatrix} \;\Rightarrow\; \dot{x} = \begin{bmatrix} \dot{\theta} \\ \ddot{\theta} \end{bmatrix} = f(x, u) = \begin{bmatrix} \dot{\theta} \\ -\frac{g}{l}\sin\theta + \frac{1}{ml^2}u \end{bmatrix}$$
- Manifold: $q \in S^1$ (circle), so $x \in S^1 \times \mathbb{R}$ (cylinder)
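The pendulum dynamics above can be sketched in code. This is a minimal illustration, assuming unit mass and length and $g = 9.81$ (parameter values are not from the notes):

```python
import numpy as np

# Illustrative parameter values (assumed, not from the notes)
m, l, g = 1.0, 1.0, 9.81

def pendulum_dynamics(x, u):
    """Continuous-time pendulum dynamics: xdot = f(x, u)."""
    theta, theta_dot = x
    theta_ddot = -(g / l) * np.sin(theta) + u / (m * l**2)
    return np.array([theta_dot, theta_ddot])

# At the downward rest state with zero torque, the state does not change.
xdot = pendulum_dynamics(np.array([0.0, 0.0]), 0.0)
```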
# Control-Affine Systems
$$\dot{x} = f_0(x) + B(x)u$$

- $f_0(x)$: "drift"
- $B(x)$: "input Jacobian"
- Most systems can be put in this form.
- Pendulum example:

$$f_0(x) = \begin{bmatrix} \dot{\theta} \\ -\frac{g}{l}\sin\theta \end{bmatrix}, \quad B(x) = \begin{bmatrix} 0 \\ \frac{1}{ml^2} \end{bmatrix}$$
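The control-affine decomposition can be checked numerically: the drift plus the input term should reproduce the full dynamics. A minimal sketch, with assumed unit parameters and $g = 9.81$:

```python
import numpy as np

m, l, g = 1.0, 1.0, 9.81  # illustrative values (assumed)

def f0(x):
    """Drift term: the dynamics with zero input."""
    return np.array([x[1], -(g / l) * np.sin(x[0])])

def B(x):
    """Input Jacobian (constant for the pendulum)."""
    return np.array([0.0, 1.0 / (m * l**2)])

def f(x, u):
    """Full dynamics recovered from the control-affine form f0(x) + B(x) u."""
    return f0(x) + B(x) * u

x, u = np.array([0.3, -0.2]), 0.5
xdot = f(x, u)
```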
# Manipulator Dynamics
$$M(q)\dot{v} + C(q, v) = B(q)u + F$$
- M(q): "Mass matrix" or "Inertia tensor"
- C(q,v): "Dynamic Bias" (Coriolis + gravity)
- B(q): "Input Jacobian"
- F: "External forces"
Velocity kinematics: $\dot{q} = G(q)v$

State-space form:

$$\dot{x} = f(x, u) = \begin{bmatrix} G(q)v \\ M(q)^{-1}\big(B(q)u + F - C(q, v)\big) \end{bmatrix}$$

- Pendulum example: $M(q) = ml^2$, $C(q, v) = mgl\sin\theta$, $B = 1$, $G = I$
- All mechanical systems can be written like this.
- Just rewriting the Euler-Lagrange equations, with Lagrangian $L = \frac{1}{2}v^T M(q)v - U(q)$
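The manipulator-dynamics form can be made concrete with the pendulum, where every term is a scalar. A minimal sketch with assumed unit parameters, $g = 9.81$, and no external force by default:

```python
import numpy as np

m, l, g = 1.0, 1.0, 9.81  # illustrative values (assumed)

def manipulator_dynamics(q, v, u, F=0.0):
    """Pendulum in manipulator form: M(q) vdot + C(q, v) = B(q) u + F."""
    M = m * l**2               # mass matrix (a scalar for one DOF)
    C = m * g * l * np.sin(q)  # dynamic bias (only gravity here; no Coriolis)
    B = 1.0                    # input Jacobian
    G = 1.0                    # velocity kinematics: qdot = G v
    qdot = G * v
    vdot = (B * u + F - C) / M  # M^{-1} (B u + F - C)
    return np.array([qdot, vdot])

xdot = manipulator_dynamics(0.3, -0.2, 0.5)
```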
# Linear Systems
$$\dot{x} = A(t)x + B(t)u$$

- Called "time-invariant" if $A(t) = A$ and $B(t) = B$.
- Called "time-varying" otherwise.
- We often approximate nonlinear systems with linear ones:

$$\dot{x} = f(x, u) \;\Rightarrow\; A = \frac{\partial f}{\partial x}, \quad B = \frac{\partial f}{\partial u}$$
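One way to compute these Jacobians numerically is central finite differences. A minimal sketch applied to the pendulum, with assumed unit parameters and $g = 9.81$ (in practice automatic differentiation is often preferred):

```python
import numpy as np

m, l, g = 1.0, 1.0, 9.81  # illustrative values (assumed)

def f(x, u):
    return np.array([x[1], -(g / l) * np.sin(x[0]) + u / (m * l**2)])

def linearize(f, x0, u0, eps=1e-6):
    """Finite-difference Jacobians A = df/dx and B = df/du at (x0, u0)."""
    n = len(x0)
    A = np.zeros((n, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        A[:, i] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    Bcol = (f(x0, u0 + eps) - f(x0, u0 - eps)) / (2 * eps)
    return A, Bcol.reshape(n, 1)

# Linearize about the downward equilibrium (theta = 0, u = 0):
A, Bm = linearize(f, np.array([0.0, 0.0]), 0.0)
# A should approximate [[0, 1], [-g/l, 0]] and Bm should approximate [[0], [1/(m l^2)]]
```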
# Equilibria
- A point where the system "remains at rest": $\dot{x} = f(x, u) = 0$.
- Algebraically, these are the roots of the dynamics.
- Pendulum Example:
$$\dot{x} = \begin{bmatrix} \dot{\theta} \\ -\frac{g}{l}\sin\theta \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \;\Rightarrow\; \dot{\theta} = 0, \quad \theta \in \{0, \pi\}$$
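These roots can be verified numerically by evaluating the dynamics at each candidate equilibrium. A minimal check with assumed unit parameters and $g = 9.81$:

```python
import numpy as np

m, l, g = 1.0, 1.0, 9.81  # illustrative values (assumed)

def f(x, u):
    return np.array([x[1], -(g / l) * np.sin(x[0]) + u / (m * l**2)])

# With u = 0, the roots of the dynamics are theta in {0, pi} with theta_dot = 0:
equilibria = [np.array([0.0, 0.0]), np.array([np.pi, 0.0])]
residuals = [np.linalg.norm(f(x_eq, 0.0)) for x_eq in equilibria]
```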
# First Control Problem:
- Can I move the equilibria?
Example: hold the pendulum at $\theta = \pi/2$:

$$\dot{x} = \begin{bmatrix} \dot{\theta} \\ -\frac{g}{l}\sin(\pi/2) + \frac{1}{ml^2}u \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$

$$\frac{1}{ml^2}u = \frac{g}{l}\sin(\pi/2) \;\Rightarrow\; u = mgl\sin(\pi/2) = mgl$$
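The feedforward torque above can be checked numerically: with $u = mgl$, the dynamics vanish at the horizontal state, so it becomes an equilibrium. A minimal sketch with assumed unit parameters and $g = 9.81$:

```python
import numpy as np

m, l, g = 1.0, 1.0, 9.81  # illustrative values (assumed)

def f(x, u):
    return np.array([x[1], -(g / l) * np.sin(x[0]) + u / (m * l**2)])

u_ff = m * g * l * np.sin(np.pi / 2)  # feedforward torque, equals m*g*l
x_hold = np.array([np.pi / 2, 0.0])   # pendulum held horizontal
# f(x_hold, u_ff) vanishes, so the horizontal state is now an equilibrium.
```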
# Stability of Equilibrium
# Example: Pendulum
$$f(x) = \begin{bmatrix} \dot{\theta} \\ -\frac{g}{l}\sin\theta \end{bmatrix} \;\Rightarrow\; \frac{\partial f}{\partial x} = \begin{bmatrix} 0 & 1 \\ -\frac{g}{l}\cos\theta & 0 \end{bmatrix}$$

At $\theta = \pi$ (upward):

$$\left.\frac{\partial f}{\partial x}\right|_{\theta=\pi} = \begin{bmatrix} 0 & 1 \\ g/l & 0 \end{bmatrix} \;\Rightarrow\; \mathrm{eig}\left(\frac{\partial f}{\partial x}\right) = \pm\sqrt{\frac{g}{l}} \;\Rightarrow\; \text{unstable}$$

At $\theta = 0$ (downward):

$$\left.\frac{\partial f}{\partial x}\right|_{\theta=0} = \begin{bmatrix} 0 & 1 \\ -g/l & 0 \end{bmatrix} \;\Rightarrow\; \mathrm{eig}\left(\frac{\partial f}{\partial x}\right) = \pm i\sqrt{\frac{g}{l}} \;\Rightarrow\; \text{marginally stable}$$

- Marginally stable: the purely imaginary eigenvalues correspond to undamped oscillations.
- Adding damping (e.g., $u = -k_d\dot{\theta}$) gives the eigenvalues a strictly negative real part.
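The three stability cases can be reproduced by computing the eigenvalues of the Jacobian directly. A minimal sketch with assumed unit parameters, $g = 9.81$, and an illustrative damping gain $k_d = 0.5$:

```python
import numpy as np

m, l, g = 1.0, 1.0, 9.81  # illustrative values (assumed)

def closed_loop_jacobian(theta, kd=0.0):
    """df/dx at (theta, 0) with damping feedback u = -kd * theta_dot."""
    return np.array([[0.0, 1.0],
                     [-(g / l) * np.cos(theta), -kd / (m * l**2)]])

eig_up   = np.linalg.eigvals(closed_loop_jacobian(np.pi))        # +/- sqrt(g/l): unstable
eig_down = np.linalg.eigvals(closed_loop_jacobian(0.0))          # +/- i sqrt(g/l): marginal
eig_damp = np.linalg.eigvals(closed_loop_jacobian(0.0, kd=0.5))  # strictly negative real part
```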