What is a limit cycle?
Isolated, closed trajectories.
- Not like a center.
- Centers are closed, but not isolated.
- Neighboring trajectories are NOT closed. Different forms:
- Stable - Trajectories pull onto the limit cycle
- Unstable - Trajectories are repelled by the limit cycle.
A limit cycle is an inherently nonlinear phenomenon.
You cannot determine whether a limit cycle exists by linearization.
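As a quick numerical sketch of this point (using a standard textbook example system that is not part of these notes, and assuming NumPy/SciPy are available): neighboring trajectories get pulled onto a stable limit cycle even though the linearization at the fixed point only sees a spiral.
```python
# Sketch only: a standard example system (not from these notes) with a known
# stable limit cycle on the unit circle:
#   x' = x - y - x(x^2 + y^2),  y' = x + y - y(x^2 + y^2)
import numpy as np
from scipy.integrate import solve_ivp

def f(t, s):
    x, y = s
    r2 = x**2 + y**2
    return [x - y - x * r2, x + y - y * r2]

# One start inside the cycle, one outside: both are pulled onto r = 1.
for s0 in ([0.1, 0.0], [2.0, 0.0]):
    sol = solve_ivp(f, (0.0, 30.0), s0, max_step=0.01)
    print(s0, "-> final radius", np.hypot(sol.y[0, -1], sol.y[1, -1]))

# The linearization at the origin (Jacobian [[1, -1], [1, 1]]) only predicts an
# unstable spiral; it says nothing about the closed orbit the spiral winds onto.
```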
How do we find limit cycles?
How do we rule out a closed loop?
Bendixson's Criterion:
If we have some flow field:
\dot{\vec{x}}= f(\vec x)
- If, in some simply connected region R, we can find a function \zeta(x,y) such that
\nabla \cdot (\zeta f) = \frac{\partial}{\partial x} (\zeta(x,y) f_1(x,y)) + \frac{\partial}{\partial y}(\zeta(x,y) f_2(x,y))
is of constant sign (does not change sign) in R, then there are no closed orbits contained in R.
- Finding \zeta is tricky.
Example:
\dot x = y
\dot y = -x -y + x^2 + y^2
Assume \zeta(x,y) = 1
\partial / \partial x (y) + \partial / \partial y (-x - y + x^2 + y^2) \rightarrow 0 + (-1 + 2y), which changes sign along y = 1/2, so this choice of \zeta is inconclusive.
Assume \zeta(x,y) = e^{\alpha x}
\partial / \partial x (e^{\alpha x} y) + \partial / \partial y (e^{\alpha x} (-x - y +x^2 +y^2))
\alpha e^{\alpha x} y + 2 y e^{\alpha x} - e^{\alpha x}
e^{\alpha x}((\alpha+2) y -1)
Now let \alpha = -2
\nabla \cdot (\zeta f) = -e^{-2 x} < 0 everywhere, so there are no closed orbits.
Now a special note: these functions only tell you where limit cycles can't be. If \nabla \cdot (\zeta f) doesn't change sign on a subset of R, there can't be a closed orbit contained entirely in that subset. There CAN still be a limit cycle that crosses the curve where the sign changes.
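A small SymPy sketch (the library choice is an assumption; this is just a verification aid) reproducing the divergence calculation from the example above:
```python
# Sketch: redo the divergence computation for the example with SymPy.
import sympy as sp

x, y, alpha = sp.symbols('x y alpha')
f1, f2 = y, -x - y + x**2 + y**2            # the example vector field
zeta = sp.exp(alpha * x)

div = sp.diff(zeta * f1, x) + sp.diff(zeta * f2, y)
print(sp.factor(div))                        # equals exp(alpha*x)*((alpha + 2)*y - 1)
print(sp.simplify(div.subs(alpha, -2)))      # -exp(-2*x): one sign everywhere
```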
Lyapunov Function
Aleksandr Lyapunov (also spelled Liapunov)
V(\vec x) = V(x,y) \leftarrow a scalar function
V(\vec x) > 0 \forall \vec x\neq \vec x^*
V(\vec x^*) = 0
\dot V = \frac{dV}{dt} < 0 \ \forall \vec x \neq \vec x^*
V(\vec x) is a positive definite function.
Then the equilibrium point is stable ISL (in the sense of Lyapunov).
With the strict inequality \dot V < 0, the system asymptotically approaches the equilibrium point.
\frac{dV}{dt} = \frac{\partial V}{\partial x} \frac{dx}{dt} + \frac{\partial V}{\partial y} \frac{dy}{dt} = \dot x \frac{\partial V}{\partial x} + \dot y \frac{\partial V}{\partial y}
Example:
\dot x = y - x^3
\dot y = -x-y^3
V(x,y) = c_1 x^2 + c_2 y^2
\frac{dV}{dt} = 2 c_1 x \dot x + 2 c_2 y \dot y
= 2 c_1 x(y-x^3) + 2c_2 y(-x-y^3)
Assume c_1 = c_2 = c > 0
... \therefore \frac{dV}{dt} = -2c(x^4+y^4) < 0 \ \forall (x,y) \neq (0,0)
Therefore limit cycles are not possible.
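A short SymPy sketch (again, just a verification aid, not part of the original notes) reproducing the \dot V computation for this example:
```python
# Sketch: verify dV/dt for this example symbolically (c kept symbolic, c > 0).
import sympy as sp

x, y = sp.symbols('x y', real=True)
c = sp.symbols('c', positive=True)

xdot, ydot = y - x**3, -x - y**3             # the system
V = c * (x**2 + y**2)                        # Lyapunov candidate with c1 = c2 = c

Vdot = sp.diff(V, x) * xdot + sp.diff(V, y) * ydot
print(sp.expand(Vdot))                       # -2*c*x**4 - 2*c*y**4, negative away from the origin
```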
Index Method
This method is covered in the book; it is sometimes used to rule out limit cycles.
Poincaré-Bendixson Theorem.
Book!
Perturbation Methods
- Weakly nonlinear systems
Linear Resonator:
m \ddot x + b \dot x + kx = f
Weakly Nonlinear:
m \ddot x + b \dot x + kx + \alpha x^3 = f
With a bookkeeping term:
m \ddot x + b \dot x + kx + \epsilon \alpha x^3 = f
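A rough numerical illustration of the bookkeeping idea; the parameter values and the choice f = 0 (free vibration) are assumptions, not from the notes. The point is only that a small \epsilon makes the cubic term a small correction to the linear response.
```python
# Rough illustration (m, b, k, alpha values and f = 0 are assumptions):
# with a small epsilon, the cubic term only slightly perturbs the linear resonator.
import numpy as np
from scipy.integrate import solve_ivp

m, b, k, alpha = 1.0, 0.1, 1.0, 1.0

def resonator(t, s, eps):
    x, v = s
    return [v, (-b * v - k * x - eps * alpha * x**3) / m]  # free vibration, f = 0

t = np.linspace(0.0, 40.0, 2000)
lin = solve_ivp(resonator, (0, 40), [1.0, 0.0], args=(0.0,), t_eval=t)
wnl = solve_ivp(resonator, (0, 40), [1.0, 0.0], args=(0.1,), t_eval=t)
print(np.max(np.abs(wnl.y[0] - lin.y[0])))   # modest for eps = 0.1; shrinks with eps
```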
Asymptotic Expansion
x \neq x(t) \rightarrow x = x(t,\epsilon)
x(t,\epsilon) = x_0(t) + \epsilon x_1(t) + \epsilon^2 x_2(t) + ... + \text{H.O.T.s}
Looking for solutions of the form
x(t,\epsilon) \sim \sum_{k=0}^{\infty} x_k(t) \delta_k(\epsilon)
where the \delta_k(\epsilon) are asymptotic scaling (gauge) functions. This series sometimes doesn't converge, but it still gives useful information about the solution.
Example:
for x \geq 0
\dot x + x - \epsilon x^2 = 0, x(0) = 2
Develop a three-term approximation using an asymptotic expansion:
x(t,\epsilon) = x_0(t) + \epsilon x_1(t) + \epsilon^2 x_2(t) + ...
\dot x(t,\epsilon) = \dot x_0(t) + \epsilon \dot x_1(t) + \epsilon^2 \dot x_2(t) + ...
Substitute into the EOM and satisfy the initial conditions: x_0(0) = 2; x_1(0) = x_2(0) = 0
\dot x_0(t) + \epsilon \dot x_1(t) + \epsilon^2 \dot x_2(t) + x_0(t) + \epsilon x_1(t) + \epsilon^2 x_2(t) - \epsilon (x_0(t) + \epsilon x_1(t) + \epsilon^2 x_2(t))^2 = 0
Expanding that last squared term produces contributions at several orders of \epsilon; the ones at the orders we are tracking can't be dropped and must be included when collecting terms.
Now collect terms:
| Power | Expression |
|---|---|
| \epsilon^0 | \dot x_0 + x_0 = 0 \rightarrow x_0 = c_1 e^{-t} \rightarrow x_0 = 2 e^{-t} |
| \epsilon^1 | \dot x_1 + x_1 - x_0^2 = 0 \rightarrow \dot x_1 + x_1 - 4 e^{-2t} = 0 \rightarrow x_1 = 4(e^{-t} - e^{-2t}) |
| \epsilon^2 | \dot x_2 + x_2 - 2 x_0 x_1 = 0 \rightarrow \dot x_2 + x_2 - 2(2 e^{-t}) \cdot 4(e^{-t} - e^{-2t}) = 0 \rightarrow ... |
Then we have an approximate solution for small \epsilon. What "small" means depends on the problem...
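A small sketch comparing this expansion against a numerical solution. The x_2 term below is obtained by finishing the \epsilon^2 row above, and \epsilon = 0.05 is an arbitrary choice; both are assumptions layered on the notes.
```python
# Sketch: compare the three-term expansion with a numerical solution of
#   x' + x - eps*x^2 = 0,  x(0) = 2.
# x2 comes from solving the eps^2 equation with x2(0) = 0; eps = 0.05 is arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

eps = 0.05
t = np.linspace(0.0, 10.0, 500)

x0 = 2 * np.exp(-t)
x1 = 4 * (np.exp(-t) - np.exp(-2 * t))
x2 = 8 * (np.exp(-t) - 2 * np.exp(-2 * t) + np.exp(-3 * t))
approx = x0 + eps * x1 + eps**2 * x2

num = solve_ivp(lambda tt, xx: -xx + eps * xx**2, (0, 10), [2.0],
                t_eval=t, rtol=1e-9, atol=1e-12)
print(np.max(np.abs(approx - num.y[0])))     # error is O(eps^3), roughly 1e-4 here
```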