Monday, February 9, 2009
On the definition of asymptotic stability
One of you asked why the fact that all solutions with initial conditions in a neighbourhood of an equilibrium converge to the equilibrium as time goes to infinity (as in the definition of asymptotic stability) should imply Lyapunov stability. This is a good question, as it highlights an important point. Asymptotic stability of an equilibrium solution is a local concept that requires, first of all, the equilibrium to be Lyapunov stable (read the given definition of asymptotic stability carefully). Convergence alone does not imply this: it is possible that all solutions with initial conditions in a neighbourhood of an equilibrium converge to the equilibrium as time goes to infinity, while the equilibrium is nevertheless not Lyapunov stable. This happens when some solutions converge to the equilibrium only after first wandering far away from it.
As an example, consider the flow on the circle generated by the vector field dx/dt = 1 + sin(x) (with x in [0, 2pi)). This flow has a single equilibrium, x = 3pi/2 (that is, x = -pi/2 modulo 2pi), and every solution converges to it as time goes to infinity. However, a solution that starts very close to the equilibrium on one side must travel all the way around the circle to reach it from the other side, since all solutions approach the equilibrium from the same side. So the equilibrium attracts every solution, yet it is not Lyapunov stable: no matter how close to it you start on the "wrong" side, the solution first moves a distance of order the circumference away.
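This behaviour is easy to check numerically. The sketch below (a minimal forward-Euler integration of dx/dt = 1 + sin(x); the step size and time horizon are illustrative choices, not part of the original example) starts a solution just past the equilibrium at x = 3pi/2 and records how far from the equilibrium it strays along the way. The solution wanders almost to the antipodal point of the circle before slowly settling back toward the equilibrium.

```python
import math

def simulate(x0, dt=0.01, steps=200_000):
    """Integrate dx/dt = 1 + sin(x) on the circle with forward Euler.

    Returns the final position (mod 2*pi) and the maximum distance,
    measured along the circle, that the solution attains from the
    equilibrium x* = 3*pi/2.
    """
    xstar = 3 * math.pi / 2
    x = x0
    max_dist = 0.0
    for _ in range(steps):
        x += dt * (1.0 + math.sin(x))
        # Distance on the circle between x and x*, always in [0, pi].
        d = abs((x - xstar + math.pi) % (2 * math.pi) - math.pi)
        max_dist = max(max_dist, d)
    return x % (2 * math.pi), max_dist

# Start a distance 0.01 "past" the equilibrium: since 1 + sin(x) >= 0,
# the solution cannot turn back and must go all the way around.
final, max_dist = simulate(3 * math.pi / 2 + 0.01)
```

Running this, `max_dist` comes out close to pi (the solution passes through the antipodal point x = pi/2), while the final position ends up much closer to 3pi/2 than the initial condition was: every neighbourhood of the equilibrium contains initial conditions whose solutions leave any small fixed neighbourhood, which is exactly the failure of Lyapunov stability.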