Lyapunov Stability - Definition For Discrete-time Systems

The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below provides it in the alternative language commonly used in more mathematical texts.

Let (X, d) be a metric space and f : X → X a continuous function. A point x ∈ X is said to be Lyapunov stable if, for each ε > 0, there is a δ > 0 such that for all y ∈ X, if

d(x, y) < δ

then

d(f^n(x), f^n(y)) < ε

for all n ∈ ℕ.
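The definition can be checked numerically. As an illustration (a sketch with a hypothetical example map, not taken from the source), consider f(x) = −x on the real line: the fixed point x = 0 is Lyapunov stable, since |f^n(y) − f^n(0)| = |y| for every n, so choosing δ = ε satisfies the definition.

```python
# Numerical sketch of the Lyapunov stability definition for the
# example map f(x) = -x on the real line (hypothetical illustration).
# The fixed point x = 0 is Lyapunov stable: |f^n(y) - f^n(0)| = |y|,
# so delta = epsilon works in the definition.

def f(x):
    return -x

def stays_within(y, epsilon, steps=100):
    """Check d(f^n(0), f^n(y)) < epsilon for n = 0, 1, ..., steps."""
    x = 0.0
    for _ in range(steps + 1):
        if abs(x - y) >= epsilon:
            return False
        x, y = f(x), f(y)
    return True

epsilon = 0.1
delta = epsilon  # the candidate delta suggested by the argument above
# every y with d(0, y) < delta keeps its orbit within epsilon of the orbit of 0
print(all(stays_within(y, epsilon) for y in (-0.09, -0.05, 0.0, 0.05, 0.09)))
```

Note that this point is Lyapunov stable but not asymptotically stable: the orbit of y never moves closer to the orbit of 0, it merely stays at the same distance.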

We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e. if there is a δ > 0 such that

lim_{n → ∞} d(f^n(x), f^n(y)) = 0

whenever d(x, y) < δ.
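Asymptotic stability adds convergence of nearby orbits to plain stability. A minimal sketch (again a hypothetical example, not from the source): the contracting map f(x) = x/2 has an asymptotically stable fixed point at 0, since d(f^n(0), f^n(y)) = |y| / 2^n → 0 for every y.

```python
# Sketch of asymptotic stability for the contracting map f(x) = x/2
# (hypothetical illustration). The fixed point 0 is Lyapunov stable
# and, in addition, d(f^n(0), f^n(y)) -> 0 as n -> infinity.

def f(x):
    return 0.5 * x

def distance_after(y, n):
    """d(f^n(x), f^n(y)) for the fixed point x = 0 after n steps."""
    x = 0.0
    for _ in range(n):
        x, y = f(x), f(y)
    return abs(x - y)

y = 0.9                       # any point with d(0, y) < delta, e.g. delta = 1
print(distance_after(y, 0))   # 0.9
print(distance_after(y, 10))  # 0.9 / 2**10, already below 1e-3
```

Here any δ ≤ 1 works, so 0 lies in the interior of its stable set, matching the definition above.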
