Lyapunov Stability - Definition For Discrete-time Systems

The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below makes this precise, using the language of metric spaces common in more mathematical texts.

Let (X, d) be a metric space and f : X → X a continuous function. A point x ∈ X is said to be Lyapunov stable if, for each ε > 0, there is a δ > 0 such that for all y ∈ X, if

d(y, x) < δ

then

d(f^n(y), f^n(x)) < ε

for all n ∈ ℕ.

We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e. if there is a δ > 0 such that

lim_{n→∞} d(f^n(y), f^n(x)) = 0

whenever d(y, x) < δ.
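As a minimal numerical sketch (not a proof), the two conditions can be probed for a concrete map. Here we use the contraction f(x) = 0.5·x on the real line with the metric d(a, b) = |a − b|; the fixed point x = 0, the choice δ = ε, and the iteration horizon of 50 steps are all illustrative assumptions, not part of the definition above.

```python
def f(x):
    # A contraction mapping on the reals: distances shrink by 1/2 per step.
    return 0.5 * x

def orbit_distance(x, y, n, f):
    # Computes d(f^n(x), f^n(y)) for the metric d(a, b) = |a - b|.
    for _ in range(n):
        x, y = f(x), f(y)
    return abs(x - y)

x = 0.0              # candidate stable point (a fixed point of f)
eps = 1e-3
delta = eps          # for this contraction, delta = eps suffices
y = x + 0.5 * delta  # any y with d(y, x) < delta

# Lyapunov stability: d(f^n(y), f^n(x)) < eps for all n (checked up to 50).
assert all(orbit_distance(x, y, n, f) < eps for n in range(50))

# Asymptotic stability: the orbit distance also tends to 0.
assert orbit_distance(x, y, 50, f) < 1e-12
```

For an unstable point, such as the fixed point 0 of f(x) = 2·x, the first assertion fails for every δ > 0: nearby orbits are driven apart rather than kept within ε.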
