Lyapunov Stability - Definition For Discrete-time Systems

The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below states it in the alternative language commonly used in more mathematical texts.

Let (X, d) be a metric space and f : X → X a continuous function. A point x ∈ X is said to be Lyapunov stable if, for each ε > 0, there is a δ > 0 such that for all y ∈ X, if

d(x, y) < δ,

then

d(f^n(x), f^n(y)) < ε

for all n ∈ ℕ.

We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e. if there is a δ > 0 such that

lim_{n→∞} d(f^n(x), f^n(y)) = 0

whenever d(x, y) < δ.
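The definition can be illustrated numerically. A minimal sketch, assuming the example map f(x) = x/2 on the metric space (ℝ, |·|): its fixed point x = 0 is Lyapunov stable, since d(f^n(x), f^n(y)) = |y|/2^n never exceeds the initial distance, and asymptotically stable, since nearby orbits converge to it. The helper name `orbit_distance` is hypothetical, chosen here only for illustration.

```python
# Example contraction map (an assumed illustration, not from the source text):
# f(x) = 0.5 * x on (R, |.|); the fixed point x = 0 is Lyapunov stable.
def f(x):
    return 0.5 * x

def orbit_distance(x, y, steps):
    """Return max over n <= steps of d(f^n(x), f^n(y)) for the map f."""
    worst = abs(x - y)
    for _ in range(steps):
        x, y = f(x), f(y)
        worst = max(worst, abs(x - y))
    return worst

# Lyapunov stability at x = 0: for epsilon = 1e-3, the choice delta = epsilon
# works, because iterates of f only shrink the distance between orbits.
eps = 1e-3
delta = eps
assert orbit_distance(0.0, 0.9 * delta, 100) < eps

# Asymptotic stability: orbits of nearby points converge to the orbit of 0.
x, y = 0.0, 0.5
for _ in range(200):
    x, y = f(x), f(y)
assert abs(x - y) < 1e-12
```

For an unstable example, replacing f with x ↦ 2x makes the same check fail: no δ keeps the orbit distance below ε, since the gap doubles at every step.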
