Lyapunov Stability - Definition For Discrete-time Systems

The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below makes this precise, using the alternative language common in more mathematical texts.

Let (X, d) be a metric space and f : X → X a continuous function. A point x ∈ X is said to be Lyapunov stable if, for each ε > 0, there is a δ > 0 such that for all y ∈ X, if

    d(x, y) < δ,

then

    d(fⁿ(x), fⁿ(y)) < ε

for all n ∈ ℕ.

We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e. if there is a δ > 0 such that

    limₙ→∞ d(fⁿ(x), fⁿ(y)) = 0

whenever d(x, y) < δ.
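As an illustration (not part of the original text), the ε–δ condition can be checked empirically for a simple assumed example: the contraction f(x) = x/2 on the real line, whose fixed point x = 0 is both Lyapunov and asymptotically stable, since |fⁿ(x) − fⁿ(y)| = |x − y| / 2ⁿ. The sketch below tests the stability condition numerically for one choice of ε and δ; it is a demonstration, not a proof.

```python
def f(x):
    """Assumed example map f : R -> R, f(x) = x/2, a contraction toward 0."""
    return x / 2.0

def orbit_stays_close(x, y, eps, n_steps=100):
    """Return True if d(f^n(x), f^n(y)) < eps for n = 0, 1, ..., n_steps."""
    for _ in range(n_steps + 1):
        if abs(x - y) >= eps:
            return False
        x, y = f(x), f(y)
    return True

# Fixed point of interest and tolerance epsilon.
x_star, eps = 0.0, 0.1
# For this contraction, delta = eps suffices: distances only shrink under f.
delta = eps
# Sample initial points y with d(x_star, y) < delta.
samples = [x_star + delta * k / 10.0 for k in range(-9, 10)]
print(all(orbit_stays_close(x_star, y, eps) for y in samples))  # prints True
```

The same loop, run with |x − y| → 0 as n grows (which holds here since distances halve each step), also illustrates the asymptotic-stability condition limₙ→∞ d(fⁿ(x), fⁿ(y)) = 0.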
