Lyapunov Stability - Definition For Discrete-time Systems

The definition for discrete-time systems is almost identical to that for continuous-time systems. The definition below states it in the language of metric spaces, as is common in more mathematical texts.

Let $(X, d)$ be a metric space and $f\colon X \to X$ a continuous function. A point $x \in X$ is said to be Lyapunov stable if, for each $\epsilon > 0$, there is a $\delta > 0$ such that for all $y \in X$, if

$d(x, y) < \delta$,

then

$d(f^n(x), f^n(y)) < \epsilon$

for all $n \in \mathbb{N}$.

We say that $x$ is asymptotically stable if it belongs to the interior of its stable set, i.e. if there is a $\delta > 0$ such that

$\lim_{n \to \infty} d(f^n(x), f^n(y)) = 0$

whenever $d(x, y) < \delta$.
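The definition can be checked numerically for a concrete map. The sketch below uses the contraction $f(x) = x/2$ on the real line with the usual metric $d(x, y) = |x - y|$; the choice of map and the tolerances are illustrative assumptions, not taken from the text. Its fixed point $x = 0$ is Lyapunov stable (one may take $\delta = \epsilon$) and asymptotically stable, since iterates of nearby points converge to the iterates of $0$.

```python
def orbit_distance(f, x, y, n):
    """Distance |f^n(x) - f^n(y)| between the n-th iterates of x and y."""
    for _ in range(n):
        x, y = f(x), f(y)
    return abs(x - y)

# Illustrative contraction map (an assumption for this sketch):
# f(x) = x/2 halves distances, so the fixed point 0 is Lyapunov stable.
f = lambda x: x / 2

eps, delta = 1e-3, 1e-3
y = 0.0005  # a point within delta of the fixed point x = 0

# Lyapunov stability: every iterate of y stays within eps of the
# corresponding iterate of the fixed point.
assert all(orbit_distance(f, 0.0, y, n) < eps for n in range(50))

# Asymptotic stability: the distance between iterates tends to 0.
assert orbit_distance(f, 0.0, y, 200) < 1e-12
```

A finite check like this cannot prove stability (the definition quantifies over all $n$ and all $y$ within $\delta$), but it is a useful sanity test for a candidate fixed point.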
