Definition for Discrete-Time Systems
The definition for discrete-time systems is almost identical to the one for continuous-time systems. The definition below states it in the language commonly used in more mathematical texts.
Let (X, d) be a metric space and f : X → X a continuous function. A point x ∈ X is said to be Lyapunov stable if, for each ε > 0, there is a δ > 0 such that for all y ∈ X, if

d(x, y) < δ,

then

d(f^n(x), f^n(y)) < ε

for all n ∈ ℕ.
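As a numerical illustration (not part of the original text, all names and constants here are illustrative choices), consider a rigid rotation of the circle. Because rotation preserves the arc-length metric, two orbits that start within δ of each other stay exactly that far apart forever, so the Lyapunov condition holds with δ = ε:

```python
# Sketch: the rigid rotation f(x) = x + alpha (mod 1) is an isometry for the
# arc-length metric d on the circle, so every point is Lyapunov stable with
# delta = epsilon. The angle alpha and the starting points are arbitrary.

def f(x, alpha=0.318):
    """One step of a rigid circle rotation (circle represented as [0, 1))."""
    return (x + alpha) % 1.0

def d(x, y):
    """Arc-length metric on the circle [0, 1)."""
    a = abs(x - y) % 1.0
    return min(a, 1.0 - a)

eps = 1e-2                       # the challenge epsilon
delta = eps                      # suffices here because f preserves d
x, y = 0.2, 0.2 + 0.5 * delta    # two starts within delta of each other

stays_close = True
for _ in range(1000):
    x, y = f(x), f(y)
    if d(x, y) >= eps:
        stays_close = False
        break
print(stays_close)  # True: the orbits never drift apart
```

Note that this point is Lyapunov stable but not asymptotically stable: the orbits stay close without ever converging to each other.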
We say that x is asymptotically stable if it belongs to the interior of its stable set, i.e., if there is a δ > 0 such that

lim_{n→∞} d(f^n(x), f^n(y)) = 0

whenever d(x, y) < δ.
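A minimal sketch of asymptotic stability (again illustrative, not from the original text): for the contraction f(x) = x/2 on the real line, the fixed point x* = 0 is asymptotically stable, since the distance to x* halves at every step and therefore tends to 0 for any start within δ = 1:

```python
# Sketch: f(x) = x / 2 contracts toward the fixed point x* = 0, so nearby
# orbits both stay close (Lyapunov stability) and converge to x*
# (asymptotic stability). The start value 0.8 is an arbitrary choice.

def f(x):
    return 0.5 * x  # distance to the fixed point halves each step

x_star = 0.0   # fixed point: f(0) = 0
y = 0.8        # a start within delta = 1 of the fixed point

gaps = []
for _ in range(60):
    y = f(y)
    gaps.append(abs(y - x_star))

# gaps decreases monotonically toward 0, e.g. gaps[0] == 0.4
print(gaps[0], gaps[-1])
```

Here the stable set of x* contains the whole interval (-1, 1), so x* lies in its interior, matching the definition above.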