Dynamical System (definition) - General Definition

In the most general sense, a dynamical system is a tuple (T, M, Φ) where T is a monoid, written additively, M is a set and Φ is a function

    Φ : U ⊆ T × M → M,    I(x) = {t ∈ T : (t, x) ∈ U},

with

    Φ(0, x) = x,
    Φ(t₂, Φ(t₁, x)) = Φ(t₁ + t₂, x)

for t₁, t₂, t₁ + t₂ ∈ I(x).
The function Φ(t, x) is called the evolution function of the dynamical system: it assigns to every point x in the set M a unique image that depends on the variable t, called the evolution parameter. M is called the phase space or state space, while the variable x represents an initial state of the system.
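As a concrete illustration (not part of the definition itself), the ODE dx/dt = x on M = ℝ with T = ℝ has the evolution function Φ(t, x) = x·eᵗ. A minimal Python sketch checks the two defining identities, Φ(0, x) = x and Φ(t₂, Φ(t₁, x)) = Φ(t₁ + t₂, x):

```python
import math

# An illustrative evolution function for the ODE dx/dt = x on M = R,
# with T = R (a monoid under addition): Phi(t, x) = x * e^t.
def phi(t: float, x: float) -> float:
    return x * math.exp(t)

x0 = 2.0
t1, t2 = 0.5, 1.25

# Identity at t = 0: Phi(0, x) = x
assert math.isclose(phi(0.0, x0), x0)

# Composition over the monoid: Phi(t2, Phi(t1, x)) = Phi(t1 + t2, x)
assert math.isclose(phi(t2, phi(t1, x0)), phi(t1 + t2, x0))
```

Any other flow satisfying these two identities would serve equally well; the exponential is chosen only because its closed form makes the check transparent.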

We often write

    Φₜ(x) := Φ(t, x)

if we take one of the variables as constant. The map

    Φₓ : I(x) → M,    t ↦ Φ(t, x)

is called the flow through x and its graph the trajectory through x. The set

    γₓ := {Φ(t, x) : t ∈ I(x)}

is called the orbit through x.
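For a discrete-time example, take T = ℕ under addition and let Φ(t, x) be the t-fold iterate of a map f on a finite set M; the orbit through x is then simply the sequence of iterates. A small sketch, using the illustrative map f(x) = 3x mod 7:

```python
# Orbit of a discrete-time system: T = N (non-negative integers under
# addition), M = {0, ..., 6}, and Phi(t, x) is the t-fold iterate of
# the map f(x) = (3 * x) % 7.  (Illustrative map; any self-map works.)
def f(x: int) -> int:
    return (3 * x) % 7

def orbit(x: int, steps: int) -> list[int]:
    """Return [Phi(0, x), Phi(1, x), ..., Phi(steps, x)]."""
    points = [x]
    for _ in range(steps):
        points.append(f(points[-1]))
    return points

print(orbit(1, 6))  # [1, 3, 2, 6, 4, 5, 1] -- returns to 1: a periodic orbit
```

Here the orbit through 1 is the finite set {1, 3, 2, 6, 4, 5}, visited cyclically.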

A subset S of the state space M is called Φ-invariant if for all x in S and all t in T

    Φ(t, x) ∈ S.
In particular, for S to be Φ-invariant, we require that I(x) = T for all x in S. That is, the flow through x should be defined for all time for every element of S.
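For a discrete-time system with T = ℕ, checking Φ-invariance of a set S reduces to checking that the one-step map sends S into S, since induction then gives Φ(t, x) ∈ S for all t. A small sketch, again using the illustrative map f(x) = 3x mod 7 on M = {0, …, 6}:

```python
# Phi-invariance for a discrete-time system on a finite state space.
# With T = N and Phi(t, x) the t-fold iterate of f, a set S is
# Phi-invariant iff f maps S into S (induction extends this to all t).
def f(x: int) -> int:
    return (3 * x) % 7

def is_invariant(S: set[int]) -> bool:
    return all(f(x) in S for x in S)

print(is_invariant({0}))                 # True: f(0) = 0
print(is_invariant({1, 3, 2, 6, 4, 5}))  # True: the periodic orbit of 1
print(is_invariant({1, 2}))              # False: f(1) = 3 lies outside S
```

Note that for T = ℕ the domain condition I(x) = T is automatic, since an iterated self-map is defined for every t; the condition only bites when the flow can blow up or leave its domain in finite time.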
