Dynamical System (definition) - General Definition

In the most general sense, a dynamical system is a tuple (T, M, Φ), where T is a monoid, written additively, M is a set, and Φ is a function

    Φ : U ⊆ T × M → M

with

    I(x) := {t ∈ T : (t, x) ∈ U},
    Φ(0, x) = x,
    Φ(t₂, Φ(t₁, x)) = Φ(t₁ + t₂, x)

for t₁ ∈ I(x), t₂ ∈ I(Φ(t₁, x)) and t₂ + t₁ ∈ I(x).

The function Φ(t, x) is called the evolution function of the dynamical system: it associates to every point x in the set M a unique image, depending on the variable t, called the evolution parameter. M is called the phase space or state space, while the variable x represents an initial state of the system.
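As a concrete illustration, here is a minimal sketch of a discrete-time dynamical system, taking T to be the natural numbers under addition, M the real numbers, and the evolution function obtained by iterating a map f : M → M (the particular map f(x) = x/2 is a hypothetical example, not part of the definition above). The assertions check the two monoid-action axioms.

```python
# Sketch: a discrete-time dynamical system with T = naturals, M = reals.
# Phi(t, x) is defined by applying a chosen map f to x exactly t times.

def make_evolution(f):
    """Return Phi(t, x): iterate f a total of t times starting from x."""
    def phi(t, x):
        for _ in range(t):
            x = f(x)
        return x
    return phi

# Hypothetical example map f(x) = x / 2.
phi = make_evolution(lambda x: x / 2)

# The two defining properties of the evolution function:
assert phi(0, 3.0) == 3.0                      # Phi(0, x) = x
assert phi(2, phi(3, 3.0)) == phi(3 + 2, 3.0)  # Phi(t2, Phi(t1, x)) = Phi(t1 + t2, x)
```

Here U = T × M, so I(x) = T for every x; for systems defined only locally in time, I(x) would be a proper subset of T.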

We often write

    Φₓ(t) := Φ(t, x),
    Φᵗ(x) := Φ(t, x)

if we take one of the variables as constant. The function

    Φₓ : I(x) → M

is called the flow through x, and its graph is called the trajectory through x. The set

    γₓ := {Φ(t, x) : t ∈ I(x)}

is called the orbit through x.
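The orbit can be computed directly when the state space is finite, since iteration must eventually revisit a state. The following sketch assumes T = naturals and the hypothetical example map x ↦ 2x mod 12 on M = {0, …, 11}.

```python
# Sketch: the orbit through x for a finite-state, discrete-time system.
# Since M is finite, the set {Phi(t, x) : t in T} is exactly the set of
# states visited before the first repetition.

def orbit(step, x):
    """Return the orbit through x: all states reachable by iterating step."""
    seen = set()
    while x not in seen:
        seen.add(x)
        x = step(x)
    return seen

# Hypothetical example: doubling modulo 12, starting from x = 1.
print(sorted(orbit(lambda x: (2 * x) % 12, 1)))  # -> [1, 2, 4, 8]
```

The iterates are 1 → 2 → 4 → 8 → 4, so the orbit through 1 is {1, 2, 4, 8}.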

A subset S of the state space M is called Φ-invariant if, for all x in S and all t in T,

    Φ(t, x) ∈ S.

In particular, for S to be Φ-invariant we require that I(x) = T for all x in S; that is, the flow through x must be defined for all time for every element of S.
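In the discrete-time setting sketched above, Φ-invariance reduces to a one-step check: if a single application of the map keeps every point of S inside S, then by induction so does every iterate. The map and the sets below are hypothetical examples.

```python
# Sketch: checking Phi-invariance of a subset S for a discrete-time
# system generated by iterating `step`. One step suffices: if step(x)
# is in S for every x in S, then Phi(t, x) is in S for all t >= 0.

def is_invariant(step, S):
    """Return True if S is Phi-invariant under iteration of step."""
    return all(step(x) in S for x in S)

step = lambda x: (2 * x) % 12           # hypothetical example map

print(is_invariant(step, {0, 4, 8}))    # 0->0, 4->8, 8->4: True
print(is_invariant(step, {1, 2}))       # 2 -> 4 escapes S:  False
```

Note that on T = naturals every I(x) equals T automatically, so the "defined for all time" condition is vacuous here; it matters for systems that can only be evolved locally in t.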

