Dynamical System (definition) - General Definition

In the most general sense, a dynamical system is a tuple (T, M, Φ) where T is a monoid, written additively, M is a set and Φ is a function

    Φ : U ⊆ T × M → M

with

    Φ(0, x) = x
    Φ(t₂, Φ(t₁, x)) = Φ(t₁ + t₂, x)

for

    t₁, t₂, t₁ + t₂ ∈ I(x), where I(x) := { t ∈ T : (t, x) ∈ U }.
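As a worked illustration (an example added here, not part of the original definition), take T = ℝ, M = ℝ and Φ(t, x) = exp(a t) x for a fixed constant a; this is the flow of the differential equation ẋ = a x. Then

    Φ(0, x) = exp(0) x = x
    Φ(t₂, Φ(t₁, x)) = exp(a t₂) exp(a t₁) x = exp(a (t₁ + t₂)) x = Φ(t₁ + t₂, x),

so both conditions hold, with U = T × M and I(x) = T for every x.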
The function Φ(t, x) is called the evolution function of the dynamical system: it associates to every point x in the set M a unique image, depending on the variable t, called the evolution parameter. M is called the phase space or state space, while the variable x represents an initial state of the system.
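To make this concrete in code, here is a minimal Python sketch (the name evolve, the map f, and the choices T = ℕ and M = ℝ with Φ given by iterating f are illustrative assumptions, not anything from the original text):

    # A discrete dynamical system: T = (ℕ, +), M = ℝ, and Φ(t, x)
    # applies a map f to the state x, t times.
    def evolve(t: int, x: float, f=lambda x: 0.5 * x + 1.0) -> float:
        """Evolution function Φ(t, x)."""
        for _ in range(t):
            x = f(x)
        return x

    # The monoid-action axioms from the definition above:
    x0 = 3.0
    assert evolve(0, x0) == x0                        # Φ(0, x) = x
    assert evolve(2, evolve(3, x0)) == evolve(5, x0)  # Φ(t₂, Φ(t₁, x)) = Φ(t₁ + t₂, x)

Iterating f realizes the monoid action of (ℕ, +) by composition of maps.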

We often write

    Φₓ(t) ≡ Φ(t, x)   or   Φᵗ(x) ≡ Φ(t, x)

if we take one of the variables as constant. The function

    Φₓ : I(x) → M

is called the flow through x and its graph the trajectory through x. The set

    γₓ := { Φ(t, x) : t ∈ I(x) }

is called the orbit through x.
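Continuing the illustrative Python sketch above, a finite piece of the orbit through x can be computed directly (the name orbit and the truncation to finitely many steps are assumptions made here for illustration):

    def orbit(x: float, steps: int, f=lambda x: 0.5 * x + 1.0) -> list[float]:
        """Return [Φ(0, x), Φ(1, x), ..., Φ(steps, x)], a finite piece of the orbit through x."""
        points = [x]
        for _ in range(steps):
            x = f(x)
            points.append(x)
        return points

    print(orbit(3.0, 5))  # [3.0, 2.5, 2.25, 2.125, 2.0625, 2.03125]; the values approach the fixed point x = 2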

A subset S of the state space M is called Φ-invariant if for all x in S and all t in T

    Φ(t, x) ∈ S.
In particular, for S to be Φ-invariant, we require that I(x) = T for all x in S. That is, the flow through x should be defined for all time for every element of S.
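As a rough numerical illustration (a brute-force sampling check on the discrete system sketched earlier, not a proof; the names below are hypothetical), Φ-invariance of a set S can be probed like this:

    def looks_invariant(in_S, samples, f=lambda x: 0.5 * x + 1.0, horizon: int = 100) -> bool:
        """Check Φ(t, x) ∈ S for each sampled x ∈ S and t = 0, ..., horizon."""
        for x in samples:
            for _ in range(horizon + 1):
                if not in_S(x):
                    return False
                x = f(x)
        return True

    # The interval S = [2, 4] is invariant under f(x) = 0.5x + 1,
    # since f maps [2, 4] onto [2, 3] ⊆ [2, 4].
    print(looks_invariant(lambda x: 2.0 <= x <= 4.0, [2.0, 3.0, 4.0]))  # True

Note that with T = ℕ the requirement I(x) = T is automatic here, since the map f can always be applied once more.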
