Mathematics - Notation, Language, and Rigor

Most of the mathematical notation in use today was not invented until the 16th century. Before that, mathematics was written out in words, a painstaking process that limited mathematical discovery. Euler (1707–1783) was responsible for many of the notations in use today. Modern notation makes mathematics much easier for the professional, but beginners often find it daunting. It is extremely compressed: a few symbols contain a great deal of information. Like musical notation, modern mathematical notation has a strict syntax (which to a limited extent varies from author to author and from discipline to discipline) and encodes information that would be difficult to write in any other way.

Mathematical language can be difficult for beginners to understand. Words such as "or" and "only" have more precise meanings than in everyday speech. Moreover, words such as "open" and "field" have been given specialized mathematical meanings, and technical terms such as "homeomorphism" and "integrable" have precise definitions of their own. In addition, shorthand phrases such as "iff" for "if and only if" belong to mathematical jargon. There is a reason for this special notation and technical vocabulary: mathematics requires more precision than everyday speech. Mathematicians refer to this precision of language and logic as "rigor".

Mathematical proof is fundamentally a matter of rigor. Mathematicians want their theorems to follow from axioms by means of systematic reasoning, in order to avoid mistaken "theorems" based on fallible intuitions, of which many instances have occurred in the history of the subject. The level of rigor expected in mathematics has varied over time: the Greeks expected detailed arguments, but by the time of Isaac Newton the methods employed were less rigorous. Problems inherent in the definitions used by Newton led to a resurgence of careful analysis and formal proof in the 19th century. Misunderstanding this rigor is a cause of some common misconceptions about mathematics. Today, mathematicians continue to debate the status of computer-assisted proofs: since large computations are hard to verify, such proofs may not be considered sufficiently rigorous.
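The idea of machine-checked rigor mentioned above can be illustrated with a small sketch in a proof assistant such as Lean 4, where every step of a proof must be justified from existing lemmas; the theorem name here is chosen for illustration only:

```lean
-- A machine-checked proof: the claim that addition of natural
-- numbers is commutative is justified by an existing library
-- lemma, and the checker verifies every step mechanically.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

In a system like this, a proof that does not follow from the axioms and prior results simply fails to compile, which is one sense in which computer-assisted proofs can be more, rather than less, rigorous than informal ones.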

Axioms in traditional thought were "self-evident truths", but that conception is problematic. At a formal level, an axiom is just a string of symbols, which has an intrinsic meaning only in the context of all the derivable formulas of an axiomatic system. It was the goal of Hilbert's program to put all of mathematics on a firm axiomatic basis, but according to Gödel's incompleteness theorem every sufficiently powerful axiomatic system has undecidable formulas, and so a final axiomatization of mathematics is impossible. Nonetheless, mathematics is often imagined to be (as far as its formal content goes) nothing but set theory in some axiomatization, in the sense that every mathematical statement or proof could be cast into formulas within set theory.
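The point that an axiom is, formally, just a string of symbols can be made concrete with an example from the set theory mentioned above. The axiom of extensionality of Zermelo–Fraenkel set theory, which says that two sets with the same members are equal, is the first-order formula:

```latex
% Axiom of extensionality (ZF): sets with the same members are equal.
\forall x \, \forall y \,
  \bigl( \forall z \, ( z \in x \leftrightarrow z \in y )
  \rightarrow x = y \bigr)
```

Read purely formally, this is a finite sequence of quantifiers, variables, and connectives; its familiar meaning emerges only from its role among the formulas derivable in the system.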
