Western United States

The Western United States, commonly referred to as the American West or simply "the West," traditionally refers to the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier. As the frontier moved farther west, the Mississippi River came to be regarded as the easternmost possible boundary of the West.

The West mostly comprises arid to semi-arid plateaus and plains and forested mountains.

In the 21st century, the states stretching from the Rocky Mountains and the Great Basin to the West Coast are generally considered to comprise the American West.

