Western United States

The Western United States, commonly referred to as the American West or simply "the West," traditionally denotes the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier. Since then, the frontier has moved farther west, and the Mississippi River is now cited as the easternmost possible boundary of the West.

The West mostly comprises arid to semi-arid plateaus and plains, along with forested mountain ranges.

In the 21st century, the states stretching from the Rocky Mountains and the Great Basin to the West Coast are generally considered to comprise the American West.
