The Western United States, commonly referred to as the American West or simply "the West," traditionally refers to the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier. Since then, the frontier has moved farther west, and the Mississippi River has come to be referenced as the easternmost possible boundary of the West.
The West mostly comprises arid to semi-arid plateaus and plains, as well as forested mountains.
In the 21st century, the states stretching from the Rocky Mountains and the Great Basin to the West Coast are generally considered to comprise the American West.
Famous quotes containing the words united states, western, united and/or states:
“In the United States there is more space where nobody is than where anybody is.”
—Gertrude Stein (1874–1946)
“I wouldn’t say when you’ve seen one Western you’ve seen the lot; but when you’ve seen the lot you get the feeling you’ve seen one.”
—Katharine Whitehorn (b. 1926)
“The genius of any slave system is found in the dynamics which isolate slaves from each other, obscure the reality of a common condition, and make united rebellion against the oppressor inconceivable.”
—Andrea Dworkin (b. 1946)
“By intervening in the Vietnamese struggle the United States was attempting to fit its global strategies into a world of hillocks and hamlets, to reduce its majestic concerns for the containment of communism and the security of the Free World to a dimension where governments rose and fell as a result of arguments between two colonels’ wives.”
—Frances Fitzgerald (b. 1940)