West

West is a noun, adjective, or adverb indicating a compass direction or a geographic region.

Famous quotes containing the word west:

    Right now I think censorship is necessary; the things they’re doing and saying in films right now just shouldn’t be allowed. There’s no dignity anymore and I think that’s very important.
    —Mae West (1892–1980)

    We in the West do not refrain from childbirth because we are concerned about the population explosion or because we feel we cannot afford children, but because we do not like children.
    —Germaine Greer (b. 1939)

    It is queer how it is always one’s virtues and not one’s vices that precipitate one into disaster.
    —Rebecca West (1892–1983)