American Imperialism

American imperialism refers to the economic, military, and cultural influence of the United States over other countries. The concept of an American Empire was first popularized during the presidency of James K. Polk, who led the United States into the Mexican–American War of 1846, which resulted in the annexation of territories such as California; the Gadsden Purchase followed in 1853.
