AP US Government
American imperialism refers to the policy and practice by which the United States extended its influence and control over foreign territories and peoples, especially during the late 19th and early 20th centuries. This era marked a significant shift: the U.S. moved from a nation focused on continental expansion to one that sought overseas possessions, driven by economic interests, military strategy, and a belief in cultural superiority.