AP US History
New Territories refers to the lands the United States acquired over the course of the 19th century, such as the Louisiana Purchase and the Mexican Cession, which significantly shaped the nation's expansion and identity. These acquisitions fueled the ideology of Manifest Destiny and transformed the country's social and economic landscape as settlers moved westward, drawn by the promise of land and opportunity.