AP US History
The American West refers to the vast region of the western United States characterized by its diverse geography, including mountains, plains, deserts, and forests. This area became a focal point during the 19th century as the nation pushed westward, driven by the belief in Manifest Destiny, which held that American expansion across the continent was both justified and inevitable.