AP US History
The term 'West' refers to the geographical and cultural region of the United States encompassing the vast territories beyond the Mississippi River, characterized by diverse landscapes of mountains, plains, and deserts. It represents not only a physical space but also a symbolic frontier of opportunity, expansion, and conflict that shaped American identity and policy across different historical periods.