Hawaiian Studies
Manifest destiny was the 19th-century belief that the United States had a divine right and duty to expand its territory across North America. This ideology fueled the country's westward expansion and was used to justify actions taken against Native American populations, as well as conflicts with foreign nations. The concept was closely tied to American nationalism and the notion of progress.