AP US History
Post-World War I refers to the period following the conclusion of World War I in 1918, characterized by significant political, social, and economic changes worldwide. This era saw the emergence of new nation-states, the reshaping of international relations through treaties such as the Treaty of Versailles, and the rise of ideologies such as fascism and communism that would later fuel global conflicts. The war's aftermath also produced widespread disillusionment and a desire for change that deeply affected societies.