Modernism to Postmodernism Theatre
Post-war America refers to the period following World War II, characterized by significant social, economic, and cultural changes in the United States. This era saw the rise of consumerism, the expansion of the middle class, and a quest for identity that influenced various art forms, including theatre. The tensions between traditional values and modern ideas during this time shaped the narratives found in many influential plays.