Feminist Art History
Post-war America refers to the period following World War II, specifically the late 1940s through the 1960s, characterized by significant social, economic, and political change. The era saw a booming economy, the rise of consumer culture, and shifts in gender roles and family structures, all of which shaped the cultural movements of the time, including art. The New York School emerged during this period, and its male-dominated milieu exemplifies the gender dynamics in the art world that feminist art historians examine.