History of Art Criticism
Post-war America refers to the period in the United States following World War II, roughly from 1945 to the early 1960s, marked by significant social, economic, and cultural change. The era saw a booming economy, a rise in consumerism, and the emergence of new art movements, including Minimalism, which pared artistic expression down to simple, reduced forms.