American Cinema – Before 1960
Post-World War I refers to the period following the end of World War I in 1918, marked by sweeping social, political, and cultural change worldwide. The era gave rise to new artistic movements, most notably German Expressionism, which profoundly influenced many art forms, including cinema, and fostered styles that explored psychological depth and social issues.