American Cinema – Before 1960
Feminist film theory is a critical framework that examines how women are represented in film and how films perpetuate or challenge gender roles and stereotypes. It seeks to understand the ways in which cinema both reflects and shapes societal attitudes toward gender, sexuality, and power. The theory frequently critiques the male gaze and calls for diverse female perspectives in storytelling.