Intro to Women's Studies
Gender roles are societal expectations and norms that dictate how individuals should behave, think, and feel based on their perceived gender. These roles can shape various aspects of life, influencing everything from personal relationships to professional opportunities, and they often reinforce stereotypes that define masculinity and femininity.