AP US History
Women's roles in society refer to the behaviors, duties, and positions that women were expected to hold within various social structures throughout history. During periods of significant change such as the Market Revolution, these roles evolved as the nature of work and family life transformed, reshaping how women participated both in the home and in emerging economic activities.