AP US History
American labor refers to the workforce of the United States and the dynamics, movements, and rights associated with its workers throughout history. The term encompasses changes in working conditions, the rise of labor unions, and the broader social and economic transformations that shaped the American workplace. As industries expanded and the economy evolved, American labor became central to debates over worker rights, social justice, and economic policy.