History of women in the U.S.

The history of women in the U.S. is marked by both struggle and progress. Initially relegated to domestic roles in a patriarchal society, women began organizing for their rights in the 19th century, and the movement for suffrage steadily gained momentum. The 19th Amendment, ratified in 1920, granted women the right to vote. Throughout the 20th century, women fought for equal opportunities in education and the workplace, particularly during and after World War II. The feminist movements of the 1960s and 1970s further advanced gender equality, leading to legal protections against discrimination. Today, women continue to confront challenges related to equality and representation.