What was the impact of World War I on women’s rights in the U.S.?
World War I had a significant impact on women’s rights in the U.S. because the war created new opportunities for women in the workforce. With large numbers of men serving in the military, women stepped into roles traditionally held by men in factories, offices, and other sectors, expanding their economic independence and societal contributions. This shift helped challenge prevailing gender norms and perceptions of women’s capabilities. After the war, the struggle for women’s rights gained momentum, culminating in the ratification of the 19th Amendment in 1920, which granted women the right to vote. Women’s active participation during the war laid the groundwork for later advances in gender equality.