When did American women start wearing pants?

Answer: American women began wearing pants during World War I while working in munitions factories. In the 1920s, it became more common for women to wear pants while playing sports such as golf and tennis.

Top Q&A For: When did American women start wearing pants?

When did women start wearing pants?

For centuries, wearing pants was associated with masculinity. An assertive woman is still said to "wear the pants in the family." Beginning in the late 19th century and continuing throughout the 20th century...

When did women start wearing dresses?

According to Science News for Kids, anthropologists estimate from lice DNA that women started wearing dresses about 190,000 years ago. In fact, everyone first wore "dresses," not just women...

When did women start wearing work boots?

While women have been in the workforce throughout history, it was during U.S. involvement in World War II, from 1941 to 1945, that they began entering manual labor jobs in large numbers...

What year did women begin wearing pants?

The first American women to make history for wearing pants were members of the women's suffrage movement in the 1800s. As early as 1825, suffragists Amelia Jenks Bloomer, Elizabeth Smith Miller and...