When did women start working?
Women have always worked; they just haven't always been paid for their work. In earlier times, anything they earned legally belonged to their father or husband. Paid work became far more common for women during World War II, when a shortage of male labor drew them into the workforce.