History
When did women start working?
Women have always worked. They just haven't always been paid for their work. In earlier times, anything they earned belonged to their father or husband. It became more common for women to work during World War II, when there was a shortage of male labor.