Miscellaneous
What does the west mean?
Answer
The word west generally refers to the direction in which the sun sets, as opposed to east, where the sun rises. It is one of the four cardinal compass points and can refer to the western side of a continent or country. In a geopolitical sense, it can also refer to the non-communist countries of Europe and the Americas.