Geography
Where is the west coast?
Answer
In the United States, the term "West Coast" refers to the states of Washington, Oregon, and California, all of which have western coastlines on the Pacific Ocean.