Geography

Is California in the West?


1 Answer

Yes. California takes up a large portion of the West Coast of the United States.

