Are the Bahamas a territory of the U.S.?
No, the Bahamas is not, and has never been, a territory of the United States. It is an independent country and a member of the Commonwealth, having gained independence from Britain in 1973. Before that it was part of the British Empire, and earlier still home to Lucayan communities.