Do bananas grow in the United States?
Yes. Bananas are grown commercially in the warm, tropical parts of the US, namely Hawaii and Florida. People also grow bananas in other warm but non-tropical areas such as California, Louisiana, Arizona, and Texas.