Do bananas grow in the United States?
Yes. Bananas are grown commercially in the warm, tropical parts of the US, namely Hawaii and Florida. People also grow bananas in other warm, non-tropical areas such as California, Louisiana, Arizona, and Texas.
Bananas need a tropical or warm subtropical climate, plenty of water, and very rich soil to thrive. Hawaii is the leading banana-growing state, while Florida has some small local growers and research operations.