Who are the South?
In the United States, "the South" refers to the southeastern states. This usually includes Alabama, Arkansas, Florida, Georgia, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, and Virginia. The grouping is a tradition that goes back to the U.S. Civil War in the 1860s, when these states formed the Confederacy.