Who were the first people to set foot in America?
The first people to set foot in what is now known as America were early humans, often called Paleo-Indians. Most evidence suggests they migrated from Asia across the Bering land bridge in the Arctic region, though some may have arrived along other routes, such as the Pacific coast. Scientists don't know for certain exactly when or how the first migrations occurred.