History
Has the United States ever lost a war?
No. The United States has never officially lost a war, though the outcomes of some conflicts, such as the Vietnam War, are debated.