Has France ever won a war?
Yes. In the twentieth century alone, France was on the winning Allied side in World War I, World War II, and the Gulf War (Desert Storm). Earlier, from 1799 until 1815, Napoleon ruled France and conquered much of Western and Central Europe along with large parts of Eastern Europe.