Who Won the War (of 1812)?
In 1812, the U.S. declared war on the United Kingdom, officially beginning the War of 1812. What started the war? Did it play a significant role in history? The answers depend on who tells the story.
Below are excerpts from textbook descriptions of the War of 1812. For each excerpt, select the country where the textbook was written.