AP US History

World War II

March 20, 2019




On December 7, 1941, the United States was pulled into the war that had already engulfed much of Asia and Europe. Fighting alongside the Allies, America worked to defeat the totalitarian governments of Japan, Italy, and Nazi Germany. The war transformed American society: many women entered the workforce for the first time, and much of the economy shifted production to support the war effort. World War II not only brought down fascist governments overseas but also helped pull the United States out of the Great Depression and permanently changed parts of American culture.


