AP US History
November 8, 2018
During the 19th century, there was a widespread belief, known as Manifest Destiny, that America was fated to extend from coast to coast. This belief was rooted in white supremacy and fueled by the rise of imperialism. As European powers colonized Africa and Asia, the United States turned west to claim the continent, despite the Indigenous populations that had inhabited these lands for centuries.