AP US History
Western territories refer to the regions lying west of the original thirteen colonies at various points in US history. These lands were acquired through exploration, purchase (such as the Louisiana Purchase of 1803), treaty (such as the Treaty of Guadalupe Hidalgo, which ended the Mexican-American War), or annexation (as with Hawaii). They played a significant role in shaping American expansionism.