The American West refers to the region of the United States west of the Mississippi River, extending to the Pacific Ocean. It includes states such as California, Texas, Colorado, and Oregon.
Related terms
Manifest Destiny: This term refers to the 19th-century belief that the United States was destined to expand its territory from coast to coast.
Homestead Act: This 1862 legislation granted free public land in the West to settlers who were willing to live on and cultivate it for at least five years.
Transcontinental Railroad: Completed in 1869, this railroad linked the eastern rail network with the Pacific coast, making travel and trade across the country much faster and easier.