The Southern United States, also known as the American South or simply the South, is a region comprising the southeastern states of the U.S. The region is distinguished by its shared cultural, historical, and geographic characteristics.