History of Black Women in America
Detroit is a major city in the United States, known as the heart of the American automobile industry. It played a pivotal role during the Great Migration, as many African Americans moved there seeking better job opportunities and an escape from racial discrimination in the South. The city became a symbol of industrial growth and of the struggle for civil rights as its population transformed significantly during this period.