AP US History
Women's rights refer to the social and legal entitlements claimed for women and girls worldwide. In the United States, the women's rights movement has fought for voting rights (suffrage), equal pay, and reproductive rights, among other issues.