Black Women

Black women have been the backbone of American society since its founding. From serving as wet nurses to saving America from falling into the hands of fascists, Black women have done phenomenal things. Marley K writes about the burdens carried by Black women, the most unappreciated and unprotected group in America.