There is a lot of social media talk about the abuse black men suffer at the hands of white society, mainly white police. But there is another group adversely affected by all of society, and I don't think it's spoken about enough. Black women throughout history have been under satanic attack from all sides, especially their own.
In music, we are called bitches, hoes, tricks, sluts, and every other name imaginable by our own men. On TV, we are portrayed as loudmouthed, aggressive, oversexed, violent, and ghetto by black actresses who don't care how their character choices affect all black women. On social media, only the most negative images are posted. Fighting, twerking, and sexing online are the only images some people see of black women.
It's hard for black women to be seen in the same light of professionalism as their white counterparts. I have experienced the "look" from white people when applying for a professional position. I have experienced my behavior being viewed as aggressive or unprofessional while white coworkers who exhibit the same behavior are viewed as no big deal. Black women are forced to hide emotion and not express concern or displeasure. We have to accept abuse so that we aren't viewed as "angry" or "bitter."
Where is the empathy and acknowledgment for what black women have had to endure because of other black people willing to sell us out for money, fame, and social media hype?
How can we improve the way black women are presented?