There is a lot of social media talk about the abuse black men suffer at the hands of white society, mainly white police. But another group is adversely affected by all of society, and I don't think it's spoken about enough. Black women throughout history have been under satanic attack from all sides, especially their own.
In music, we are called bitches, hoes, tricks, sluts, and every other name imaginable by our own men. On TV, we are portrayed as loudmouthed, aggressive, oversexed, violent,
and ghetto by black actresses who don't care how their character choices affect all black women. On social media, only the most negative images are posted. Fighting, twerking, and sexing online are the only images some people see of black women.
It's hard for black women to be seen in the same light of professionalism as their white counterparts. I have experienced the "look" from white people when applying for a professional position. I have had my behavior viewed as aggressive or
unprofessional while white coworkers who exhibit the same behavior are viewed as no big deal. Black women are forced to hide emotion and to suppress concern or displeasure. We have to accept abuse so that we aren't viewed as "angry" or "bitter."
Where is the empathy and acknowledgment of what black women have had to endure because other black people are willing to sell us out for money, fame, and social media hype?
How can we improve the way black women are presented?
I'm not one to watch much TV, but I was enjoying "Designated Survivor," mainly because of its lack of overly explicit sex scenes and political content. The show flirted with the idea of a love connection between Mike, a black Secret Service agent, and a young black female White House staffer. They only flirted with that idea, of course, and instead introduced black love in the form of homosexuality. The show's only black "couple" are two black men. Hollywood sickens me with not only its perversion but its racism as well.
Those in power would rather perpetuate "black love" through the eyes of perversion than allow the public to be exposed to the strength of the loving combination of the black man and the black woman. The show has had black females and black males, but what I notice is the deliberate plot to keep them out of heterosexual relationships with each other. Meanwhile, the show's other ethnic groups and whites have explored and alluded to heterosexual relationships.
This is not the only show that seems to "separate" the black man and woman when it comes to healthy marriages and relationships. When black men and women are in a TV relationship, it is usually portrayed in a very
stereotypical way: the woman as aggressive, angry, argumentative, and emasculating; the man as either submissively effeminate or abusive and womanizing.
Hollywood will showcase black men in homosexual relationships where one of them ALWAYS has to have AIDS, as in "Empire" and "Designated Survivor." Black love is portrayed between two gay men whose love has to overcome sickness and disease. Their relationships never have a "goal" or "vision," just lust, sickness, and disease.
What saddens me even more is that some of the most popular shows have black writers and producers who are selling us out for profit. They perpetuate perversion and negative stereotypes to please the sadistic elite who own the platforms used to showcase their trash. They see no obligation to adequately represent black men and women together in healthy relationships.
The church is silent, failing to speak out against Hollywood's perversions. Where is the set standard? Where are the cries from the body of Christ? No one seems to care about the degradation of blacks by blacks in Hollywood.