For instance, I haven't heard any of them publicly denounce Obamacare. Do they not care that Obamacare's promise of "free" birth control actually results in higher premiums and overall costs to women since their insurance policies are forced to pay for it? Is it okay that Obamacare is cancelling millions of health insurance policies, leaving women with no insurance or forcing them into policies that carry extraordinarily high out-of-pocket deductibles?
What about the fact that Obamacare forces women to pay for maternity coverage, even if they choose not to have children? What happened to the liberal mantra of women's choice? Don't these women getting hysterical over a meaningless word like "bossy" think that women should be allowed to choose a cheaper health plan that fits their individual needs? Funny how they don't seem to mind that the Obama administration is being completely bossy over them regarding their healthcare.
On top of that, women (and men) are having their work hours reduced or their jobs eliminated altogether because of Obamacare, so how does that help them? Oh, that's right: Medicaid is being expanded to help women pay for healthcare, so now they can be dependent on government rather than being in charge of their own lives. But hey, as long as nobody is calling them names, it's all good.
Meanwhile, people like Beyonce, Miley Cyrus, and others of their unsavory ilk gyrate in near nudity for any camera that's pointed at them. They slither up and down poles, stick their privates into any object they can find, and leave nothing to the imagination. Dignity be damned. In fact, they're praised by other women for being creatively expressive in their sexuality. These "entertainers" make verbal reference to everything gauche, from using the word "whore" to things I won't even write here. But don't you dare use the "b" word to describe them (and I don't mean the one that rhymes with "witch"). Imagine the deafening uproar, scandal, and offense that would cause.
Personally, I am so tired of women trying to convince other women that they are victims of some imaginary patriarchal society, when the truth is they thrive on that narrative because it helps them sell more books, or music, or whatever other nonsense they're peddling. Yes, there are some people who are sexist (and racist, and any other "ist" you can conjure up). Find a way to deal. There are bigger problems in the world; serious ones, in fact. Besides, if you cannot withstand a label or two, you'll never make it in this world anyway.
What do you think? Share your thoughts in the comments below.