When Did White Men Become The Bad Guys in America?

Article here. Excerpt:

'Here in America, there are no massacres happening and white men are certainly not a minority, but white men are regularly passionately smeared, attacked and degraded in our country. White people (men included) are the only group in the country that is discriminated against via Affirmative Action by official government policy. In all fairness, Asians are also discriminated against by universities that often wave them off in favor of less qualified applicants from different racial groups, but there’s no widespread cultural assault against Asian Americans. To the contrary, when it comes to white men, outright hatred based on our skin color is commonplace. Just look at some of the stories we’ve covered at Right Wing News over the last couple of months.
...
It’s not like we’re looking for these stories or covering every one that comes down the pike; there are just so many hateful attacks aimed at whites in general and white men in particular that they’re bleeding into the news.

When you’ve spent your whole life believing that everyone should be judged by the content of his character and the merit of his actions, not the color of his skin, it’s a little shocking to find out that almost no one seems to believe in this idea any more except for mostly white conservative Americans.'
