first i kind of want to vent. i've been so bummed. silly me, for the last several years thinking that times had REALLY changed. you know, with a black man in the white house and all. and i keep hearing about white people becoming the minority soon. but now, it seems like suddenly things are not so progressive. i feel like i've seen more racism (white on black) than i ever have. and i feel like since we've had a black president, white people feel insecure, like they're losing their "place" in the world. i feel like it's a trickle-down effect from the government all the way down to the average joe working a blue-collar 9-to-5.
maybe i've been really naive and my eyes have finally been opened. i'm not sure why i've noticed so much blatant racism during a time when things are supposed to be so progressive, but i honestly don't recall seeing it this much before Obama was president.
what about you? is it just me? i'm really curious to hear your experiences.
also, i'm so annoyed by the hypocrisy in the media about racism. like, someone white will call someone a nigger and EVERYONE in the media (news anchors, reporters, celebrities, etc.) shakes their head and acts like they can't imagine ever saying it, or knowing anyone who would say it, or even thinking it. but it's like: we keep hearing stories about this white person saying it and that white person saying it, so how come NO ONE white ever seems to know anything?
the truth is, they all know people, likely family, who call black people niggers in their minds at least, if not behind closed doors. plain and simple. they all front like, "oh my god! that's terrible that paula deen said that! how could she? *gasp*!" knowing good and well that many white people still use that word. they just try to keep it on the down low. it's not politically correct to dislike anyone publicly anymore, and that's just what it boils down to. racism is very alive and well. ugh.