yossarian
Active Member
I disagree that race relations are worse since Obama took office. Maybe they are in certain areas (maybe your area), but that doesn't mean it's the same nationwide.
I personally haven't seen any real shift since he took office.
I agree; race relations aren't worse in this country. And besides, for white middle-class people to start accusing a black man of making race relations worse is pretty ironic.