I really don’t understand how people can say racism is gone; just because you don’t witness something doesn’t mean it didn’t happen. The reason racism is still a problem is that it’s not talked about enough. Ignoring a turd won’t stop the smell, yo. When you meet someone, the first thing you see is their face, and the first thing you notice is their racial clarity or ambiguity, and that shouldn’t be so weird to talk about.
The history of racism needs to be taught in school. People should stop trying to act like it didn’t happen, because that denial is frightening; it pisses off a lot of black people and leaves white folks uncomfortable and scared.
This show is a brilliant commentary on society; however, no one needs to get offended, because it is just one page of the whole book.