I am one of those "racialized" people, and I find this show despicable for portraying race as the only thing that matters. Not to say race is irrelevant, but it is shameful to brainwash people into seeing the world only through the lens of race. What's to come out of this vengeful show, out of saying that only one group has a monopoly on racial hatred and all the other groups are forever its victims?