Any nuggets of truth or helpful insight are swallowed up in the cast's sea of resentment toward White people. The show is presented as reflecting how most "people of color" see America, but it offers no evidence beyond the personal opinions of the cast members themselves. I just don't see how this show can improve race relations, and perhaps reconciliation is not the goal. On the other hand, if the show's intent was simply to give resentful "people of color" a platform to express themselves without consequence, then that goal has been realized.