The show has a great storyline and the actors do a great job. This last season has been hard to watch because every white person is portrayed as a racist. They have covered every single racist issue they can, and it is all white people's fault. People watch TV for entertainment, not to watch Hollywood continue to push their thoughts into the shows. I don't know if the original story was written this way. If they continue to keep the background of the show about racism, then I will not watch it. It's a disgrace. It's not a true representation of our world.