I love watching this show, but season 9 disappointed me. The point where they focused on racism really bothered me. With everything going on in the real world, I would love to sit and watch a show without it. I hope in future seasons they will not make the show about that. Don't get me wrong, I respect it all, but I don't want to keep hearing about it over and over. Thank you.