Grey's Anatomy had been my favorite show for years. I kept rewatching all the seasons over and over (at least 3 times now)... up until season 14, when it took a political spin. I am extremely disappointed in the show now. Grey's Anatomy has taken an anti-police stance and makes many remarks bashing our current president, and that disappoints me to the core. Now I have no desire to continue watching it. Why can't anyone just keep politics out? Grey's Anatomy would still be my favorite show if it hadn't been for the expression of politics in it.