What started out as a great show has shifted from medical issues to solving every perceived wrong in America. The cast is great, but this season has really gone downhill. Apparently the writers have lost their way and decided to make each episode a rant on leftist "woke" policies. They need to return to what the show focused on initially: medical issues and helping those in need of treatment.