My wife and I loved the show for a while, but these newer episodes have us deciding whether we want to keep watching. This week it was the reality-show storyline, and in a few past episodes there was the storyline with the homosexual doctor kissing. If only 3% of Americans are LGBTQ+, then why does every show have to portray some form of LGBTQ? To be honest, I don't even care for all the love drama between doctors and nurses. We just prefer the good stories and the different medical cases. We will see what future episodes bring and whether we continue being fans. We used to think of this as a 5-star show, but the stars are falling.