I have watched this show from day one, and it's been my favorite show. That's the reason I kept watching the last couple of seasons, hoping it would get better.
But no more. I won't be watching the next season. Ever since Maggie said to her sisters "check your white privilege" a few seasons ago, the amount of social-inequality messaging that has been shoved down our throats has been astounding. I thought I could look past it and enjoy the medical drama... but there is nothing medical anymore. It is especially annoying since half of the cast is black, the characters holding the major titles are black, the richest and most successful characters are black, and yet they are still complaining?! Meanwhile the white characters came from broken homes and still need to apologize for their whiteness... black people are apparently so discriminated against that even COVID discriminates and only kills black people, according to the show. Okay.