The worst show ever. Whatever happened to actually doing the work and reporting the truth, instead of going for the wow factor and scare tactics? Are the reporters so afraid of losing their jobs that they won't report the truth and the facts? Why should you care about politics or being politically correct? The truth is what people want, not what the rich or your bosses tell you to report. I understand it's all about money, but sponsors will stay with popular shows.