Everyone is being called a sexist for giving a bad review to a bad movie that's literally a turd. It's not the audience's fault for feeling alienated or for calling out very obvious political statements - we don't care about your agenda, so stfu about it already. Hollywood needs to get its head out of its rear end, stop trying to ride the coattails of previous successes, and stop ramming political messages down everyone's throats - we watch movies to have a good time, not for the real-life drama of insecure extremists who are too arrogant to hold themselves accountable.