I honestly think this TV show is quite problematic and baseless. I don't understand what message they're trying to send. Why would you make a show about countries of Africa dominating the world? What do you intend to accomplish with that?
Instead of making a show about the unjust treatment brown-skinned people face every day, about police brutality, or better yet, telling the actual history of the countries whose cultures you're taking from EXACTLY HOW IT WENT DOWN and not some sick, twisted version of it, you're making a show about white people facing injustice because they are colonised by African countries.
You're basically retelling history to make us Nigerians, Ghanaians, or anyone else of African descent look like the bad guys. Why on Earth would we want to colonise another country?
I really think this show should be cancelled.
It's useless, stupid, quite frankly annoying, and just downright wrong.