• Pikache [none/use name]@hexbear.net
      1 month ago

      For sure Hollywood has played a huge role.

      1. The desensitisation to war and violence.
      2. Celebrity worship. Politicians are treated like celebrities, and when they do evil things, it’s covered like an episode of Entertainment Tonight.
      3. The fiction of the American saviour (e.g. superheroes and all those damn apocalypse movies where the USA saves the world).