Question about using WWII as propaganda to refocus everyone on Nazis instead of the US.

This is somewhat of a delicate matter, as nobody wants to be seen as downplaying how bad the Nazis were. But I'm curious about the idea that the US played up WWII as a way to rewrite history in a more favorable light.

The general idea is that the US and the UK re-framed the event around their own heroism, making it a story about fighting tyranny. This was mainly to cover up the contributions of Communist and social democratic countries. Framing it as a fight against tyranny also obscures the material reasons the UK and the US opposed Hitler: basically, the industrial capitalists of the UK and the US needed to crush German industrialism. Then there is the matter of the Holocaust. There was a tremendous focus on the Holocaust to make it seem like the Allies (minus the USSR, of course, because they actually would care) were on a moral crusade, in spite of the documented antisemitism of the US (I don't know about the UK, but I'd assume the same) prior to WWII. Speaking of antisemitism, the Holocaust was also re-framed to focus on Jewish victims rather than all of the victims; famously, the poem "First they came for the communists..." was edited in public memory. This plays into Zionism, which was rising before WWII, with the UK playing a heavy role in helping Zionists take over Palestine.

I was reading some posts on r/TrueAnon about how Jewish people in the US in the 60s and 70s had no idea about the Holocaust. It wasn't something heavily emphasized even among popular historians of the time. There were some books they mentioned, but I forgot to jot them down. I think Finkelstein has talked about this idea as well, including some of his own experiences. So the US would go on to place heavy emphasis on the Holocaust and Jewish persecution in the 70s as part of a broader project around Zionism.

Then in the 80s and 90s, there was a lot of pop culture about WWII. Adam Curtis touches on this, but I don't really regard him as a totally reliable source.

I'm curious about all this because it seems to be the only piece of history anyone can relate to modern politics. When I learn about the US before WWII, it seems like the Nazis weren't exactly unique in their beliefs or cruelty. It used to be fairly common knowledge that the Nazis drew a lot of inspiration from the US and UK. Concentration camps weren't new. Eugenics wasn't unique to Weimar Germany. Industrial war had already occurred in WWI. I don't know about industrialized genocide, maybe that was unique, but US companies helped with that too.

I guess when it comes down to it, when we call people Nazis, I just see Gilded Age Americans. I think even right-wingers identify as Nazis only because the culture has been so refocused on that; even they don't know right-wing history beyond it. All they really know is that Hitler was racist to the point of genocide and that invoking him makes everyone they don't like mad.

Are there any books or anything on this? Am I getting a lot of stuff confused/wrong?
