The U.S. Did Not Defeat Fascism in WWII; It Discreetly Internationalized It
One of the founding myths of the contemporary Western world is that fascism was defeated in WWII by liberal democracies, the United States foremost among them. The material record, however, suggests a starkly different reality.