Thursday, October 5, 2017

Germany’s View of Colonies in 1941

Germany lost its African colonies after World War I. Even before the Nazi takeover in 1933, many Germans demanded their return, a demand the Nazis fully supported. A variety of organizations promoted awareness of Germany's former colonies.

In 1941, it seemed entirely reasonable to think that Germany would regain those colonies. The yearbook of the Reich Colonial Federation opened that year by reviewing the indignities to which German colonists had been subjected, concluding that Germany was now in a position not to ask for the return of its colonies, but simply to take them.