
German imperialism
German imperialism refers to Germany's efforts to expand its influence and territory, particularly during the late 19th and early 20th centuries. Driven by a desire to compete with established colonial powers such as Britain and France, Germany sought colonies in Africa, Asia, and the Pacific, a policy pursued most aggressively under Kaiser Wilhelm II as Weltpolitik. During this period Germany acquired territories including German East Africa and German South West Africa. The quest for empire was fueled by economic interests, national pride, and a belief in racial superiority. Ultimately, German imperialism heightened geopolitical tensions with the other European powers and contributed to the outbreak of World War I.