
American Imperialism
American Imperialism refers to the United States' policy of expansion and influence over other countries, particularly in the late 19th and early 20th centuries. Expansion involved acquiring territories such as Puerto Rico, Guam, and the Philippines, often through military force or treaties. The U.S. aimed to promote its economic interests, spread democracy, and establish itself as a world power. The period was marked by events like the Spanish-American War, which highlighted America's growing role on the global stage and raised lasting questions about the ethics of expansion and the treatment of colonized peoples.