Imperialism

KEY CONCEPTS

Imperialism played an important role in U.S. history. After the American Revolutionary War, the United States sought to enlarge its sphere of influence.

Imperialism is an ideology expressing a nation's will to extend its authority and territory, which can be pursued in several ways:

war

purchase of land

annexation