The Age of U.S. Imperialism marked a significant shift in American foreign policy during the late 19th and early 20th centuries. During this era, the United States expanded its influence globally through territorial acquisition and economic dominance.
Key points:
- Imperialism is the policy by which stronger nations extend economic, political, or military control over weaker territories or peoples
- In the decades after the Civil War, the U.S. turned to imperialism, driven by a combination of economic, social, and political factors
- Major motivations included economic expansion (new markets and raw materials), Social Darwinism, and the idea of "The White Man's Burden"
- Key events included the purchase of Alaska from Russia (1867) and the annexation of Hawaii (1898)