American imperialism is the extension of the economic, military, and cultural influence of the United States over other countries.