“American” is the standard English word for people or things from the United States, but it can also be read as referring to all of North America, or to both North and South America. Most other languages have separate words for American (the country) and American (the continents). If there were a campaign to replace the word “America” with something else when referring to the US, what would you think of it?
To add: Daniel Immerwahr points out that US leaders began referring to the country as “America” (whereas “America” had previously referred to the Americas) rather than “the United States of America” after the US started acquiring territories outside its borders.
The term “America” doesn’t tie the country to any specific place, which suited the US’s imperial ambitions. Here’s a great video on it: https://www.youtube.com/watch?v=Df4R-xdKvpM