Americans only started calling our country "America" around the turn of the 20th century. Before that, we called ourselves the "Union," the "Republic," and even "Columbia" or "Freedonia."
The shift to "America" came around the time the US became a bloody, brutal global empire, committing genocide in the Philippines and conquering Puerto Rico and Hawaii.