“American” is the standard English word for people and things from the United States, but it can just as easily be read as referring to all of North America, or to both North and South America. Most other languages have distinct words for American (the country) and American (the continents). If there were a campaign to replace the word “America” with something else when referring to the US, what would you think of it?

    • AgreeableLandscape (OP) · 4 years ago

      Well, if that ever becomes mainstream, people will have a convenient play on words to insult them with whenever the US does something terrible!