Kind of a vague question. But I guess anyone that responds can state their interpretation.

Edit: I guess I’m asking because everything I’ve learned about America seems to contradict what I was told? Idk how to explain it. It feels like the USA is one event away from outright civil war, open corruption, and turning into a D-class country.

  • tobogganablaze@lemmus.org · 7 months ago

    The most fundamental aspect of a nation is being able to enforce its sovereignty against anyone who thinks it’s not a “genuine nation” — and the US probably does this better than most nations in the world.

    So very genuine.

    • Eol@sh.itjust.works (OP) · 7 months ago

      I guess that’s not quite what I’m asking either. It just feels like the “image” of America isn’t what America actually is — like there’s a marketing campaign to make things seem better than they really are.