Kind of a vague question, but I guess anyone who responds can state their own interpretation.
Edit:
I guess I’m asking because everything I’ve learned about America seems to not be what I was told? Idk how to explain it. It feels like the USA is one event away from an outright civil war, open corruption, and turning into a D-class country.
I feel like sanitizing the history books, and other areas of teaching too, is a disservice to humanity. It’s like an active setup for failure or abuse. I was never taught anything realistic 20 years ago. It’s like I was cattle for someone else’s sick dream. Sometimes it feels like heartlessness is rewarded and masked as goodness.