America used to be a place you could come to, work, and support yourself and your family.
Looking back over the last quarter century, it seems to me that we're seeing a reversal of this trend. American-born citizens are so overburdened by the failures of their country that they dream of moving somewhere they can support themselves and maybe finally begin to live their lives.
What are your thoughts?
Fuck this country. Get out if you can. We are at the end stages here, with the full-on death of capitalism and the birth of fascism coming.