America used to be a place people came to work so they could support themselves and their families.
Looking back over the last quarter century, it seems to me that we're seeing a reversal of this trend: American-born citizens so overburdened by the failures of their country that they dream of moving somewhere they can support themselves, and maybe finally begin to live their lives.
What are your thoughts?
My daughter speaks German, and briefly considered college in Germany instead of the U.S. She has been to several EU countries multiple times on her own as a teen, and still thinks of moving there after she gets her degree and some work experience. Unfortunately, the top schools for her degree are not as accessible elsewhere. Her major has a 97% job placement rate for surviving grads.
As a metallurgical engineer, she'll have work waiting for her around the world upon graduation, in just about every industry from mining to cosmetics to renewable energy and recycling. But I won't be surprised if she bails on the U.S. immediately, trading a likely six-figure starting salary for quality of life.