How do American citizens feel about how the US looks to the rest of the world?
This is not an aggressive post. It's purely based on how I perceive the US; it's not an invitation to attack me, just an attempt at an objective view. So try your best to answer my questions seriously. Thank you.
When I was a kid my biggest dream was to live in the magical U.S.A. As I've grown older and witnessed all the war crimes, the shameless propaganda channels, and the loose connection the government has with its people, I feel like the USA has sunk to the level of what one would call a "developing country". There is very little democracy, the government can punish you as it pleases, and few really have any political awareness beyond the very aggressive "you vs. me" divide-and-conquer tactics. Very few know anything about other countries. No free healthcare, and life-destroying prices at hospitals. Lawless communities. Mass shootings. Being the aggressor in most illegal conflicts around the world. I could go on and on...
The USA really has become one of the worst projects in history. How do you as Americans feel about this? Does it make you want to move when you see people elsewhere living healthy lives without being afraid of losing all their money on an ambulance ride? Or maybe you don't see it at all because of propaganda?
This is a serious question. I am in no way trying to hurt anyone with this; these are just my observations. I live in a country that has the same propaganda channels as you guys.
Thanks in advance for any serious answers.