I am an American, but this past year I find myself wishing I were from Canada or Sweden or anywhere that treats its people with more respect and doesn't penalize you for growing older or for being born with an illness that was no fault of your own.
Have you heard the phrase “American Greed”?
Well, in my country it's not only the rich who are greedy; it's also the poor who feel entitled.
It’s like the country that I was so proud of has changed.
What used to be wrong is now right, and what used to be right is now wrong.
I find myself shaking my head on a daily basis. What happened to "love one another"?
How can someone be against abortion but not take responsibility for feeding our children?
Why is someone so afraid of a woman president?
Why hate a black president when he did a great job?
WTF is wrong with us? Is it the water or the food that makes us Americans so damn angry?