Should the U.S. Stay out of Other Countries?
I have been paying attention primarily to U.S. media, along with other sources from around the world, and I always seem to hear almost the same opinion. The most obvious case is Iraq, where violence continues as we speak. In my opinion, these "terrorists" scattered from region to region around the world make one ask, "Why is the U.S. there?" My question to anyone who reads this: should the U.S. stay out of other nations? I believe yes, to a certain extent. It is my opinion that the U.S. should not have ties with any nation it can't do business with in a civil and humane manner. What do you think?