Do most of you feel that politics has gotten in the way of American growth?
The battle between the Democratic and Republican parties, each trying to outdo the other at playing politics, is losing America.
There is a lot of stuff to be fixed in America. Someone once said there is a right way of doing things and a wrong way of fixing things.
Obama once said that doing something is better than doing nothing, but if you're making things worse instead of better, then we're better off waiting for a better plan that works.
I do not think the government of either party knows what it's doing, or knows what's going on in America well enough to do anything except bark at one another. What do you think?