Since when did the government decide that owning businesses is legal and ethical? The White House is trying to take ownership of health care, already owns several car companies, and holds a good portion of several banks. Why does no one in America seem to have a problem with this?
Talk about a conflict of interest! When the government owns something, it can choose to pass tax laws or legislation that hurt privately owned businesses and help itself. Who is going to oversee the government?
The government is already controlling whether banks can pay back their TARP money and buy back the government's stock. Is it basing those decisions on the financial health of the banks? No, it is basing them on the desire to "earn a profit for the American people." Sounds like a great basis for a major government decision.
I don't necessarily want to go back to the way things were, but I do want to see the government get out of our business. I don't want politicians in charge of where I get medical care, which bank I can purchase stock in, or which cars I'm going to buy. I want freedom of speech, freedom of choice, and freedom of religion. I want what our country was founded on.
I'm willing to give up some of the government "benefits" being provided if I can keep more of the money I earn. I don't want to help Microsoft build a bridge it doesn't need, sponsor new roads in West Virginia, or fund horse trails in Texas. I want to choose whom I support locally instead of having big government decide where 35% of my money goes. I believe certain government programs are fine, but all of this pork-barrel spending is making me sick. I don't blame Republicans any more than Democrats, or vice versa; I just want the cycle to end!