
I will concede more Americans are insured, but from what I have read and seen, that has not led to better care, cheaper costs, or better health for Americans. Costs are up for individuals and the government, but that is due to regulation that has been in place for many years, not necessarily just Obamacare; Obamacare just exacerbates those problems. But Americans have always been able to go to the doctor without insurance. I never had an issue. I never had insurance, and I was fine going to the eye doctor and getting contacts while only working a low-paying job. I also made it to the dentist, and when I got beat up once I had no issues going to the doctor and getting it taken care of. What Obamacare will do in the long run is kill off medical advancement, and that is the real tragedy.


But insurance isn't for when life is going peachy and you just need to get contacts; it's for when shit hits the fan and you couldn't possibly afford the care that's necessary to live a comfortable life. Believe me, if you ever get diagnosed with anything like that, you'll be glad that insurance companies aren't totally left to the free market, or they'd drop you like a bad habit the second they could. And for that system to work, those who are healthy have to pay in as well as those of us who are less so.



...