Just ran across a great editorial by a doctor in the L.A. Times titled "There's No Place for Rampant Capitalism in Treating the Sick" that everyone should read. You know our system in America is pretty messed up when even the doctors start saying it isn't working. Medicine isn't about saving lives anymore; it's about making money. And we all have to pay the bill.
I've ranted before that medical care in America should be taken over by the government, the same government that runs our military, our fire and police departments, and other services that maintain the quality of our lives. Would it be perfect? No, there isn't a perfect health care system on the planet, but we might finally be able to make it more affordable.