I was watching bits of Jon Stewart interviewing Michael Moore on The Daily Show. At one point Moore made a sarcastic comment about the U.S. and health care that went something like this: "where else in the world would people talk about trying to get profit when treating cancer." This is a foolish, if completely pervasive, sentiment. One of the basic things we forget is that things like cancer have become treatable because it is profitable, nay, lucrative, to develop treatments, whether they be diagnostic technology or pharmaceuticals. That's not a bad thing; it's a good thing.
When discussing health insurance, people often speak of how their employer "pays" for their health care among various and sundry benefits. Your employer does not pay for your health care. Your employer compensates you, and a portion of that compensation takes the form of health care benefits. Your employer does not provide health care out of the goodness of his heart. People think of health insurance as some employer-provided freebie when in fact it comes at the expense of the higher wages you would otherwise have been paid.