Leave it to Dick Armey to sum up the idiocy I'm referring to:
Does America have the best health care system in the world? In my view, the entire question is irrelevant since I cannot afford coverage (though I would still say no). Until health insurance is readily available to all Americans, the question is too narrow to be useful.
There was a link to this video in the middle of the video above that I also wanted to include because it talks about some of the myths regarding health care in America and around the world:
You can read the entire article Cenk is referring to here.
The key difference is that foreign health insurance plans exist only to pay people's medical bills, not to make a profit. The United States is the only developed country that lets insurance companies profit from basic health coverage.

I highly recommend checking out the article. It rocks. (And by rocks, I mean it confirms your worst fears and makes you hope you never get sick again.)