I’m in my 30s, eat right, do bodybuilding, and have never been sick a day in my life. I’m covered by my employer’s health plan. I’ve never had a bill covered by the plan because I do not go to the doctor. So, why is it important for me to have health insurance? I wouldn’t be hurting anybody by not having it. Even though it won’t change my life much, I still resent the Obamacare rule that everyone has to have insurance.
Dear Healthy Adonis,
You have had a long run of good health, which is great. But if something terrible happened – say, your spotter dropped a 25-pound weight on your head – you would need far more medical care than you could afford to pay for on your own. Your doctor and hospital would press you for payment, but after you had paid all you could (which might involve selling your beloved worldly possessions), they would do one of two things: write off your bills or stop providing care for you.
Writing off your bills means they give up on collecting from you. They then count on their other patients – most of whom have health insurance – to bring in enough money to keep them in business. Stopping service, on the other hand, leaves you with no way to get your bruised brain attended to.
By requiring everyone to have health insurance, health reform aims to spread the cost of medical care more evenly across the whole population, not just among those who happen to buy coverage.