This is where it all went wrong: when health insurance got tied to employment.
Were health insurance and health care better before 1943?
Those are two very different things.
Medical technology advancements aside, the insurance industry that expanded from the 1920s onward began to supplant the notion that health care could come without insurance.
It just wasn’t as necessary… most ailments didn’t reveal symptoms until it was too late to treat them anyway. And now there are so many perverse incentives that the whole industry is diseased itself.
Has it been better since?
Health care has absolutely gotten better. I honestly don’t know about insurance, though, which is why I asked. The two are tightly related, so I included both in my question.