What first comes to mind when you hear the word “hospital”?
Your reaction may depend on your past experiences. You may feel gratitude for the birth of a child or the treatment of acute appendicitis. You may feel sorrow, remembering a loved one who passed away on a hospital bed.
Regardless of our experiences, many of us assume that the closer our hospital is to where we live, the safer and better off we are. But that assumption is wrong: fewer hospitals with higher patient volumes would deliver higher-quality care and better clinical outcomes.
Some hospitals were born to fail
In the early 1700s, hospitals provided little medical care. Instead they served as isolation facilities for those with contagious illnesses, as shelters for vagrants and those with mental illness, or as almshouses for the poor. Those who could afford medical care (middle- and upper-class families) received it in their own homes, including surgery.
By the end of the 19th century, medical care was becoming too complex to be delivered in the home. As a result, care shifted to centralized facilities where patients benefited from the latest medical advancements and around-the-clock physician and nursing availability.
A century ago, traveling even moderate distances was incredibly slow and expensive compared to the cost of hospital care. Therefore, building a hospital in every town made sense. Hospitals became a source of great civic pride for the community leaders who made up their governing boards. And so the "community hospital" was born.
Founded by physicians, religious groups and public municipalities, the number of U.S. hospitals grew exponentially from 178 in 1873 to 4,300 in 1909 to 6,000 in 1946. The passage of the Hill-Burton Act in 1946 helped further expand that number to 7,200 by 1970.
With the introduction of the publicly funded Medicare and Medicaid programs in 1966, the number of individuals with health insurance skyrocketed, and with it both the demand for inpatient services and the cost of hospital care.