8 Ways to Reduce Health Care Costs

Currently, most discussion in the United States regarding health care reform is focused on the extent to which health care costs are covered by public or private insurance. However, it will make little difference whether health care is funded privately or publicly if the cost of health care is not held in check. Even under the current predominantly privatized system, the federal budget will be overwhelmed in a few decades by Medicare outlays, which are rising much faster than inflation due to increased health care costs. If the government cannot afford to pay for seniors’ health care, it can hardly provide it for everyone else. Our energies would be better directed, therefore, at reducing the cost of health care.

There are several ways to reduce health care costs, some almost immediately; others will take more time. These approaches are primarily directed at undoing the antiquated system of medical education and hospital regulation, much of which is grounded in the politics of the 1930s. Supposedly non-profit hospitals and universities are able to gouge the public for their services, yet receive federal subsidies. It would be considered abominable if a non-profit charity or church indulged in the extravagances of these institutions, many of which have billion-dollar endowments. Anyone who has seen an $8000 bill for routine testing or a night’s hospital stay should know that the major culprits behind health care costs are the service providers, not the insurance companies.

Indeed, the oft-maligned HMOs are themselves the product of the last great effort at liberal health care reform, the HMO Act of 1973. Prior to the HMO Act, most health insurance covered only major illnesses or surgeries, just as auto insurance covers only accidents, not routine maintenance. Ordinary care could be paid out of pocket, or, in the case of the extremely poor, not at all. This was not burdensome, since the cost of a routine doctor’s visit was modest. Once larger employers offering health benefits were required to include HMO coverage, much more extensive insurance became the norm, driving up the cost of premiums. Further, since the patient no longer paid for routine health care out of pocket (except a fixed co-payment), he did not care what exorbitant fee the hospital charged the insurance company. Since patients no longer had any incentive to keep costs down, and indeed were often unaware of the price, the cost of health care could rise freely, and the insurance companies would pass this cost on in their premiums.
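
A trivial numeric sketch of the incentive problem just described, using entirely hypothetical prices: under a fixed copay, the patient’s out-of-pocket cost is identical no matter what the provider bills, so the price-sensitive customer disappears.

```python
# Illustrative only: hypothetical prices showing why a fixed copay
# removes the patient's incentive to shop on price.

COPAY = 20.00  # assumed fixed copayment per visit

def patient_cost(billed_price: float, copay: float = COPAY) -> float:
    """Out-of-pocket cost to an insured patient for one visit."""
    return min(billed_price, copay)

for billed in (60.00, 250.00, 800.00):
    print(f"Provider bills ${billed:7.2f} -> patient pays ${patient_cost(billed):5.2f}")

# The patient pays $20 in every case; the insurer, and ultimately the
# premium pool, absorbs the difference, so the provider never faces a
# price-sensitive customer.
```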

Going further back, our medical education system has relied on a bizarre arrangement requiring four years of medical school in addition to university, followed by three or four years of grossly underpaid internship or “residency”, after which the young doctor emerges with staggering debts, often well over $100,000. Given this burden of education and debt, it is no wonder that a large proportion of American doctors choose to become high-priced specialists, leaving a deficiency of general practitioners. Only the extremely intelligent can survive the rigors of U.S. medical schools, where the emphasis falls on the mathematical and analytical aspects of medicine rather than on prevention, nutrition, and the humane side of care. The typical medical school student is intellectually overqualified to be a GP; many highly competent doctors in other countries would not pass muster in U.S. schools, even though experience proves they are quite capable in their profession.

Nurses possess the education necessary to diagnose and treat most common ailments, and pharmacists have the education to prescribe medications, yet both are prohibited from exercising their craft by an arcane medical system that requires an MD to be involved in every diagnosis and treatment. Given that this same system induces a shortage of GPs, this can only drive up costs. What is worse, these science-oriented, non-humanitarian doctors tend to think everything is to be solved with expensive testing and drugs, especially since ordering tests and drugs takes less of their time than getting to know the patient and his behavior. I once had an “old school” doctor who resigned out of disgust with the increasing pressure to become a pill dispenser.

The residency program, which is supposed to be a time of apprenticeship, is often just an opportunity for hospitals to exploit cheap labor. Residents often work 100 hours a week with little sleep, thereby impairing their performance, and much of their time is spent on menial administrative tasks unrelated to patient treatment. With yet another three years of lost income, it is no wonder that they all wish to cash in on a high-paying position. Because their debt compounds in the meantime, they will ultimately have to be paid more than if they had been paid justly in the first place, as the sketch below illustrates.
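
A rough, hypothetical illustration of that last point; every figure below is an assumption chosen for round numbers, not data. The debt accrues interest untouched during the residency years, so the salary required afterward to retire it exceeds what fair pay up front would have cost.

```python
# Hypothetical figures: how debt that compounds during residency raises
# the salary a young doctor must later command.

debt = 120_000.00      # assumed medical school debt at graduation
rate = 0.06            # assumed annual interest rate on the loan
residency_years = 3    # residency length, per the essay

# During a low-paid residency the salary covers living costs only,
# so the debt compounds untouched.
for _ in range(residency_years):
    debt *= 1 + rate

print(f"Debt after residency: ${debt:,.0f}")     # ~ $142,900

# Fixed annual payment to retire that debt over 10 years
# (standard annuity formula).
payoff_years = 10
payment = debt * rate / (1 - (1 + rate) ** -payoff_years)
print(f"Annual loan payment:  ${payment:,.0f}")  # ~ $19,400 per year
```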

In sum, I propose the following cost-cutting measures:

  1. Abolish Direct-to-Consumer (DTC) advertising for prescription drugs. Such advertising was effectively barred until the FDA relaxed its broadcast rules in 1997, and consumers (not to mention doctors) are often poor judges of what medications they should take. Billions of dollars would be saved, since the cost of television advertising would no longer be built into drug prices.
  2. Reform the residency program so interns have the right to demand competitive salaries and reasonable hours. Increasing their salaries should actually lower costs in the long run, as doctors will begin their careers less debt-laden.
  3. Allow RNs and pharmacists to diagnose and treat ailments within their competency. This will reduce costs, as they have lower salaries.
  4. Reduce the cost of medical school (through student grants) or allow admission to medical school directly from high school. Less stringent mathematical requirements should apply to GPs, as mathematical geniuses do not necessarily make better family doctors. A certain amount of scientific hubris would have to be swallowed here.
  5. Set limits on tort claims related to medical malpractice. This will lower malpractice insurance premiums, greatly reducing the salary demands of doctors.
  6. Stress preventive medicine in medical education, including nutrition and exercise. Less emphasis on surgery, drugs, and high-tech testing as solutions to preventable diseases will greatly reduce costs.
  7. Require full advance disclosure of costs to the patient. Often the patient neither knows nor cares what the cost is, which allows gouging by the hospitals. Ideally, insurance should not cover routine care, which would force caregivers to drive down their prices to compete for consumers. Emergency care should be price-regulated, since the patient often has no choice of caregiver in such a situation.
  8. Demand more stringent accounting from hospitals and universities to justify their non-profit status. If an institution is federally funded, salaries derived from ordinary revenue should be held to the federal executive pay limit. Alternatively, abolish their non-profit status and dispense with the myth that hospitals and universities are not just businesses.

Mushroom Clouds and Moral Mediocrity

Leave it to a comedian to state plainly that Truman’s use of atomic weapons was a war crime, only to backpedal out of political expediency faster than you can say Arlen Specter. This short-lived moment of lucidity has occasioned a lively discussion on the topic, by no means confined to the political left. Although many paleoconservatives in Truman’s day and beyond expressed horror at the bombing of Hiroshima and Nagasaki, conservative opinion today generally favors nationalism over morality, and we are invited to examine the supposed moral complexity of the issue. In today’s political right, only a few social conservatives and libertarians are willing to state the obvious moral truth that mass incineration of civilians is not justified by military expediency.

It would be easy to argue that the callousness of mainstream conservatives toward the rights of foreign prisoners and noncombatants is a major cause of their recent decline in popularity. It would also be wrong. Sadly, most Americans are willing to hedge basic moral principles for the sake of security. To the chagrin of the liberal news media, a majority of Americans polled approved of the recent NSA phone surveillance program, even well after details of the program were leaked by the press. A strong majority of the public has supported keeping the Guantanamo prison open, and slim majorities have favored the use of waterboarding and other harsh interrogation tactics. As for pre-emptive wars of choice, public support for the Iraq war from 2003 to 2007 closely tracked the perception that the war was going well. It would seem that we support illegal wars in the short term as long as we appear to be winning, though the perceived success of the 2007-08 surge did not restore the war’s popularity.

Unprincipled compromise of moral values can be found even among academics, who are supposedly more thoughtful. Although the vast majority of American historians are politically liberal, their vaunted concern for human rights diminishes when discussing Truman’s use of the atom bomb. Indeed, the man who wiped out two cities and later led the nation into the disastrous Korean War is cited by most historians as an example of an unpopular president who was later vindicated by history. This supposed vindication consists only in the approval of liberal historians, who are evidently as prone to place partisanship over principle as their conservative counterparts. We can only imagine what they would write if a Republican had dropped the bomb.

The general coarsening of morality, even among the educated and among those who claim to preserve traditional social values, is a worrisome development. Some paleoconservatives, such as Pat Buchanan, have concluded from this that the left has won the culture war through its domination of academia and the entertainment media that shape public opinion. Those who would defend the classical virtues find themselves in a constant struggle against societal tendencies, and they must risk ostracism and ridicule merely for holding what has been held by practically all the great moral philosophers in history. The tyranny of the majority of which Tocqueville warned is evinced in the belief that the rectitude of same-sex “marriage” can be settled by persuading a majority. The majority, as we know, is notoriously fickle. Fifteen years ago, even liberals shrank from same-sex “marriage”; now, the propaganda machine would like to portray any opponent of such unions as a Neanderthal.

All too often, shifts in opinion on moral matters (and associated historical, sociological and anthropological judgments) hinge upon nothing more than emotion and propaganda. A thing becomes right or wrong simply because the current majority says it is. Such a hermeneutic is utterly unworthy of an adult human being, yet democratic culture makes it seem natural. Few even among the paleocons will go so far as to identify democracy as the root of moral relativism. Most have held some form of the naive view that the majority would freely accept virtue if only it were presented to them clearly. In actual experience, the morality of the masses, when uncoerced, gravitates toward mediocrity. We can see this with the gradual shedding of social constraints and the coarsening of mores over the last forty years. This coarsening is expressed in dress, diction, and bearing, as well as more quantifiable sociological phenomena. On the Internet, the more popular sites invariably attract cruder and more degenerate discourse. While democracy romanticizes the virtue of the masses, reality teaches that we can hardly expect great virtue from a people fed a steady diet of mind-numbing television and Twittering.

The ancient Athenians recognized that democracy, or indeed any form of government, could work only if it was governed by laws, which they called nomoi. The nomoi were not the acts of a legislature, but basic moral precepts that defined the legal principles of society. Even the popular assembly did not presume to have direct authority to change the nomoi, though they sometimes appointed a committee of jurists to recommend additions to the laws. Even this limited power was too much in the eyes of Plato and Aristotle, who emphasized that nomoi, especially those that are unwritten, must bind even the people as a whole. For this reason, they posited the necessity of founding a polity with a lawgiver such as Solon, a man (or men) of eminent ability, whose superior wisdom would establish basic laws that are better standards than most would choose for themselves.

When basic moral principles are considered immutable, or at least not subject to popular sovereignty, the nomoi rule, and the people merely implement them. They are to be amended only after grave circumspection by the most competent men. The basic morality of society as a whole is shaped in large part by the excellence of the lawgiver. If, on the other hand, the people are given full sovereignty even over right and wrong, we will invariably gravitate toward social mores that reflect the moral mediocrity of the majority. Few would work if there were no need, and few would strive for excellence unless they were constrained to do so. We should expect a society just moral enough to keep the economy functioning, and indeed we increasingly expect our statesmen to be little more than business managers.

When a political or religious institution commits some crime, demagogues like to say that the institution has lost its moral authority or credibility. What, then, shall we say about the moral authority of the masses? Most of the great crimes of powerful institutions were committed with popular consent, and even when the people are sovereign, they seem unable to discern whether it is moral to exterminate hundreds of thousands of unarmed people, or to invade a nation without provocation. Given this dismal track record, I should not like to entrust any of our civilization’s most revered values to the whims of popular sovereignty. To quote Horace, Quid leges sine moribus vanae proficiunt? What do empty laws avail without morals? Modern political society seeks in vain its salvation through statutes and policies as long as it pursues moral mediocrity. The notion that the people are sovereign even over morals has led to the enshrinement of our baser instincts as rights. Those on the political left wallow in the sins of eros, while those on the right commit those of thanatos. If society exists for something more than the fulfillment of animal impulse, it ought to strive for something better than the natural human condition.

Keynesianism Reconsidered

The current global financial crisis has proven profoundly resistant to typical Keynesian solutions. Interest rates have been reduced to nearly zero, the money supply is being expanded indefinitely, and fantastic levels of government spending are proposed, yet the stock market continues to fall, investors flock to gold and other safe havens, and banks refuse to extend credit. Whatever merits Keynesianism may have in a crisis of supply or demand, it is not a panacea for every type of economic problem, as was proved by the stagflation of the 1970s. Not all government interventions stimulate production and consumption, nor is it always economically wise for the government to attempt such stimulation.

The current crisis is the product of a burst credit bubble, fueled in large part by U.S. mortgage-backed securities and derivative financial instruments. Most Americans can acquire substantial wealth only through equity in a home or other real estate, and this has been facilitated by long-term mortgages, the primary mechanism of home “ownership” in the U.S. since the 1930s. In the last twenty years, a bewildering array of more complicated mortgages, with variable interest rates and even interest-only payments, has made home “ownership” more accessible, resulting in a housing boom. However, many of these mortgages were issued to people who ultimately could not pay, either because their credit risk was ignored or because the terms were too complicated and interest rate changes made the mortgage unaffordable. An increase in defaults, combined with a collapse in the value of mortgage-backed securities and further magnified by irresponsible overleveraging in the financial sector, produced a crisis in which investment banks no longer knew the value of their securities (since these were complex mixtures of many mortgages) and institutions were reluctant to lend. Housing prices fell even more, since the cost of housing in the U.S. had been artificially driven up by the ease of obtaining large mortgage loans, yet homeowners were still obligated to pay according to their original terms. Real estate has ceased to be an effective means of adding wealth, as the long-term value of a home does not appreciate faster than gold.
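
To make the affordability point concrete: a minimal sketch, using the standard fixed-payment amortization formula and entirely hypothetical loan terms, of how an adjustable-rate reset raises a monthly payment.

```python
# Hypothetical loan terms: the standard amortization formula shows how
# an adjustable-rate reset can push a monthly payment out of reach.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Fixed monthly payment that amortizes the loan over its term."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # number of payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 300_000.00  # assumed loan amount

teaser = monthly_payment(principal, 0.04, 30)  # assumed 4% introductory rate
reset = monthly_payment(principal, 0.08, 30)   # assumed 8% post-reset rate

print(f"Payment at 4%: ${teaser:,.0f}/month")  # ~ $1,432
print(f"Payment at 8%: ${reset:,.0f}/month")   # ~ $2,201

# A four-point reset raises the payment by roughly half, which is how an
# "affordable" mortgage becomes unaffordable when the rate adjusts.
```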

Since Americans have little in savings, and many of them have lost their primary means of obtaining wealth, the most rational thing for them to do is spend less and save more. A Keynesian stimulus of consumption would be an irrational demand that people act contrary to their best interest. Similarly, the burst of the housing bubble and the corresponding depreciation of securities are the result of an overvaluation of assets that is now being corrected. It is only natural, therefore, that banks should wish to build up their assets before lending extensively. We should not be surprised, then, that beneficiaries of last year’s Troubled Asset Relief Program (TARP) have done just that. Interest rates were fairly low even before the crisis, and the government has made it known for several months that it will print money ad infinitum if necessary, yet banks are still reluctant to lend. This is because the Keynesian remedy of easy credit and an infinite money supply cannot induce private entities to act against their own interests.

Nonetheless, there is a danger of a vicious cycle: as banks rationally restrict credit, businesses that can no longer rely on it are forced to contract. A contraction in industry raises unemployment and weakens the basis of the economy, causing further contraction (a toy model of this feedback appears below). On the one hand, it is certainly a good thing for businesses to be less dependent on credit; on the other hand, the optimization of private enterprise depends on a consolidation of capital, which often means large costs up front, requiring lines of credit or some other form of investment. It would therefore be wise for the government to pursue policies that encourage or stimulate capital investment. Not only is this critical to breaking the downward spiral of consumption and production, but it would also show a commitment to long-term growth. When a government program guarantees long-term benefit to a company, that company will feel secure in expanding its operations, including hiring more employees. People are intelligent enough to realize that a one-time payout will not help them in the long term, so that sort of stimulus will be ineffective, as people will not use the money for capital investment.
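
The feedback just described can be caricatured in a few lines; the coefficients below are pure assumptions, chosen only to exhibit the qualitative spiral, not to estimate any real economy.

```python
# Toy model of the credit contraction spiral; every coefficient is an
# assumption chosen for illustration. The point is the qualitative
# feedback: less credit -> less output -> still less credit.

BASE = 100.0              # index level before the crisis
CREDIT_SHOCK = 0.80       # banks cut lending by 20% at the outset
OUTPUT_SENSITIVITY = 0.3  # output lost per unit of credit withdrawn
CREDIT_SENSITIVITY = 0.5  # extra credit withdrawn per unit of output lost

credit = BASE * CREDIT_SHOCK  # initial rational retrenchment by the banks
output = BASE

for quarter in range(1, 7):
    # Businesses contract in proportion to the credit they have lost ...
    output = BASE - OUTPUT_SENSITIVITY * (BASE - credit)
    # ... and weaker output makes the banks pull back further.
    credit = BASE * CREDIT_SHOCK - CREDIT_SENSITIVITY * (BASE - output)
    print(f"Q{quarter}: credit={credit:5.1f}, output={output:5.1f}")

# Both indices ratchet down toward a lower equilibrium: the vicious
# cycle that policies stimulating capital investment aim to break.
```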

Just as the Bush administration failed to recognize that not all tax cuts stimulate growth, so the Obama administration has apparent difficulty understanding that not all spending plans stimulate capital investment. Mr. Obama recently defended his proposed program to upgrade the light fixtures in government buildings as beneficial simply because it “creates jobs,” overlooking the fact that even digging a ditch and filling it back in “creates jobs,” yet has no long-term benefit. A one-time construction project or other government contract will help a company’s balance sheet, but it is not a long-term guarantee, so the company will not make capital investments or permanently expand its payroll. Good spending programs will specifically finance capital investments: physical plant, engineering research, transportation infrastructure. Government assistance for these investments secures long-term benefit for businesses and circumvents the current lack of credit availability.

Another means of stimulating capital investment and job creation is to reduce their cost through tax cuts. A reduction in the long-term capital gains tax would make it worthwhile for small investors to buy equities, without rewarding speculators. More generous tax exemptions for capital assets would encourage capital investment, and a reduction in payroll taxes would make hiring more affordable while creating more long-term disposable income for the working class (a simple illustration follows). Other significant ways to reduce cost include reforming the economically dysfunctional healthcare system (a legacy of 1930s politics and the ill-conceived HMO Act of 1973) and tackling the cost of higher education, a principal source of large individual debt, perhaps by improving secondary education. It is unlikely that the current administration has the mettle to overcome the formidable entrenched interests defending the status quo on health care and education, but Mr. Obama has at least acknowledged that both parties will need to re-evaluate their traditional positions on entitlements.
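
A concrete, if partly hypothetical, illustration of the payroll-tax point: the employer’s cost of a hire is the wage plus the employer’s share of payroll tax, so cutting that share lowers the cost of every job on the books. The 7.65% employer FICA share is the actual combined Social Security and Medicare rate; the wage and the size of the cut are assumptions.

```python
# How a payroll-tax cut changes the employer's cost of a hire.
# The 7.65% employer FICA share is the real combined Social Security and
# Medicare rate; the wage and the size of the cut are assumptions.

wage = 40_000.00         # assumed annual salary
employer_rate = 0.0765   # employer share of payroll tax (FICA)
cut = 0.02               # hypothetical two-point reduction

cost_before = wage * (1 + employer_rate)
cost_after = wage * (1 + employer_rate - cut)

print(f"Cost per hire before the cut: ${cost_before:,.0f}")               # $43,060
print(f"Cost per hire after the cut:  ${cost_after:,.0f}")                # $42,260
print(f"Savings per employee:         ${cost_before - cost_after:,.0f}")  # $800

# Across a 50-person payroll that is $40,000 a year freed up, without
# the firm drawing on any new credit.
```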

Determining economic policy is a tricky business, since one can never be sure of a policy’s effectiveness; we will never know the counterfactual. Economists debate to this day whether the New Deal prolonged or shortened the Depression; there is no way to resolve the question empirically. A national economy is a highly complex, non-linear system, so we should expect it to be unpredictable and hesitate to equate correlation with causality. Further, it is driven not by mindless, random variables, but by thinking human beings, who may thwart our models. For example, a reduction in interest rates designed to boost investor confidence may have the opposite effect, since everyone knows it is intended to boost confidence and may therefore infer that things must be bad. The surest path to achieving the desired result is to pursue policies consistent with the rational economic interest of those to be benefited. If it is in someone’s rational interest to save, do not encourage them to spend. Instead, reward capital investment and other activities that directly improve productivity. Short of a command economy, the swiftest means of implementing an economic policy is to make it conform with what private entities would willingly choose for themselves given the opportunity.