The Chimera of Life on Mars

Last November, NASA scientists rehashed an old claim – no, not that the polar cap will disappear in four years – but that there is evidence of life on Mars in a meteorite discovered in 1984. Further clarification was offered over the weekend. Behind all the hullabaloo over how these “biomorphs” were found, one inconvenient fact is ignored: they are too small to be living things.

Back in the nineties, “nanobacteria” were theoretical self-replicating calcifying particles hypothesized to account for certain calcification processes. At the time the first “life on Mars” announcement was made in 1996, it was still believed that nanobacteria could be true lifeforms. The shape and chemical composition of formations in the ALH84001 meteorite were similar to putatively identified nanobacteria on Earth. However, a geological process was suggested to explain the formations, so the nanobacterial explanation of the meteorite subsided in public discourse.

It should be emphasized that “nanobacteria”, contrary to their name, are not bacteria, or even lifeforms at all. At 20-100 nm across, they are too small to contain nucleic acids (DNA or RNA) or conduct even the most basic bacterial biochemistry. The smallest known bacterium, by contrast, is well over 300 nm across, and it is physically impossible for a true lifeform (with nucleic acids, ribosomes, and proteins) to be smaller than about 200 nm. Known “nanobacteria” have no organelles of any sort; they are just globules of carbonate compounds. To call them lifeforms is a serious abuse of language.

The death knell for nanobacteria was sounded in a series of papers in 2007, 2008, and 2009 proving that putative “nanobacteria” were just carbonate nanoparticles. They may replicate, as do many crystals and polymers, but they have no nutritive or genetic functions, so they can hardly be considered life unless we lower the bar considerably.

Ironically, although the star of nanobacteria had by then faded, in 2009 NASA published evidence that the observed iron sulfide and iron oxide grains in the 1984 meteorite were not the result of a geological process, but were likely caused by nanobacterial activity. The NASA scientists, followed by the press, proclaimed this as evidence of life on Mars, even though by now it was understood that nanobacteria are not lifeforms in any standard sense of the term. Some reports even referred to nanobacteria as tiny bacteria, as if the only difference were one of size, when in fact the inaptly named “nanobacteria” are not bacteria at all. The lay reader could be excused for not knowing that this “life from Mars” is too small and structureless to have DNA or RNA or any internal metabolism.

What the new evidence really shows, at most, is that the grains in the Martian meteorite really were produced by nanobacteria and not by some other geological process, so the globule formations on the meteorite are likely to be fossil nanobacteria. This identification, however, has no bearing on the microbiological question of whether nanobacteria themselves are living things, a question that is increasingly being answered in the negative as the crystallization processes that form these nanoparticles become better understood.

It is certainly possible that so-called nanobacteria are a precursor to life, just as water, as far as we know, is a necessary condition for life. Yet a necessary condition is not a sufficient condition, so the presence of carbonate nanoparticles is a far cry from proof that Mars ever developed life, just as the presence of water is no proof that Mars was ever inhabited. I may need wood to build a house, but the presence of wood in a forest is no proof that there were ever houses there. Such is the vast gap between nanoparticles and the simplest bacterium. Abiogenesis, it should be noted, remains pure speculation, without any empirical verification or well-defined theoretical mechanism.

The real reason to tart up the evidence for life on Mars is to justify the boondoggle of our manned space program, which is still struggling to develop a vehicle to return to the Moon. There is practically no real science that can be done with manned missions that cannot be done with robots, except studies on the effects of microgravity on humans. The idea that there might be other life out there is what gives the whole concept of manned space exploration some credible purpose. Otherwise, we’re just exploring rocks and gas giants that could just as easily be observed by probes.

While one should never say never, the Fermi paradox is looking as strong as ever, and we may at some point be forced to admit that life is so rare in the universe that we are unlikely to encounter it in any form for thousands of years, if ever. Our efforts might be better spent, then, in tending to our own world, while others perhaps tend to theirs.

Update – 29 Apr 2010

In a case of “be careful what you wish for,” the Obama administration has effectively killed the U.S. manned space program, canceling the Constellation program without any alternative, while the shuttles will be grounded after this year. Responding to the backlash, the politically adroit president backpedaled, making a vague promise (not a concrete mission) of an eventual trip to Mars, and partially reactivating the Orion capsule program, to be launched unmanned and used as an escape capsule. These are purely face-saving measures. The President does not even have to decide on a heavy lift design until 2015, which means there will likely be at least a full decade during which the U.S. has no manned space flight capability, assuming the program isn’t postponed or killed along the way. In the meantime, we can expect an exodus of aerospace talent to Europe, Russia and China.

Taking a page from George W. Bush, Obama has sought cover from the private sector, promising that companies will be able to take astronauts into low Earth orbit for ISS missions. In fact, they are nowhere near such capability. There is a vast technological gap between launching manned and unmanned spacecraft, and between orbital and suborbital flights. Only one company, Scaled Composites, has successfully achieved suborbital manned space flight, and its SpaceShipOne “spacecraft” is really a rocket-powered aircraft that exceeds Mach 3. Reaching orbital speed would require more than 60 times the energy, not to mention the formidable engineering difficulties of space navigation, docking, life support, and re-entry. For every Scaled Composites, there are a dozen failed space ventures, yet this is supposed to be our most cost-effective means of returning to LEO.
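
As a rough sanity check of that energy figure (a back-of-the-envelope sketch, assuming a peak suborbital speed of about 1 km/s for Mach 3 and a low-Earth-orbit speed of about 7.8 km/s), kinetic energy scales with the square of velocity:

\[
\frac{E_{\text{orbital}}}{E_{\text{suborbital}}}
= \frac{\tfrac{1}{2} m v_{\text{orbital}}^{2}}{\tfrac{1}{2} m v_{\text{suborbital}}^{2}}
\approx \left(\frac{7.8\ \text{km/s}}{1.0\ \text{km/s}}\right)^{2}
\approx 61
\]

And this neglects the additional energy needed to climb to orbital altitude and fight atmospheric drag, so “more than 60 times” is, if anything, conservative.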

Make no mistake: Obama is not a fool. He simply is not terribly interested in the manned space program, as his actions prove. Nor can he expect sound advice from his space policy advisor Lori Garver, a career policy wonk with zero engineering experience and a fetish for “climate change” pseudoscience. It is true that much good science can be done more cost-effectively with unmanned probes, as I myself have argued above. But let us have no illusions that the scuttling of the manned space program is actually an intelligent plan to improve it. More than a few scientists, who would have derided Bush as an anti-intellectual giving handouts to companies had he done this, are falling over themselves to buy into Obama’s promises of next-generation lift technology to Mars, even as we surrender this generation’s lift capability to reach the Moon. Yet I have been exposed to the academic community long enough to know that liberal sympathies trump logic, notwithstanding their pretensions of scientific objectivity.

8 Ways to Reduce Health Care Costs

Currently, most discussion in the United States regarding health care reform is focused on the extent to which health care costs are covered by public or private insurance. However, it will make little difference whether health care is funded privately or publicly if the cost of health care is not held in check. Even with the current predominantly privatized system, the federal budget will be overwhelmed in a few decades by Medicare outlays, which are rising much faster than inflation due to increased health care costs. If the government cannot afford to pay for health care for seniors, it will hardly be able to provide it for everyone else. Our energies would be better directed, therefore, at reducing the cost of health care.

There are several ways that we can reduce health care costs almost immediately; others will take more time. These approaches are primarily directed at undoing the antiquated system of medical education and hospital regulation, much of which is grounded in the politics of the 1930s. Supposedly non-profit hospitals and universities are able to gouge the public for their services, yet receive federal subsidies. It would be considered abominable if a non-profit charity or church indulged in the extravagances of these institutions, many of which have billion-dollar endowments. Anyone who has seen an $8,000 bill for routine testing or a night’s hospital stay should know that the major culprits behind health care costs are the service providers, not the insurance companies.

Indeed, the oft-maligned HMOs are themselves the product of the last great effort at liberal health care reform, the HMO Act of 1973. Prior to the HMO Act, most health insurance covered only major illnesses or surgeries, just as auto insurance covers only accidents, not routine maintenance. Ordinary care could be paid out of pocket, or in the case of the extremely poor, not at all. This was not burdensome, since the cost of a routine doctor’s visit was modest. With the advent of mandatory employer health insurance, much more extensive coverage was required, driving up the cost of insurance premiums. Further, since the patient no longer paid for routine health care out of pocket (except a fixed co-payment), he did not care what exorbitant fee the hospital charged the insurance company. Since patients no longer had any incentive to keep costs down, and indeed were often unaware of the price, the cost of health care could rise, and the insurance companies would pass on this cost in their premiums.

Going further back, our medical education system relies on a bizarre arrangement of four years of medical school on top of a university degree, followed by three or four years of grossly underpaid internship or “residency”, after which the young doctor emerges with staggering debts, often well over $100,000. Given this burden of education and debt, it is no wonder that a large proportion of American doctors choose to become high-priced specialists, leaving a deficiency of general practitioners. Only the extremely intelligent can pass the rigors of U.S. medical schools, where emphasis is placed on the mathematical and analytical aspects of medicine, and less on preventive care, nutrition, and the humanitarian side of the profession. The typical medical school student is intellectually overqualified to be a GP; many highly competent doctors in other countries would not pass muster in U.S. schools, even though experience proves they are quite capable in their profession.

Nurses possess the education necessary to diagnose and treat most common ailments, and pharmacists have the education to prescribe medications, yet both are prohibited from exercising their craft by our arcane medical system, which requires an MD to be involved in every diagnosis and treatment. Given that this same system induces a shortage of GPs, this can only drive up costs. What is worse, these science-oriented, non-humanitarian doctors tend to think everything is to be solved with expensive testing and drugs, especially since ordering tests and drugs takes less of their time than getting to know the patient and his behavior. I once had an “old school” doctor who resigned out of disgust with the increasing pressure to become a pill dispenser.

The residency program, which is supposed to be a time of apprenticeship, is often just an opportunity for the hospitals to exploit cheap labor. Residents often work 100 hours a week, with little sleep, thereby impairing performance, and much of their time is spent on menial administrative tasks unrelated to patient treatment. With yet another three years of lost income, it is no wonder that they all wish to cash in on a high-paying position. To cover their increased debt, they will actually have to be paid more than if they had been paid justly in the first place.

In sum, I propose the following cost-cutting measures:

  1. Abolish Direct-to-Consumer (DTC) advertising for prescription drugs, as this was illegal before the 1990s, and consumers (not to mention doctors) are often poor judges of what medications they should take. Billions of dollars would be saved by this measure, since the cost of TV advertising would no longer need to be built into drug prices.
  2. Reform the residency program so interns have the right to demand competitive salaries and reasonable hours. Increasing their salaries should actually lower costs in the long run, as doctors will begin their careers less debt-laden.
  3. Allow RNs and pharmacists to diagnose and treat ailments within their competency. This will reduce costs, as they have lower salaries.
  4. Reduce the cost of medical school (through student grants) or allow admission to medical school directly from high school. Less stringent mathematical requirements should apply to GPs, as mathematical geniuses don’t necessarily make better family doctors. A certain amount of scientific hubris would have to be swallowed here.
  5. Set limits on tort claims related to medical malpractice. This will lower malpractice insurance premiums, greatly reducing the salary demands of doctors.
  6. Stress preventive medicine in medical education, including nutrition and exercise. Less emphasis on surgery, drugs, and high-tech testing as solutions to preventable diseases will greatly reduce costs.
  7. Require full advance disclosure of costs to the patient. Often the patient does not know or care what the cost is, allowing gouging by the hospitals. Ideally, insurance should not cover routine care, which would force caregivers to drive down their prices in order to be competitive among consumers. Emergency care should be price-regulated, since the patient often has no choice of caregiver in such a situation.
  8. The government could demand more stringent accounting from hospitals and universities to justify their non-profit status. If the institution is federally funded, salaries derived from ordinary revenue should be held to the federal executive pay limit. Alternatively, abolish their non-profit status, and dispense with the myth that hospitals and universities are not just businesses.

Mushroom Clouds and Moral Mediocrity

Leave it to a comedian to state plainly that Truman’s use of atomic weapons was a war crime, only to backpedal out of political expediency faster than you can say Arlen Specter. This short-lived moment of lucidity has occasioned a lively discussion on the topic, by no means confined to the political left. Although many paleoconservatives in Truman’s day and beyond expressed horror at the bombing of Hiroshima and Nagasaki, conservative opinion today generally favors nationalism over morality, and we are invited to examine the supposed moral complexity of the issue. In today’s political right, only a few social conservatives and libertarians are willing to state the obvious moral truth that mass incineration of civilians is not justified by military expediency.

It would be easy to argue that the callousness of mainstream conservatives toward the rights of foreign prisoners and noncombatants is a major cause of conservatism’s recent decline in popularity. It would also be wrong. Sadly, most Americans are willing to hedge basic moral principles for the sake of security. To the chagrin of the liberal news media, a majority of Americans polled favored the recent NSA phone surveillance program, even well after details of the program were leaked by the press. A strong majority of the public has supported keeping the Guantanamo prison open, and slim majorities have favored the use of waterboarding and other harsh interrogation tactics. As for pre-emptive wars of choice, public support for the Iraq war from 2003 to 2007 closely matched the perception that the war was going well. It would seem that we support illegal wars in the short term as long as we appear to be winning, though the perceived success of the 2007-08 surge did not restore the war’s popularity.

Unprincipled compromise of moral values can be found even among academics, who are supposedly more thoughtful. Although the vast majority of American historians are politically liberal, their vaunted concern for human rights diminishes when discussing Truman’s use of the atom bomb. Indeed, the man who wiped out two cities and later led the nation into the disastrous Korean War is cited by most historians as an example of an unpopular president who was later vindicated by history. This supposed vindication consists only in the approval of liberal historians, who are evidently as prone to place partisanship over principle as their conservative counterparts. We can only imagine what they would write if a Republican had dropped the bomb.

The general coarsening of morality, even among the educated and among those who claim to preserve traditional social values, is a worrisome development. Some paleoconservatives such as Pat Buchanan have concluded from this that the left has won the culture war, through its domination of academia and the entertainment media that shape public opinion. Those who would defend the classical virtues must find themselves in a constant struggle against societal tendencies, and they must risk ostracism and ridicule for merely holding what has been held by practically all the great moral philosophers in history. The tyranny of the majority of which Tocqueville warned is evinced in the perception that the rectitude of same-sex “marriage” can be determined by persuasion of the majority. The majority, as we know, is notoriously fickle. Fifteen years ago, even liberals shrank from same-sex “marriage”; now, the propaganda machine would like to portray any opponent of such unions as a Neanderthal.

All too often, shifts in opinion on moral matters (and associated historical, sociological and anthropological judgments) hinge upon nothing more than emotion and propaganda. A thing becomes right or wrong simply because the current majority says it is. Such a hermeneutic is utterly unworthy of an adult human being, yet democratic culture makes it seem natural. Few even among the paleocons will go so far as to identify democracy as the root of moral relativism. Most have held some form of the naive view that the majority would freely accept virtue if only it were presented to them clearly. In actual experience, the morality of the masses, when uncoerced, gravitates toward mediocrity. We can see this with the gradual shedding of social constraints and the coarsening of mores over the last forty years. This coarsening is expressed in dress, diction, and bearing, as well as more quantifiable sociological phenomena. On the Internet, the more popular sites invariably attract cruder and more degenerate discourse. While democracy romanticizes the virtue of the masses, reality teaches that we can hardly expect great virtue from a people fed a steady diet of mind-numbing television and Twittering.

The ancient Athenians recognized that democracy, or indeed any form of government, could work only if it was governed by laws, which they called nomoi. The nomoi were not the acts of a legislature, but basic moral precepts that defined the legal principles of society. Even the popular assembly did not presume to have direct authority to change the nomoi, though they sometimes appointed a committee of jurists to recommend additions to the laws. Even this limited power was too much in the eyes of Plato and Aristotle, who emphasized that nomoi, especially those that are unwritten, must bind even the people as a whole. For this reason, they posited the necessity of founding a polity with a lawgiver such as Solon, a man (or men) of eminent ability, whose superior wisdom would establish basic laws that are better standards than most would choose for themselves.

When basic moral principles are considered immutable, or at least not subject to popular sovereignty, the nomoi rule, and the people merely implement them. They are to be amended only after grave circumspection by the most competent men. The basic morality of society as a whole is shaped in large part by the excellence of the lawgiver. If, on the other hand, the people are given full sovereignty even over right and wrong, we will invariably gravitate toward social mores that reflect the moral mediocrity of the majority. Few would work if there were no need, and few would strive for excellence unless they were constrained to do so. We should expect a society just moral enough to keep the economy functioning, and indeed we increasingly expect our statesmen to be little more than business managers.

When a political or religious institution commits some crime, demagogues like to say that the institution has lost its moral authority or credibility. What, then, shall we say about the moral authority of the masses? Most of the great crimes of powerful institutions were committed with popular consent, and even when the people are sovereign, they seem unable to discern whether it is moral to exterminate hundreds of thousands of unarmed people, or to invade a nation without provocation. Given this dismal track record, I should not like to entrust any of our civilization’s most revered values to the whims of popular sovereignty. To quote Horace, Quid leges sine moribus vanae proficiunt? Modern political society seeks in vain its salvation through statutes and policies, as long as it pursues moral mediocrity. The notion that the people are sovereign even over morals has led to the enshrinement of our baser instincts as rights. Those on the political left wallow in the sins of eros while those on the right commit those of thanatos. If society exists for something more than the fulfillment of animal impulse, it ought to strive for something better than the natural human condition.