Cancer: A Disease of Civilization

The only thing surprising about the Villanova study indicating that “cancer-causing factors are limited to societies affected by modern industrialization” is that this is actually considered news. It has long been known that the incidence of cancer in pre-industrial societies is vanishingly small. To defend the current culture of emphasizing intervention over prevention, oncologists and others have argued that this is merely an artifact of people living longer. The oft-repeated excuse that ancient people rarely lived past 30 is profoundly ignorant and misleading.

When we say that ancient societies had life expectancies of 30 or so, this does not mean that people typically died around age 30. That figure is a life expectancy at birth, an average heavily skewed downward by high infant mortality. Those who reached adulthood could reasonably expect to live into their 40s or 50s, and many lived past 70. This is why, in ancient Greek and Roman texts such as the lives of the philosophers, living to eighty is treated as nothing remarkable, while a death in one’s thirties is considered tragic. “Life expectancy” was much shorter, but the lifespan of man was just as long; people’s bodies did not start breaking down at 45 merely because the average life expectancy was lower.

The same is true of the Native Americans, for whom we have a better record, since they lived in technologically primitive conditions until relatively recently. Nonetheless, they were noted for their health and longevity. Plenty of seventeenth-century documents attest to the long lives of the Indians in Mexico, for example, where many lived to well over eighty, including some reputed centenarians. The Inuit had no incidence of cancer whatsoever, despite living long enough to develop it (after all, we start screening for cancer at age 40, or 50 at the latest). Only in the twentieth century, when they adopted a modern diet and a sedentary lifestyle, did cancer become prevalent among the Inuit.

In the mid-twentieth century, it was thought that men having heart attacks at 50 was simply part of the aging process, but we now know that early heart disease is fully preventable, being a consequence of behavior: diet and lack of exercise. Drug companies and doctors who advocate aggressive interventions downplay prevention, since selling you the cure is much more lucrative. To this day, there is little emphasis on nutrition in medical school, and little serious research into the toxicity of processed food beyond simplistic dose-response studies. Our bodies are complex organisms, and there are likely many interaction effects among substances that, by themselves, appear harmless.

The Villanova researchers do not do themselves any favors when they make the excessive claim that “There is nothing in the natural environment that can cause cancer.” Still, the natural incidence of cancer among animals is astonishingly low, especially for a disease that is supposedly just a natural consequence of living long. Animals kept in captivity, protected from predation, can live out their full natural lifespan, so we should expect to see a significant incidence of cancer among them, yet we do not. Until the eighteenth century, cancer was comparably rare among humans. Indeed, when older documents mention a “cancer,” they are often referring to gangrene or some other degenerative condition. Cancer as we know it was extremely rare, though plenty of people lived to old age.

The real turning point appears to have come in the last few centuries. The hallmarks of societies with what we now consider “normal” cancer rates are the consumption of refined sugars and carbohydrates (following the discovery of the Americas), tobacco use, bearing a first child later in life, and near-universal exposure to toxic pesticides and other pollutants. The sheer abundance of toxins in our food makes it practically impossible to single out any one cause through a simple dose-response relationship, which is why countless harmful substances, each minimally harmful on its own, are allowed to remain on the market.

The embarrassing fact that many of the technological niceties forced upon us, ostensibly to improve our lives but in fact to reduce the cost of production and increase profit, are actually killing us makes a mockery of the notion of science and technology as a sort of earthly salvation. The technocratic establishment expects to be praised for saving us from diseases created through its own uncritical hubris (“Don’t be silly, this won’t hurt you, you Luddite”). For those hoping for salvation through medical science, I should remind you that researchers are flesh-and-blood human beings, many of them susceptible to vanity and self-aggrandizement. Such men understand perfectly well that an ounce of prevention is worth a pound of cure, or ten pounds of “treatment”. The medical establishment has judged that it is much more profitable to sell you the ten pounds, and it is hard to argue with their math: the cancer treatment industry is currently worth over $200 billion.

N.B. Nonetheless, there is some good research that addresses the root causes of cancer and how to prevent it. One especially worthy institution in this regard is the American Institute for Cancer Research.

Mushroom Clouds and Moral Mediocrity

Leave it to a comedian to state plainly that Truman’s use of atomic weapons was a war crime, only to backpedal out of political expediency faster than you can say Arlen Specter. This short-lived moment of lucidity has occasioned a lively discussion on the topic, by no means confined to the political left. Although many paleoconservatives in Truman’s day and beyond expressed horror at the bombing of Hiroshima and Nagasaki, conservative opinion today generally favors nationalism over morality, and we are invited to examine the supposed moral complexity of the issue. In today’s political right, only a few social conservatives and libertarians are willing to state the obvious moral truth that mass incineration of civilians is not justified by military expediency.

It would be easy to argue that the callousness of mainstream conservatism toward the rights of foreign prisoners and noncombatants is a major cause of its recent decline in popularity. It would also be wrong. Sadly, most Americans are willing to compromise basic moral principles for the sake of security. To the chagrin of the liberal news media, a majority of Americans polled favored the recent NSA phone surveillance program, even well after details of the program were leaked by the press. A strong majority of the public has supported keeping the Guantanamo prison open, and slim majorities have favored the use of waterboarding and other harsh interrogation tactics. As for pre-emptive wars of choice, public support for the Iraq war from 2003 to 2007 closely tracked the perception that the war was going well. It would seem that we support illegal wars in the short term as long as we appear to be winning, though the perceived success of the 2007-08 surge did not restore the war’s popularity.

Unprincipled compromise of moral values can be found even among academics, who are supposedly more thoughtful. Although the vast majority of American historians are politically liberal, their vaunted concern for human rights diminishes when the subject is Truman’s use of the atom bomb. Indeed, the man who wiped out two cities and later led the nation into the disastrous Korean War is cited by most historians as an example of an unpopular president later vindicated by history. This supposed vindication consists only in the approval of liberal historians, who are evidently as prone to place partisanship over principle as their conservative counterparts. We can only imagine what they would write if a Republican had dropped the bomb.

The general coarsening of morality, even among the educated and among those who claim to preserve traditional social values, is a worrisome development. Some paleoconservatives, such as Pat Buchanan, have concluded from this that the left has won the culture war through its domination of academia and the entertainment media that shape public opinion. Those who would defend the classical virtues find themselves in a constant struggle against societal tendencies, and they must risk ostracism and ridicule merely for holding what has been held by practically all the great moral philosophers in history. The tyranny of the majority of which Tocqueville warned is evinced in the perception that the rectitude of same-sex “marriage” can be settled by persuading a majority. The majority, as we know, is notoriously fickle. Fifteen years ago, even liberals shrank from same-sex “marriage”; now, the propaganda machine would like to portray any opponent of such unions as a Neanderthal.

All too often, shifts in opinion on moral matters (and the associated historical, sociological, and anthropological judgments) hinge upon nothing more than emotion and propaganda. A thing becomes right or wrong simply because the current majority says it is. Such a hermeneutic is utterly unworthy of an adult human being, yet democratic culture makes it seem natural. Few even among the paleocons will go so far as to identify democracy as the root of moral relativism. Most have held some form of the naive view that the majority would freely accept virtue if only it were presented to them clearly. In actual experience, the morality of the masses, when uncoerced, gravitates toward mediocrity. We can see this in the gradual shedding of social constraints and the coarsening of mores over the last forty years, a coarsening expressed in dress, diction, and bearing, as well as in more quantifiable sociological phenomena. On the Internet, the more popular sites invariably attract cruder and more degenerate discourse. While democracy romanticizes the virtue of the masses, reality teaches that we can hardly expect great virtue from a people fed a steady diet of mind-numbing television and Twittering.

The ancient Athenians recognized that democracy, or indeed any form of government, could work only if it were governed by laws, which they called nomoi. The nomoi were not the acts of a legislature, but basic moral precepts that defined the legal principles of society. Even the popular assembly did not presume to have direct authority to change the nomoi, though it sometimes appointed a committee of jurists to recommend additions to the laws. Even this limited power was too much in the eyes of Plato and Aristotle, who emphasized that nomoi, especially unwritten ones, must bind even the people as a whole. For this reason, they posited that a polity must be founded by a lawgiver such as Solon, a man (or men) of eminent ability, whose superior wisdom would establish basic laws setting a higher standard than most would choose for themselves.

When basic moral principles are considered immutable, or at least not subject to popular sovereignty, the nomoi rule, and the people merely implement them. They are to be amended only after grave circumspection by the most competent men. The basic morality of society as a whole is thus shaped in large part by the excellence of the lawgiver. If, on the other hand, the people are given full sovereignty even over right and wrong, we will invariably gravitate toward social mores that reflect the moral mediocrity of the majority. Few would work if there were no need, and few would strive for excellence unless constrained to do so. We should expect a society just moral enough to keep the economy functioning, and indeed we increasingly expect our statesmen to be little more than business managers.

When a political or religious institution commits some crime, demagogues like to say that the institution has lost its moral authority or credibility. What, then, shall we say about the moral authority of the masses? Most of the great crimes of powerful institutions were committed with popular consent, and even when the people are sovereign, they seem unable to discern whether it is moral to exterminate hundreds of thousands of unarmed people, or to invade a nation without provocation. Given this dismal track record, I should not like to entrust any of our civilization’s most revered values to the whims of popular sovereignty. To quote Horace: Quid leges sine moribus vanae proficiunt? (“Of what avail are empty laws without morals?”) Modern political society seeks in vain its salvation through statutes and policies, as long as it pursues moral mediocrity. The notion that the people are sovereign even over morals has led to the enshrinement of our baser instincts as rights. Those on the political left wallow in the sins of eros while those on the right commit those of thanatos. If society exists for something more than the fulfillment of animal impulse, it ought to strive for something better than the natural human condition.

Sharia and the Weberian State

The brouhaha raised in Britain over the Archbishop of Canterbury’s suggestion that sharia law be partially legitimized has exposed the statism at the core of secularism and the fragility of the truce that traditional cultures have made with one another through liberalism. Secularist peace comes at the price of ethnic and religious identity; man is freed from his church and neighbor only to be enslaved by the state, which claims unlimited jurisdiction over civil society.

Several forces combined to create the excessive reaction to Archbishop Rowan Williams’ comments. First, sharia itself is popularly demonized as brutal or barbaric, in the sense of endorsing draconian violence and rejecting liberal ideals of gender equality; its critics apparently forget that these ideals are less than a century old, and that we can hardly expect all rational people to share our arbitrary mores. This line of criticism also ignores the fact that Dr. Williams proposed applying sharia only in the way that Jewish courts are already used for financial and marital contracts: as civil arbitration agreed to by both parties, so there is no question of coercion. Further, as Dr. Williams points out, sharia is not a monolithic system of law, but a method of jurisprudence applicable only to Muslims who voluntarily submit to it, so it is not incompatible with pluralism.

Another point of criticism is the Archbishop’s astonishing statement that for citizens “to be under the rule of the uniform law of a sovereign state,” with all other commitments being private in character, “is a very unsatisfactory account of political reality in modern societies.” He attacks the heart of liberal political theory, which takes pride in equality before the law, without regard to persons. Before the so-called Enlightenment, political philosophy was more sophisticated on this point, recognizing that laws ought to be tailored to local circumstances and customs, and that there is no one-size-fits-all system of government or law. This political wisdom was retained in the federal system of the United States in the nineteenth century, though it was gradually undermined in favor of a European model of increasing centralization. The centralized state had expanded at the expense of local government and civil society long before the French Revolution; indeed, the revolutionaries simply inherited the marvelous administrative apparatus of the ancien régime. As liberal democracy spread, clerical and aristocratic privileges were abolished, and all political power was consolidated in the state. In the process, the state acquired powers historically foreign to it, including the regulation of marriage and private financial matters.

These historical developments have produced a profoundly statist European culture that views any attempt by businesses, churches, or other institutions to assert their legal independence as cause for alarm. In the case of churches, a panic over theocracy arises whenever a church refuses to submit to the generic morality of the state. Such fears are thoroughly irrational, as the state, being far more powerful than any church, is a far greater threat to liberty than any ecclesiastical bogeyman. The masses flee the supposed tyranny of traditional institutions to labor under the much heavier yoke of modern government, which levies higher taxes than any ancient tyrant ever dared and claims unlimited jurisdiction over all human affairs. This idolatrous concept of the state, circumscribed by neither natural nor divine law, was candidly described by Max Weber as the monopolization of the legitimate use of force, and of lawmaking. Today, the Weberian state so jealously guards its monopoly on violence that it pretends to have the authority to decide whether parents may spank their children. Where one stands on such an issue is of secondary importance; what matters is that the state claims the right to decide it, as it claims the right to decide all human affairs. The state alone can coerce; the state alone can demand obedience, while other institutions may only meekly request it of their members.

In the United States and much of the Americas, statist tendencies are checked by a robust cultural heritage of limited government, but in most of Western Europe, including Britain, statism is conventional wisdom. Indeed, the more stridently secularist parties tend to have the firmest conviction that the state ought to have plenary jurisdiction over civil society. The fatal error of statist liberalism is to mistake democracy for the basis of liberty, when the real basis of liberty is limited government. A state with absolute sovereignty is just as tyrannical whether it is monarchically or democratically constituted.

Rowan Williams has touched a sore nerve by pointing out that liberal democracy contradicts its promise of tolerance and multiculturalism by insisting on a uniform rule of law, without regard for what is reasonable in specific cultural contexts. In doing so, liberal governments deny many groups “the right to speak in their own voice”, as when they enact laws and issue rulings that admit no exception for religious conscience, defining their secularist views (often a minority opinion!) as the standard of what is reasonable, coherent, and acceptable. It would seem they do not despise tyranny so much as they prefer their own sort of tyranny.

Archbishop Williams did not dare suggest that the state’s sovereignty be circumscribed, but only proposed that people may voluntarily submit to other sources of authority. He is not ready to abandon the Weberian state, but even the concession that there should be any human authority besides the state is too much for many secularists to bear, even though Britain in fact already allows civil arbitration by religious courts. Hatred of religion is strong enough in some to make them forget their contempt for bureaucrats and strenuously endorse the monopoly of the state.

Just as the Romans subverted all local cultures and religions by absorbing foreign gods into the Roman pantheon and requiring only that Caesar be worshipped universally, so too does liberalism undermine the freedom of local culture, as witnessed in the devastation of French and Italian rural cultures and their replacement by an increasingly amorphous, bland consumerism. The great paradox of the Enlightenment is that supposed political liberty has led to cultural homogenization, masked by a bewildering diversity of consumer goods. This is because the only social mores that are enforceable are those of the state. Given the magnitude of modern states, an individual’s vote counts for practically nothing, and in fact he can do little to alter the bureaucratic system in which he is enmeshed. He is nominally a citizen, but practically a subject, and if he calls himself an atheist, he should be humbled to learn that he is no less a slave of Caesar than the most superstitious pagan.

In addition to the issues raised by Dr. Williams, sharia presents a special challenge to the West in that it does not recognize a neat distinction between religious and civil law (a medieval European development), much less a Weberian concept of the state and civil society. In Islam, society is an integrated whole. One need not share this holistic view of society to present a stumbling block to the liberal model; one needs only social principles at odds with those of the liberal state. The social peace of liberalism is a sort of devil’s peace, in which all one’s enemies are dead or enslaved, and all institutions fall silent before the overwhelming force of the state (or, more realistically, the state and its friends in the private sector). The Weberian state can make life materially pleasant, but woe to the one who conscientiously holds a distinct set of social principles and refuses to be relegated to the ghetto of “private” activity.