Cultural Amnesia and the Marriage Question

Four leftist judges in California have substituted their confused moral philosophy for judicial precedent and popular sovereignty in order to advance a nonsensical definition of marriage. This act reflects the ill-founded presumption that the Weberian state has the authority to re-define pre-existing institutions such as the family, from which the state’s own authority is derived. The absurdity of applying a twenty-first century leftist interpretation to a nineteenth-century constitution is lost among those who value the end over the means.

For the moment, let us not concern ourselves with the incoherence of liberal jurisprudence, nor with the usual arguments regarding same-sex attraction. The popular misconceptions surrounding these issues do not admit of a simple, pithy response, though at least one attempt has been made to summarize the relevant arguments from a Catholic and natural law perspective. Instead, I should like to turn attention to an extraordinary fact: most defenders of the California court’s decision seem wilfully unaware that their position was, until recently, considered extremist even among liberals, and no account has been given of the reason for this sea-change of opinion.

Only fifteen years ago, the idea of same-sex unions constituting a marriage in a sense fully equivalent to conventional marriage was not mainstream opinion, even among social liberals. Indeed, the anthropologist searches in vain for a precedent, finding at best some rites of friendship or clandestine practices. Even the numerous ancient societies that approved of homosexual acts did not pretend that this had anything to do with marriage, which was irrevocably tied up with the rearing of children. As recently as forty years ago, the weight of scientific opinion, even among atheists, was that homosexuality was a psychological disorder, and numerous case studies indicated that it could be cured. Of course, the classification of a behavior as a mental disease involves a normative moral judgment. As mores changed, so would the definition of health.

The de-classification of homosexuality as a mental disorder in the 1970s was likely a reflection of changes in attitudes toward sexuality rather than the result of any scientific breakthrough, since to this day our understanding of same-sex attraction remains rudimentary and speculative. The influence of social attitudes on scientific inquiry could be seen in the 1990s, when there were several premature claims to have found a genetic basis for the attraction, evincing a desire to find such a basis. Given the lack of scientific progress, and the further fact that most people are scientifically illiterate, we cannot invoke science as a primary reason for the recent sea-change in opinion regarding same-sex unions.

Changes in modern social mores are dictated by two main arbiters of propriety: the state and the cultural media. Church, family and ethnicity tend to conserve values rather than fabricate new ones, while the liberal state constantly creates new mores via legislation and judicial rulings, and the cultural media, especially through the verisimilitude of television and film, have the power to shape impressions about what is normal behavior. We have just seen the power of a compact judicial majority of four over three to dictate mores to a state of nearly 40 million people. Television, film, and journalism have also shaped mores, thanks to the effective nullification of decency standards in the 1990s, promoting a progressively vulgar and sexually hedonistic ethos, in which there is, understandably, nothing remarkable about same-sex attraction.

To this day, the majority of male homosexuals have little interest in monogamous marriage, yet they have pressed this issue strenuously in order to achieve their ideological goal of full equivalence with “traditional” marriage, or marriage as it has always been known. Already aided by leftist judges and the increasingly libertine media, some have sought to indoctrinate children via public schools. Some U.S. federal judges have actually ruled that parents do not have a right to be notified of such indoctrination, which includes explicit references to same-sex “marriage”. Unable to produce children in their sterile relationships, the more militant homosexuals and their defenders insist on the right to shape the mores of other people’s children.

This act of violence against the family is not peculiar to this issue, but reflects a broader presumption that the state has greater rights over a child than a family. The proper response to this assault on familial rights is to resist such forms of state control, with force if necessary, to make clear that the state is but a servant of the households that formed it. The state that loses sight of this fact deserves to be dismantled until it returns to its proper role in society.

Vast resources of the state and private media have been directed (though not always consistently) toward shaping public opinion in precisely the form it has taken, often using unscrupulous tactics to suppress contrary evidence. Yet this trick could never work upon people who have a cultural memory independent of what is served through mass discourse. For such people, it would not suffice to assent to some vague sense that this is the “modern” or “progressive” thing to do, without any notion of how we got here. Here we come upon the most stunning aspect of such social changes: the complete cultural amnesia that makes it possible to forget the state of affairs of only fifteen years ago.

Fifteen years ago, the West was socially liberal and highly literate, yet by no means advocated the current paradigm of same-sex unions that is now proffered as a standard of liberalism, rationality, and open-mindedness. To regard anyone who differs on this issue as a hidebound conservative evinces a breathtaking ignorance of the recent past, when one could be liberal, secular and rational, yet regard same-sex “marriage” as bunk. I recall the famous words of Cicero:

Nescire autem quid ante quam natus sis acciderit, id est semper esse puerum. Quid enim est aetas hominis, nisi ea memoria rerum veterum cum superiorum aetate contexitur?
– Cicero, Orator ad M. Brutum (XXXIV, 120)

Not to know what occurred before one was born, that is to remain always a child. For what is the lifetime of man, if it is not connected with the remembrance of the histories of previous generations?

I wonder what Cicero would have said of those who cannot even remember what happened in their own lifetime, but are so faddish that they must disavow memory even of their recent past. As Cicero observes, the importance of recalling the past is that our lives are irrevocably connected to and derivative of the deeds of our predecessors. In other words, we need to know the past in order to understand where the present came from. In this way, we could see through many of the rhetorical tricks and misdirections that would-be opinion-makers cast at us, for we could see that they are not grounded in any substantive rational or empirical development. Knowledge of the past keeps liars honest, for they must be forced to account for why what they say now differs from what they said yesterday. Who remembers that, in the early nineties, “gay rights” advocates would deny that they sought the right to marry, dismissing such claims as alarmist fear-mongering? Yet why should we be surprised if no one remembers this, or makes nothing of it, when we permit a president to re-invent his reasons for invading Iraq in a matter of months?

We cannot expect the mass media, or even liberal academia, to place much emphasis on the past, since that would divest them of their cultural authority. The manufacture of an ignorant yet educated populace has been the work of a hundred years, beginning with the elimination of the classics from grammar school, so that today most university graduates wouldn’t know Cicero from Adam. Yet, as time goes on, the horizon of our collective memory shrinks ever further, so that “modern” only means the last twenty years or so, often even less. We surrender this collective memory only at the expense of our sovereignty against the encroachments of the state and the shallow intelligentsia who do not wish us to learn how flimsy is their philosophy.

Another Blow for Cholesterol Drugs

Merck & Co. has taken a large hit with the rejection of its latest cholesterol drug, as the FDA finally takes a stand against such useless medications. Although the U.S. Food & Drug Administration, like the medical establishment, has by no means abandoned the untenable hypothesis that cholesterol causes heart disease, they at least recognize that a drug ought to be judged on its clinical outcomes rather than its ability to change numbers on lab tests.

It has long been known that many cholesterol drugs are of limited or negligible effectiveness. Statins are not effective at primary prevention and have a host of serious side effects, while fibrates are clinically worthless, having no impact on clinical outcomes at all. Yet both classes of drug successfully lower LDL, “bad cholesterol,” and fibrates also raise HDL, “good cholesterol”. This fact alone might cause rational people to see problems with the hypothesis that cholesterol causes heart disease. Instead, it is more likely that high LDL is an indicator of other factors, such as lack of exercise, that are also linked to heart disease risk. Unfortunately, the distinction between correlation and causation is often confused in medicine, and the misleading term “risk factor,” which refers only to statistical correlation, does much to confuse lay persons.

The lack of substantial positive clinical outcomes from cholesterol drugs ought to be weighed against the negative side effects, including weakening of the nervous system and Co-Q10 depletion by statins, and increased risk of liver disease from fibrates. The supposed benefit of these drugs is to alter cholesterol levels, which may or may not reflect an underlying unhealthy condition. High LDL is positively correlated with obesity, smoking, lack of exercise, excessive alcohol consumption and other unhealthy conditions. Yet it would be a mistake to consider LDL “bad cholesterol,” as it is essential to the formation of hormones. Indeed, cholesterol that is too low can also be dangerous, yet there is no recommended minimum LDL level, only a maximum.

Far more devastating to our health than the false equation between serum cholesterol and heart disease is the fictitious link between blood serum cholesterol and dietary cholesterol, particularly saturated fats. There has been no proven link between dietary cholesterol and blood cholesterol; in fact, most blood cholesterol is produced by the body. Further, hundreds of tribes throughout the world, known for a near-total absence of heart disease, eat food rich in fat, especially game meat. This hunter-gatherer diet should be proof positive against any simplistic correlation between dietary fat and heart disease. In many Western countries, animal meat is unhealthy for other reasons, such as the fact that cattle are not grass-fed, and thus have all the unhealthy traits of a high-carbohydrate diet.

This leads to the core problem of heart health in modern societies: the consumption of refined sugars and other carbohydrates. These, combined with hydrogenated oils and other processed oils, are far greater health dangers than the unfairly maligned red meats and saturated fats. Indeed, much of the nutritional advice proffered in the U.S. over the last fifty years has been the exact opposite of what would promote heart health, which is one reason age-adjusted incidence of heart disease has increased.

The phenomenon of cholesterol drugs is but one aspect of the greater problem of Western medicine, which regards the human body as a mechanical composition of chemicals, and tries to crudely tinker with certain quantities, often valuing these quantities more than clinical outcomes. This modality has the effect of emphasizing the use of drugs and surgical remedies rather than effective preventive measures. Ironically, such preventive measures would include rejecting some of the technology-laden aspects of food production that have ignored potential negative effects on the human organism.

How Not to Save the Planet

With an unseasonably cold spring comes a new wave of alarmism regarding global warming, of all things. The science of climate change is rife with political interest on both sides of the issue, so it is useful to cut through the veneer of “science” (which is often just the ideologically charged opinion of scientists) to the facts that are actually known. Doing so yields a picture quite different from either of the standard views on global warming.

First, we should observe that global climate and weather constitute a complex, non-linear system with far too many feedback parameters to solve analytically. All models of climate and weather involve probabilistic guesses and estimates of parameters based on past observations in similar conditions. This is why long-range weather forecasting is mathematically impossible, and even short-range forecasts are often wildly inaccurate. The very notion of a “global” climate can be a misleading construct, as it is just a mathematical synthesis, or average, of quite disparate regional climates. A polar climate may become warmer while the climate of a temperate zone becomes cooler. A net global increase or decrease in temperature may or may not affect items of interest such as sea level rise or the greenhouse effect. The geographic distribution of climate is every bit as important as the overall average, arguably more so.
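The point about averaging disparate regions admits a back-of-the-envelope illustration. The region fractions and temperature anomalies below are invented for the sake of the example: a small polar zone warms sharply while larger zones cool slightly, and the area-weighted “global” anomaly comes out near zero despite real regional change.

```python
# Invented numbers, for illustration only: a "global" temperature anomaly
# is just an area-weighted average of disparate regional anomalies, so
# strong regional changes can nearly cancel in the global figure.

# (region, fraction of Earth's surface, temperature anomaly in deg C)
regions = [
    ("polar",     0.10, +2.00),   # strong warming over a small area
    ("temperate", 0.50, -0.35),   # mild cooling over a much larger area
    ("tropical",  0.40, -0.06),   # nearly unchanged
]

global_anomaly = sum(frac * anom for _, frac, anom in regions)
print(f"global anomaly: {global_anomaly:+.3f} deg C")  # near zero
```

The same arithmetic cuts both ways: a modest global average can conceal a regionally severe change, which is why the distribution matters at least as much as the mean.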

Since climate and weather models depend on previous data in order to estimate parameters, the science of global climate is essentially limited to data from the last half century. Earlier data is not truly global, and anything older than 200 years is almost certainly confined to Eurasia and America. Our estimates of long-term historical trends are largely based on qualitative European accounts of rivers freezing, malaria outbreaks, and other indications of climate. On the geological scale, we can measure the carbon dioxide content of the ancient atmosphere, and though this is correlated to temperature, it is not the sole determinant of temperature.

Our knowledge of the historical and ancient past provides some apparently conflicting information, which often gets lost in the all-or-none approach to anthropogenic climate change. On the one hand, the carbon dioxide level of our atmosphere is higher than it has been for ages, possibly since the time of the dinosaurs. Yet, despite this fact, the current global temperature is not dramatically hotter than that of recorded history; indeed, Europe was probably warmer during the medieval period. Even in the past century and a half of accelerated industrialization, global temperature has crept up slowly, in fact decreasing in the mid-twentieth century, before increasing only a fraction of a degree Celsius per decade. Indeed, it is debatable whether global temperature has risen so much as a full degree this century, such is the measurement error involved in a global average.

The synthesis of these observations points to a strange conclusion. Although human industrial activity is certainly correlated to a dramatic rise in carbon dioxide levels, this has not sufficed to raise global temperature as much as we might expect. This suggests that, were it not for human activity, we might still be in the “Little Ice Age” that extended from the sixteenth to nineteenth centuries. Global warming, what little there may be, is actually helping to keep the temperature reasonable. This is not necessarily a reason to be blasé about future environmental change.

There remain serious concerns about ozone depletion and polar cap melting, which in theory could have catastrophic effects. Ozone depletion has already been addressed by a ban on chlorofluorocarbons such as Freon, which are chemically capable of depleting ozone, assuming they actually rise to the ozone layer in sufficient quantities. Polar cap melting is projected to cause a rise in ocean levels by tens of centimeters, not meters, over the next century. This could cause serious crises in some regions, but there will not be any global cataclysm.

Built into the notion of “saving the planet” from some fictional calamity, be it a rogue asteroid or a global tsunami, is the fantasy that man can be his own savior, along with the equally vain belief that man is capable of destroying the planet. Global industrial activities have negligible climatic effects compared to, say, Krakatoa-scale eruptions that can cover a third of the world in darkness and cold. Man is but a caretaker of this planet, not its savior, and if he really wishes to improve the environment, he should try a different approach than the policy-wonking regulation of carbon emissions. Such regulations have only yielded the farce of exchanging carbon credits, and the ridiculously energy-wasteful enterprise of corn-based biofuels, which has driven up the cost of agricultural products in the United States.

Instead of creating a new industry of planet-saving, people should recall their role as caretakers and conservators and learn to consume less, not because they will “save the planet,” but because they ought to make good use of natural goods rather than waste them on frivolities. Yet as long as consumerism and frivolity are considered virtues, environmental do-gooders might as well try to empty the ocean a bucket at a time. Our wastefulness may bring no danger of destroying the planet, but it does threaten our existence as serious human beings.