Leaks in Government Intelligence

The WikiLeaks scandal is exposing the internal contradictions of liberal democracy, which pretends to promote an open society while its heavily entrenched power structure relies on coercion and espionage. The hypocrisy of Western democracies is not a new thing; in fact, we can turn to the revolutionary movements of the eighteenth and nineteenth centuries to see how the principles of “liberty” and “equality” were often imposed by brute force or deceit.

The infamous “Reign of Terror” in revolutionary France was but the culmination of an increasingly aggressive secularism that had sought to turn the Church into an arm of the state with the Civil Constitution of the Clergy of 1790. Thousands of priests, aristocrats, and peasants who opposed the new order were killed or exiled. Similarly, in Mexico, the liberals, once securely in power, abolished the Catholic university and, eventually, all public manifestations of Catholicism. Back in Europe, democratic and republican revolutions were advanced through coups and assassinations, including the infamous throat-slitting of Pellegrino Rossi, interior minister of the Papal States.

The use of secret societies, Masonic or otherwise, was a staple of liberal and revolutionary movements of the nineteenth century, as is well documented. In England, whose Protestant culture is itself the product of a successful intrigue – the “Glorious Revolution” of 1688 – and in its cultural heir the United States, it has been common to dismiss such claims as paranoia, since both countries are invested in defending the legitimacy of the liberal power structure. In fact, the existence of a real Masonic conspiracy (as opposed to the various bugaboos of more recent times) was publicly exposed in France in 1904. In “l’Affaire des Fiches,” the anticlerical war minister General Louis André was found to have been deciding promotions and denunciations on the basis of a card file recording which officers were Catholic and which attended Mass. Both André and Prime Minister Émile Combes were Freemasons, and they had acquired their information through the spying of the Masonic Grand Orient of France.

Once victorious in destroying the cultural infrastructure of their rivals, democrats hypocritically proclaim “freedom of education”, a freedom under which all education must dogmatically accept the tenets of secular democracy. This is why, to a historically educated person, there is surprisingly little ideological diversity in modern academia. With dissenting academics purged or rendered impotent, a new form of coercion is free to impose itself. It is only through a historically inaccurate demonization of previous forms of government that our modern Western governments can pretend to any virtue.

After the outbreak of the First World War, the governments of the West gradually abandoned any notion of aristocratic honor, and learned to wage war and peace mercilessly. Apart from the mechanical destruction of much of Europe’s cultural patrimony in the two world wars, there arose intelligence agencies that took the ancient art of espionage to new levels, systematically intercepting communications even of allies, and attempting to corrupt foreign citizens into betraying their governments (i.e., act as “agents”). The U.S. and Britain were most advanced in this regard, especially in the aftermath of World War II.

In the U.S., an agency that systematically violates personal privacy would run into constitutional problems. These were circumvented by permitting the CIA to spy only on foreigners, and by collaborating with the FBI, which spies on U.S. citizens in the name of law enforcement. The NSA can spy on U.S. citizens indirectly by making use of data gathered by its British collaborators. Between the U.S. and the U.K., no one’s information is safe, as was shown by the infamous scandal of the two countries spying on UN Secretary-General Kofi Annan and other officials.

We know from bitter experience that the U.S. and its allies are ruthlessly amoral when it comes to espionage – absolutely nothing is exempt from the catchall excuse of “security”. The successful 9/11 attacks only allowed the U.S. to declare openly what it had already been doing in secret for decades: illegally spying on its own citizens, and detaining people in foreign prisons for years without trial. A majority of the docile public actually supports these policies. So much for their supposed love of liberty.

Wikileaks and its founder Julian Assange have laid bare the contradictions of our liberal establishment, which demands that we respect the authority of governments founded by murdering and destroying the previous social order throughout the nineteenth century. Liberal democracy, at its core, was founded on intrigue and brute force, so it is only fitting that it clings to power by using coercive tactics against its citizens. We should find it bitterly ironic, then, that the government of Sweden made use of its extraordinarily liberal laws regarding sexual consent as a pretext to seek Assange’s arrest shortly after his latest wave of embarrassing leaks. Liberal efforts to defend women, the poor, and the disabled are really designed to pit citizens in competition with one another, always asserting their rights against each other, rather than joining forces against their real enemy.

When the core of the liberal democratic establishment (and I use “liberal” in the classical sense, which includes today’s so-called “conservatives”) is challenged, it is remarkable how everyone suddenly marches in lockstep. Credit card companies refuse to process payments, hosting companies drop the site, and even the ostensibly neutral DNS provider cuts off the domain name ‘wikileaks.org’. It is chilling how all the powers in the world can line up against someone and try to suppress him, even on the supposedly free and open Internet.

Fortunately, countermeasures are available. Already, a host of mirror sites has been created, and there are even instructions online for creating one’s own mirror. Supporters of Wikileaks have initiated distributed denial of service (DDoS) attacks against some of the companies that tried to kill the original site. While such activity is illegal, it is surprisingly easy to accomplish, as are other forms of online hacking. A person need only set himself up behind a proxy in a country that does not forward tracking records to the U.S. or other Western countries, or simply log in through a neighboring wireless network. These tricks and other hacking tools can be found by searching on the scraper site Scroogle, which lets you search without being tracked by Google.

The weapons of the hacker can provide a potent counterbalance to the temerity of our governments, which pretend that they own you and have the right to spy on you or do whatever they want with you, but heaven forbid that you should find out what they are doing. These illegitimate usurpers, who use public authority for private aggrandizement, should be thankful that we do not do anything worse to them.

Cancer: A Disease of Civilization

The only thing surprising about the Villanova study indicating that “cancer-causing factors are limited to societies affected by modern industrialization” is that this is actually considered news. It has long been known that the incidence of cancer in pre-industrial societies is extremely low, nearly non-existent. In order to defend the current culture of emphasizing intervention over prevention, oncologists and others have argued that this is only an artifact of people living longer. The oft-repeated claim that ancient people rarely lived past 30 is profoundly ignorant and misleading.

When we say that ancient societies had life expectancies of 30 or so, this does not mean that people typically died around age 30. This is an average life expectancy at birth, a figure that is heavily skewed downward by a high infant mortality rate. Those who reached adulthood could reasonably expect to live into their 40s or 50s, and many lived to be over 70. This is why, when you read ancient Greek and Roman texts, such as the lives of the philosophers, there is nothing considered remarkable about living to eighty years old, and it is considered tragic if someone dies in their thirties. The “life expectancy” was much shorter, but the lifespan of man was just as long. People didn’t start having their bodies break down at age 45 just because the average life expectancy was shorter.
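The arithmetic behind this point is worth spelling out. The numbers below are hypothetical, chosen only to illustrate how a high infant mortality rate drags the average down even when surviving adults live long lives:

```python
# Hypothetical figures for illustration only: suppose 40% of those born
# die in infancy (average age at death: 1), while those who survive
# childhood die at an average age of 50.
infant_share = 0.40
infant_avg_age_at_death = 1.0
adult_avg_age_at_death = 50.0

# Life expectancy at birth is simply the weighted average over everyone born.
e0 = (infant_share * infant_avg_age_at_death
      + (1 - infant_share) * adult_avg_age_at_death)

print(round(e0, 1))  # roughly 30, even though surviving adults live to 50
```

A "life expectancy of 30" is thus entirely compatible with adults routinely reaching their 50s and beyond; the low figure reflects deaths in infancy, not early bodily decline.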

The same is true among Native Americans, for whom we have a better record, since they lived in technologically primitive conditions until relatively recently. Nonetheless, they were noted for their health and longevity. Plenty of seventeenth century documents attest to the long lives of the Indians in Mexico, for example, where there were many who lived to well over eighty, including some reputed centenarians. The Inuit had virtually no recorded incidence of cancer, despite living long enough to develop it (after all, we start screening for cancer at age 40, or 50 at the latest). It was only in the twentieth century, when they adopted a modern diet and a sedentary lifestyle, that cancer became prevalent among the Inuit.

In the mid-twentieth century, it was thought that men having heart attacks at 50 was simply part of the aging process, but we now know that early heart disease is largely preventable, being a consequence of behavior: diet and lack of exercise. Drug companies and doctors who advocate aggressive interventions downplay prevention, since selling you the cure is much more lucrative. To this day, there is little emphasis on nutrition in medical school, and little serious research into the toxicity of processed food beyond simplistic dose-response studies. Our bodies are complex organisms, and there are likely many interaction effects among substances that, by themselves, may appear harmless.
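The point about interactions can be made with a toy model. The risk function and all numbers below are purely hypothetical, constructed only to show why a one-substance-at-a-time dose-response study can miss a real hazard:

```python
# Purely hypothetical risk model: a baseline risk plus a term that
# appears only when BOTH substances are present together.
def risk(dose_a, dose_b, base=0.01, interaction=0.05):
    return base + interaction * dose_a * dose_b

# A single-substance dose-response study varies one dose at a time
# and sees nothing distinguishable from baseline:
print(round(risk(1.0, 0.0), 3))  # 0.01
print(round(risk(0.0, 1.0), 3))  # 0.01
# Yet combined exposure raises the risk sixfold:
print(round(risk(1.0, 1.0), 3))  # 0.06
```

Under such a model, each substance passes its safety study in isolation, yet the pair together is harmful; no number of separate dose-response curves would reveal it.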

The Villanova researchers do not do themselves any favors when they make the excessive claim that “There is nothing in the natural environment that can cause cancer.” Still, the natural incidence of cancer among animals is astonishingly low, especially for a disease that is supposedly just a natural consequence of living long. Animals kept in captivity, protected from predation, can live their full natural lifespan, so we should certainly expect to see a significant incidence of cancer among them, but we do not. Up until the eighteenth century, cancer was comparatively rare among humans. Indeed, when you read older documents mentioning a “cancer,” they are often referring to gangrene or some other degenerative growth. Cancer as we know it was extremely rare, though there were plenty of people who lived to old age.

The real turning point appears to have come in the last few centuries. The hallmarks of societies with what we now consider “normal” cancer rates are consumption of refined sugars and carbohydrates (following the discovery of the Americas), tobacco use, first childbirth later in life, and the universal use of toxic pesticides and other pollutants. The sheer abundance of toxins in our food makes it practically impossible to single out any one cause through a simple dose-response relationship, which is why countless substances, each only minimally harmful on its own, are allowed to remain on the market.

The embarrassing fact that many of the technological niceties forced upon us ostensibly to improve our lives (but in fact to reduce the cost of production and increase profit) are actually killing us makes a mockery of the notion of science and technology as a sort of earthly salvation. The technocratic establishment expects to be praised for saving us from diseases created through its own uncritical hubris (“Don’t be silly, this won’t hurt you, you Luddite”). For those hoping for salvation through medical science, I should remind you that researchers are flesh-and-blood human beings, many of whom are susceptible to vanity and self-aggrandizement. They understand perfectly well that an ounce of prevention is worth a pound of cure, or ten pounds of “treatment”. The medical establishment has judged it much more profitable to sell you the ten pounds, and it is hard to argue with their math: the cancer treatment industry is currently worth over $200 billion.

N.B. Nonetheless, there is some good research that addresses the root causes of cancer and how to prevent it. One especially worthy institution in this regard is the American Institute for Cancer Research.

Mushroom Clouds and Moral Mediocrity

Leave it to a comedian to state plainly that Truman’s use of atomic weapons was a war crime, only to backpedal out of political expediency faster than you can say Arlen Specter. This short-lived moment of lucidity has occasioned a lively discussion on the topic, by no means confined to the political left. Although many paleoconservatives in Truman’s day and beyond expressed horror at the bombing of Hiroshima and Nagasaki, conservative opinion today generally favors nationalism over morality, and we are invited to examine the supposed moral complexity of the issue. In today’s political right, only a few social conservatives and libertarians are willing to state the obvious moral truth that mass incineration of civilians is not justified by military expediency.

It would be easy to argue that the callousness of mainstream conservatives toward the rights of foreign prisoners and noncombatants is a major cause of its recent decline in popularity. It would also be wrong. Sadly, most Americans are willing to hedge basic moral principles for the sake of security. To the chagrin of the liberal news media, a majority of Americans polled favored the recent NSA phone surveillance program, even well after details of the program were leaked by the press. A strong majority of the public has supported keeping the Guantanamo prison open, and slim majorities have favored the use of waterboarding and other harsh interrogation tactics. As for pre-emptive wars of choice, public support of the Iraq war from 2003 to 2007 closely tracked the perception that the war was going well. It would seem that we support illegal wars in the short term as long as we appear to be winning, though the perceived success of the 2007-08 surge did not restore the war’s popularity.

Unprincipled compromise of moral values can be found even among academics, who are supposedly more thoughtful. Although the vast majority of American historians are politically liberal, their vaunted concern for human rights diminishes when discussing Truman’s use of the atom bomb. Indeed, the man who wiped out two cities and later led the nation into the disastrous Korean War is cited by most historians as an example of an unpopular president who was later vindicated by history. This supposed vindication consists only in the approval of liberal historians, who are evidently as prone to place partisanship over principle as their conservative counterparts. We can only imagine what they would write if a Republican had dropped the bomb.

The general coarsening of morality, even among the educated and among those who claim to preserve traditional social values, is a worrisome development. Some paleoconservatives such as Pat Buchanan have concluded from this that the left has won the culture war, through its domination of academia and the entertainment media that shape public opinion. Those who would defend the classical virtues find themselves in a constant struggle against societal tendencies, and they must risk ostracism and ridicule for merely holding what has been held by practically all the great moral philosophers in history. The tyranny of the majority of which Tocqueville warned is evinced in the perception that the rectitude of same-sex “marriage” can be determined by persuasion of the majority. The majority, as we know, is notoriously fickle. Fifteen years ago, even liberals shrank from same-sex “marriage”; now, the propaganda machine would like to portray any opponent of such unions as a Neanderthal.

All too often, shifts in opinion on moral matters (and associated historical, sociological and anthropological judgments) hinge upon nothing more than emotion and propaganda. A thing becomes right or wrong simply because the current majority says it is. Such a hermeneutic is utterly unworthy of an adult human being, yet democratic culture makes it seem natural. Few even among the paleocons will go so far as to identify democracy as the root of moral relativism. Most have held some form of the naive view that the majority would freely accept virtue if only it were presented to them clearly. In actual experience, the morality of the masses, when uncoerced, gravitates toward mediocrity. We can see this with the gradual shedding of social constraints and the coarsening of mores over the last forty years. This coarsening is expressed in dress, diction, and bearing, as well as more quantifiable sociological phenomena. On the Internet, the more popular sites invariably attract cruder and more degenerate discourse. While democracy romanticizes the virtue of the masses, reality teaches that we can hardly expect great virtue from a people fed a steady diet of mind-numbing television and Twittering.

The ancient Athenians recognized that democracy, or indeed any form of government, could work only if it was governed by laws, which they called nomoi. The nomoi were not the acts of a legislature, but basic moral precepts that defined the legal principles of society. Even the popular assembly did not presume to have direct authority to change the nomoi, though they sometimes appointed a committee of jurists to recommend additions to the laws. Even this limited power was too much in the eyes of Plato and Aristotle, who emphasized that nomoi, especially those that are unwritten, must bind even the people as a whole. For this reason, they posited the necessity of founding a polity with a lawgiver such as Solon, a man (or men) of eminent ability, whose superior wisdom would establish basic laws that are better standards than most would choose for themselves.

When basic moral principles are considered immutable, or at least not subject to popular sovereignty, the nomoi rule, and the people merely implement them. They are to be amended only after grave circumspection by the most competent men. The basic morality of society as a whole is shaped in large part by the excellence of the lawgiver. If, on the other hand, the people are given full sovereignty even over right and wrong, we will invariably gravitate toward social mores that reflect the moral mediocrity of the majority. Few would work if there were no need, and few would strive for excellence unless constrained to do so. We should expect a society just moral enough to keep the economy functioning, and indeed we increasingly expect our statesmen to be little more than business managers.

When a political or religious institution commits some crime, demagogues like to say that the institution has lost its moral authority or credibility. What, then, shall we say about the moral authority of the masses? Most of the great crimes of powerful institutions were committed with popular consent, and even when the people are sovereign, they seem unable to discern whether it is moral to exterminate hundreds of thousands of unarmed people, or to invade a nation without provocation. Given this dismal track record, I should not like to entrust any of our civilization’s most revered values to the whims of popular sovereignty. To quote Horace: Quid leges sine moribus vanae proficiunt? (“What do empty laws avail without morals?”) Modern political society seeks in vain its salvation through statutes and policies, as long as it pursues moral mediocrity. The notion that the people are sovereign even over morals has led to the enshrinement of our baser instincts as rights. Those on the political left wallow in the sins of eros while those on the right commit those of thanatos. If society exists for something more than the fulfillment of animal impulse, it ought to strive for something better than the natural human condition.