One probably shouldn't look to the New York Times for analysis of the ongoing Death of the West; however, Russell Shorto's latest article in the Magazine, "No Babies?," is worth considering, if only because it's one of the more interesting – and interestingly wrong – Left-liberal responses to the European birth dearth.

Quoting Hans-Peter Kohler of the University of Pennsylvania, Shorto opines,

"high fertility was associated with high female labor-force participation . . . and the lowest fertility levels in Europe since the mid-1990s are often found in countries with the lowest female labor-force participation." In other words, working mothers are having more babies than stay-at-home moms.

Yes, if we just get enough women into the work force, we'll solve this baby-shortage thing in a jiffy! Shorto even echoes British "conservative" MP David Willetts in chanting "Feminism is the new natalism" and in calling for lots of new state initiatives to help manage women's corporate lives. Christendom is dying, and it's only the post-feminist career gals who can save it!

None of this really matters, of course. For even if Shorto's thesis were true, every single European country is well below replacement-level fertility, even those bastions of women's liberation. Beyond this, Shorto's conceit is a classic example of the ecological fallacy, that is, reasoning in which statistics about aggregates are mistaken for facts about the individuals who compose them.

Shorto makes statements like, "[T]he societies most wedded to maintaining that traditional family structure seem to be those with the lowest birthrates." And "Societies that support working couples have higher birthrates than those in which mothers are housewives."

"Societies" don't have children; women do. And the women pursuing careers and the women having babies ain't necessarily the same people.
 
In this regard, the numbers are readily available online and easy to crunch. Take, for instance, the demographics of the Netherlands, a country Shorto praises for its bounteous natalist welfare state, which offers women direct tax credits for each child and guarantees generous maternal (and soon paternal) leave from work.

Shorto boasts that this system results in a Total Fertility Rate of 1.78 children per woman, well below replacement level (2.1) but still fairly high for the continent. But the national TFR is a rather bird's-eye view of things. What's more striking is that non-European immigrants to the Netherlands comprise 12.4% of the population but account for 16.4% of the births, Moroccans being the largest immigrant group and the most fecund, with a fertility rate of 2.87. The Dutch population is shrinking as a whole, but new groups within it are growing larger and larger.

Without question, many of the fertile Dutch immigrants are taking advantage of the kinderbijslag supplement for each new child. But these are also the people doing the odd jobs, running the kebab stands, or idling on welfare checks – definitely not the beneficiaries of all the new "feminist" measures in the corporate workplace.

Shorto's talk of "society" masks a division of labor emerging across the continent in which the white people pursue careers and maximize self-fulfillment while the immigrants have the babies. This arrangement appears to Shorto as a "win-win" – feminism and children, too – only because he refrains from looking at the phenomenon too closely.

The Baron Münchhausen was allegedly able to pull himself up out of a swamp by his own hair. I'm afraid Shorto will be less successful in arguing that sponsoring feminism in the workplace is the only way to save the European family.

How did it come to pass that the "conservative" position on foreign policy involves proclaiming the virtue of revolutionary upheaval around the world, worrying that the survival of freedom at home depends on the active spread of American-style democracy abroad, and arguing that the standard for determining whether a country is friendly to the United States is not what it does to affect U.S. interests but the extent to which its domestic political institutions conform to Washington's preferences?

The answer, I have been told, has a lot to do with “Reaganism” and the flowering of the foreign policy vision of our 40th president.

Just as strange, it is now commonplace for many of the setbacks of U.S. foreign policy of the last seven years – including what is transpiring in the Middle East and the rising tensions with Russia and China – to be explained away as the Bush administration's incompetent application of Ronald Reagan's vision of spreading freedom abroad. Indeed, one of the underlying themes of Sen. John McCain's bid for the presidency is that he will be much more capable and effective in implementing the so-called "freedom agenda" and the efforts to secure a "benevolent global hegemony" – and that this makes him Reagan's heir.

Of course, there is a problem. Reagan did not employ terms like "regime change" and "creative destruction" in his rhetoric. He did not celebrate the onset of anarchy in another country as the birth-pangs of freedom. In his relations with a variety of authoritarian powers, he took seriously Thomas Jefferson's dictum in the first part of the Declaration of Independence "that governments long established should not be changed for light and transient causes." (The corollary to this was that bad, even tyrannical, government can sometimes be a lesser evil than what results from upheaval.)

Yes, he firmly believed that America could serve as guide and example to the rest of the world, but non-democratic states, especially those ruled by traditional forms of governance, were not treated as ipso facto enemies of the American republic if such countries were not actively hostile to U.S. interests. 

Perhaps this is why, in their famous Foreign Affairs article from 1996, Robert Kagan and William Kristol called their proposed foreign policy "Neo-Reaganite," for many of the authors' sentiments were neither expressed nor endorsed by the 40th president. And when Senator McCain spoke at the Reagan Library two years ago, the quote he selected from the former president did not speak to the aggressive promotion of the American form of governance, but was decidedly more prudent: "Let us go to our strength. Let us offer hope. Let us tell the world that a new age is not only possible but probable."

Neoconservatives like to play up – and in my opinion exaggerate – the differences between Reagan and his Republican predecessors, especially Dwight D. Eisenhower and Richard M. Nixon. Jonah Goldberg's shorthand comment, made in the latest issue of National Review and many times before, that "Reagan was put on this earth to kick a** and chew gum (and he ran out of gum a long time ago)" plays to the stereotype of the 40th president as a belligerent activist, in contrast to the more prudential, cautious, and pragmatic presidents who came before.

I find that a bit odd. It is true that, in comparison with Nixon, Reagan was more inclined to actively encourage the spread of American values. Yet both men were united in their view that, as conservatives, the morality of results trumped any morality of intentions – and that the maintenance of the United States as a great power took priority. Indeed, their views are nearly identical.

After taking office, Nixon declared that the "power of the United States must be used more effectively, at home and abroad, or we go down the drain as a great power." Upon leaving office, Ronald Reagan told students at the University of Virginia, "American power must be exercised morally, of course, but it must also be exercised, and exercised effectively." I don't know if that is a judgment President Reagan would bestow on the current administration.

Of course, this impression of Reagan as belligerent crusader is easier to maintain if one defines Reagan by selected soundbites – "evil empire," "tear down this wall," and so on. One might conclude that President Reagan would have been an avid supporter of the "freedom crusade."

It is very true that Reagan belonged to a wing of the American conservative movement that sought to go beyond America as a "city on a hill" – a passive example to others of a society grounded in ordered liberty – in favor of more active encouragement.

But Reagan never let his rhetoric interfere with his understanding of reality and of the limitations faced by the United States. In his biography of Reagan, Lou Cannon notes: "[Reagan] had placed a high premium on success throughout his various careers, and he often complained that some of his erstwhile conservative supporters wanted to 'go off the cliff with all flags flying.' That was rarely Reagan's way." It is difficult to escape the conclusion that Reagan, were he alive today, would be very uncomfortable with some of the foreign policy propositions being advanced by some associated with the current presidential campaign of John McCain, a self-described "foot soldier of the Reagan Revolution."

Let's look closer at Reagan's famous Westminster Address of June 1982, which has often been cited as the foundational document for the current calls to promote democracy. What is striking is its cautious, prudential tone, as well as its humility about the task at hand:

No, democracy is not a fragile flower. Still it needs cultivating. If the rest of this century is to witness the gradual growth of freedom and democratic ideals, we must take actions to assist the campaign for democracy.

… We ask only for a process, a direction, a basic code of decency, not for an instant transformation.

… While we must be cautious about forcing the pace of change, we must not hesitate to declare our ultimate objectives and to take concrete actions to move toward them.

… The objective I propose is quite simple to state: to foster the infrastructure of democracy, the system of a free press, unions, political parties, universities, which allows a people to choose their own way to develop their own culture, to reconcile their own differences through peaceful means.

Yes, Reagan did use the term "crusade for freedom" in his remarks. But the tone and scope are anything but crusading. It is evolutionary in nature – and reflects what the president told then-Chinese premier Zhao Ziyang two years later: "If you ask our advice, we can only answer with truth as we see it." It commits the United States not to democracy promotion but to democracy encouragement – and places a premium on stable development.

After re-reading the Westminster Address, somehow, I can't see Reagan being all that enthusiastic about the approach taken in Iraq. How does what has transpired there fit in with what Reagan told Soviet students in Moscow in 1988? "[P]ositive change must be rooted in traditional values – in the land, in culture, in family and community … Such change will lead to new understandings, new opportunities, to a broader future in which the tradition is not supplanted but finds its full flowering." Straight out of Edmund Burke!

Or take Reagan's famous remarks in Berlin. He did not, after calling on Gorbachev to "tear down this wall," say, "or I'll do it myself!" He issued no blood-curdling statements about how the United States would "bury" the Soviet Union. He concluded his address at the Brandenburg Gate in June 1987 with a call for cooperative action: "We in the West stand ready to cooperate with the East to promote true openness, to break down barriers that separate people, to create a safer, freer world."

Reagan, indeed, was quite a realist in the American tradition – believing that formerly hostile powers might, given the right incentives, develop a stake in working with the United States. This was the message he delivered to his former opponents, the "Red Chinese," when he visited the People's Republic in 1984. In his toast in the Great Hall of the People, Reagan, the long-time anti-communist, was nonetheless prepared to declare, "I see America and our Pacific neighbors going forward in a mighty enterprise to build strong economies and a safer world. … We can work together as equals in a spirit of mutual respect and mutual benefit."

I know that some today might dispute this interpretation of Reagan. But 20 years ago this was the mainstream approach, endorsed by nearly anyone who considered himself a conservative. No less a figure than Irving Kristol, sometimes described as the "godfather of neoconservatism," endorsed this perspective in a 1985 National Interest essay, writing:

The United States can coexist peacefully enough with non-capitalist, non-democratic nations, so long as these nations are willing to coexist with the United States. Such easy acceptance of coexistence is made possible by the firm assumption that, in time, these nations will discover for themselves the superiority of the American way of life. … The task of American foreign policy is … not so that the world can be made 'safe for democracy' but so that the nations of the world can have the opportunity to realize whatever potential for popular government and economic prosperity they may possess or come to possess.

Three years later, Owen Harries' provocative article, "Exporting Democracy – and Getting It Wrong," acknowledged a long-standing American impulse to encourage the spread of freedom and democracy around the world, but declared that conservatives had to balance this desire against "other interests that the United States must necessarily pursue, more mundane ones like security, order and prosperity. For these represent not merely legitimate competing claims but the preconditions for a lasting extension of democracy." He concluded, "the attempt to force history in the direction of democracy by an exercise of will is likely to produce more unintended than intended consequences."

Even neoconservatives who argued that the United States should make the promotion of democracy one of the central organizing principles of its foreign policy recognized that "you do not blindly threaten or weaken regimes where there exists no democratic alternative" and contrasted a conservative, prudential approach with the "touching and grandiose belief" of liberals "in the power of the United States to redeem the politics of benighted lands." (The words of Charles Krauthammer in the Washington Post, July 10, 1987.)

If Reagan is held up as the "gold standard" for defining a conservative approach to foreign policy, then one cannot escape his emphasis on prudent action. While he firmly believed in the universality of American ideals, his departure point for policy was to create a world where countries would be able to make their own choices free from outside pressure. He made the central organizing principle of his foreign policy the prosperity and stability of the United States and its allies. Non-democratic and non-capitalist states that did not threaten the U.S. were not his enemies, and he could make common cause with them to resist the efforts of the Soviet Union to spread its ideology around the globe.

This pragmatic, prudential approach is what enabled the president to reinvigorate the Atlantic alliance and form coalitions in Asia, the Middle East and Latin America that helped to contain the USSR and bring about the end of the Cold War.

So now the question is: what will it take for this prudential, Burkean approach to foreign affairs to once again become the mainstream view among U.S. conservatives? Reclaiming Reagan might be the first step.

Nikolas K. Gvosdev is editor of The National Interest. The views expressed herein do not necessarily reflect those of The National Interest.

During the Cold War, conservatives rightly pointed out that the collectivist materialism of the Soviet Union was anti-human in the worst ways.  It elevated the state to mythic proportions.  It denied the value of individual human beings.  It suppressed the human spirit and focused on minimal material comfort to the exclusion of other values.  The state could undo social injustices, we were told, but conservatives reminded us that life would always involve certain unavoidable inconveniences and inequalities.  No law could completely eliminate evil, and the attempt to do so would lead to other evils – the constant fellow travelers of the leftist program.

Every state that has sought heaven-on-earth has imposed crushing burdens on qualities such as initiative, enterprise, idiosyncrasy, self-reliance, law-abidingness, trust, and regard for one's own.  During the post-war period, conservatives made common cause with libertarian critics of "The State."  Individualism was the watchword of the day.  But the emphasis on individualism was always a bit out of tune with the conservative ethos.  As other disorders have worked their way through society since the 1950s, including a nihilistic disregard for family and social obligations in general, conservatives have expressed their concerns about the breakdown of civil society and community, trends rooted in an "atomistic" individualism.

Conservative political philosophy is concerned above all with balance.  Excessive individualism and excessive collectivism both exhibit genuine evils in political life.  We are skeptical of change not least because the happy balance of traditional Anglo-American liberties avoided the evils of both.  It has been difficult to preserve these liberties under the American Constitution, and even harder for others to replicate them.  The uniquely American balance of our historical liberties is expressed perfectly in the Second Amendment:

A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.

The Second Amendment has always flummoxed modern observers.  For starters, it has a preamble.  In the law, there is always an issue of interpretation – whether in contract law, property deeds, or statutes – about whether a preamble limits the meaning of the words to follow.  Is it surplusage, an exhortation, or a restriction on the specification that follows?  In this instance, it is what it appears to be:  an expression of purpose.  The right remains one of "the people," but that right is in the service of a broader objective:  "the security of a free State."  The Founders rightly worried that the federal government's power to "provide for organizing, arming, and disciplining, the militia, and for governing such part of them as may be employed in the service of the United States" would be abused to create a federal "select militia" that weakened the states' right to organize militias and individuals' right to bear arms.

The Second Amendment is also confusing today because of the degradation of the militia over the last 100 years.  A true militia may be thought of as a cooperative arrangement between the people and the state.  Like the jury system, it injects the sensibility of ordinary people into the state's exertion of power.  The formalized National Guard appeared in 1903, taking over the role of the formerly more numerous and less uniform state militias.  The routine use of the posse comitatus has also gone by the wayside in the age of professional policing, though it still persists in various locales.

The right to bear arms at the time of the founding, while an individual right, was not conceived completely individualistically.  In this sense, Scalia's recent opinion in Heller, with its focus on self-defense, unfairly downplays the "classical republicanism" of the Founders.  The right to keep and bear arms undoubtedly allows arms as a means of self-defense against ordinary criminals, as well as the predators of nature.  But the Heller decision's dicta – including its gratuitous dig at the M-16 – pave the way for eliminating weapons chiefly useful for a broader and more political concept of self-defense:  resistance to military enemies of the Constitution, whether foreign or domestic, through the actions of the citizen-militia.

The Founders knew that a community was a fragile thing.  It can be harmed by moral disorder within, by foreign conquest, and, most insidiously, by the evil of "faction."  A purely individualistic focus on the right to bear arms – typical in the rhetoric of libertarians and of the Founding era's Francophile left wing – does not take into account that the Founding generation, soon after enacting the Second Amendment, imposed certain duties that relate to this right.  The federal Militia Act of 1792 provided as follows:

That every citizen so enrolled and notified, shall, within six months thereafter, provide himself with a good musket or firelock, a sufficient bayonet and belt, two spare flints, and a knapsack, a pouch with a box therein to contain not less than twenty-four cartridges, suited to the bore of his musket or firelock, each cartridge to contain a proper quantity of powder and ball: or with a good rifle, knapsack, shot-pouch and powder-horn, twenty balls suited to the bore of his rifle, and a quarter of a pound of powder; and shall appear, so armed, accoutered and provided, when called out to exercise, or into service, except, that when called out on company days to exercise only, he may appear without a knapsack.

Would that the United States mandated such training today!  Gun control would have an entirely different meaning, involving shot groups and tactical reloads.  The founding era's rhetoric was more than a recitation of rights.  Even among the more liberal elements, a right was rarely disembodied from some sense of community obligation.  The right to bear arms existed alongside a duty to bear arms.

While the counterbalancing action of the different branches of government figures prominently in the Federalist Papers, the authors of that hoary work emphasized the need for a virtuous citizenry to preserve republican government.  They knew that the political liberty of all depended upon the widespread inculcation of individual virtues – such as self-reliance – but also political virtues, such as watchfulness over the state and the willingness to forgo private advantage when the common good was at stake.  After all, the term republic comes from the Latin res publica, literally "the public thing," better translated as the common good.  The limitations on majority control contained in the Constitution could work at most to stop a temporary majority in the grip of some passion or mania.  The Constitution could not, in Rube Goldberg fashion, forever channel any sort of collection of people, however devoid of virtue and public-spiritedness, away from the natural results of their collective character.  The Founders knew that character and liberty were mutually reinforcing, and both necessary for republican government to serve the individual and common good.

As Patrick Henry put the matter:

Are we at last brought to such an humiliating and debasing degradation that we cannot be trusted with arms for our own defense? Where is the difference between having our arms under our own possession and under our own direction, and having them under the management of Congress? If our defense be the real object of having those arms, in whose hands can they be trusted with more propriety, or equal safety to us, as in our own hands?

A robust militia serves to improve the virtue of the people and ties their fortunes to those of the state.  Military drill instills physical courage and discipline, while also giving the people the skills necessary to resist any threats to their liberties.  It is worth remembering that the Founders were not only concerned with preventing tyranny; another important event preceded the Constitutional Convention of 1787.  That event was Shays' Rebellion, a lawless veterans' movement that threatened the fragile order that prevailed under the Articles of Confederation.  In other words, the Second Amendment in particular evinces the U.S. Constitution's dual aims:  liberty and order.  The liberties the Constitution recognizes are historical in nature, and certain seeming inconsistencies – in truth, necessary limitations – flow from their historical contours, which are by necessity more circumscribed than the abstract liberty one might imagine from a purely theoretical point of view.

The Founders' Solomon-like solution to the problem of creating a government energetic enough to discharge its duties, but not so powerful as to oppress the people, finds expression most emphatically in the concept of the militia.  The militia is simply ordinary male citizens assembled to perform some necessary government task such as preventing a riot, responding to a foreign invader, pursuing a fugitive, or, if need be, breaking off from de jure control and responding to some emergency from within the apparatus of the government itself.  For those who find this institution an anachronism in the age of nuclear weapons, consider the relative inability of modern militaries to suppress insurrections armed only with small arms in such varied locales as Iraq, Vietnam, Algeria, and New Orleans.  How much happier would the events in New Orleans have been if some reasonable percentage of the citizenry were routinely accustomed to assisting law enforcement and the National Guard in preserving order and responding to disasters?

Like a strong military in foreign relations, a well-organized militia has a deterrent effect on would-be tyrants both at home and abroad.  While some standing military is necessary today, how much less of a threat such a military would pose to our liberties if it were counterbalanced by tens of millions of American men armed, trained, and organized at the county and state level, enforcing laws that they have chosen to live under as a free, self-governing people.

As it stands, the American people are disorganized and increasingly servile.  Partly because of the proliferation of meddlesome laws, their relationship to law enforcement and the military is typically one of indifference or hostility.  The increasing professionalization of law enforcement and military functions has reinforced this gap between the State and the People.  A robust militia working hand-in-hand with full-time government officials would do much to restore civic pride, reduce tension between the government and the community, and deter the worst government excesses.

The common extreme-individualist notion of gun rights is problematic.  Without some sense of common destiny and moral courage, an armed but selfish population would be of little use against either foreign or domestic threats.  Why?  Because it would always be in one's individual interest to let some other guy do the fighting.  To paraphrase General Patton, without teamwork you can't fight your way out of a "piss-soaked paper bag."  This criticism of the disorganized militia was commonly leveled during the War for American Independence.  Consider the account of George Washington in a letter to the Continental Congress dated September 1776:

To place any dependence upon Militia, is, assuredly, resting upon a broken staff.  Men just dragged from the tender Scenes of domestick life; unaccustomed to the din of Arms; totally unacquainted with every kind of Military skill, which being followed by a want of confidence in themselves, when opposed to Troops regularly train’d, disciplined, and appointed, superior in knowledge, and superior in Arms, makes them timid, and ready to fly from their own shadows.

Does this line of criticism mean we should not have a militia?  Hardly.  But it does mean that a disorganized militia is not nearly so useful in securing a free state as an organized and well-armed one.  Without some subordination to law and public purpose, armed Americans acting as lone wolves or in other disorganized groupings would more likely become a rabble like the Quantrill gang.  Without some concern beyond the self and some coordination with self-governing, local political life, the right to keep and bear arms is nearly useless as a bulwark of liberty.

Constitutional and republican government aims to preserve liberty and government without extinguishing either.  As Burke put the matter:

To make a government requires no great prudence. Settle the seat of power, teach obedience, and the work is done. To give freedom is still more easy. It is not necessary to guide; it only requires to let go the rein. But to form a free government, that is, to temper together these opposite elements of liberty and restraint in one consistent work, requires much thought, deep reflection, a sagacious, powerful, and combining mind.

The historical right to keep and bear arms is the product of such minds.  But conservatives should consider the Founders’ solution in all of its detail.  They preserved an uncompromising individual right to keep and bear arms.  But that right existed in a larger tableau of duties and institutions that balanced the individual good with the need for cooperation in social life.  In an age of out-of-control crime, rampant illegal immigration, natural disaster, and threats of terrorism and urban disorder, a revitalized militia movement to assist local law enforcement and the National Guard, something like a well-armed variation on the Cold War Civil Defense programs, would be a worthy conservative endeavor that would secure a great number of the benefits of our historical right to keep and bear arms.

Some immigrants assimilate more slowly than others. My best friend from childhood had parents from Abruzzi who lived in NYC for 40 years and never learned English – the need for it never arose. I guess I'm another sort of "unmeltable ethnic," having been for long periods of my life an expat New Yorker who wouldn't (or couldn't) learn how to drive. And it has given me a very different perspective on America – the point of view shared by the legally blind, by poor folks who've had their Pontiacs repo'd, and by backsliding alcoholics with DUIs. Call it the bum's-eye view.

In high school, my folks wouldn't "waste" money on Driver's Ed, to them an extravagance just shy of flying lessons. There was no point driving in college – the Gothic buildings were all within walking distance, and the rest of 1980s New Haven glowered at us across the moat like a primeval darkness full of wolves.

All of which explains why, at 22 and in graduate school, I'd wander the streets of Baton Rouge with a vague but raging ache of longing and loss, on a quest for something undefined, unattainable, liberating…. What was I after? A glimpse of Plato's Forms? A Christian epiphany? For weeks, I roamed those sun-baked, steamy roads until I stumbled upon the truth: I was looking for the subway.

When I realized this, I wasn't even embarrassed; instead, I ranted to my friends about the absurdity of a town where the only public transit existed to ferry maids in from the ghetto in the morning, then whisk them back at dinner time. A few times I defiantly rode the bus – the only paleface on the rickety old vehicle, surrounded by tired-looking old cleaning ladies, with an occasional Indian physics grad student thrown in to spice the black-eyed peas. The ladies would look me up and down and roll their eyes, as if to say, "Well, he ain't gwine to last long 'round here. Don't have the good sense God gave an alley cat."

Despite the wise advice of Cajun friends, I didn't take driving lessons, but lingered most of the time in the dank and druggy dives surrounding campus, slogged through the ghetto to Latin Mass, and tagged along with any friend who was driving… pretty much anywhere. ("I'm fixin' to go to the landfill, wanna come?" Absolutely.) When dating opportunities rolled around, I would blithely break it to the girl that she'd need to pick me up. And I really didn't get what a turn-off this was, why these first dates were also my last. It didn't help that my notion of "tropical wear" was Bermuda shorts, argyle knee socks, and bankers' shoes.

Native New Yorkers see cars not as attributes of masculine power or vehicles of freedom, but as lumbering millstones that must be shifted first thing in the morning, twice a week. (One must imagine Sisyphus parking.) Finding a spot in the City is like playing Rubik's Cube at gunpoint, and if you don't solve the puzzle, the tickets you earn will soon exceed the value of your vehicle. At least one person I know was grateful when the tow truck finally dragged the damned thing away.

After seven years of sedentary sipping at slime-coated campus bars, I had to cave in and admit that my boycott wasn't working: However long I held my breath, the city fathers weren't going to dig a subway—especially since in Louisiana it would have promptly filled in with three feet of water. I found a friend intrepid enough to brave the roads alongside me and give me lessons, and took the driving test. At the time and in that town, this consisted of driving around a parking lot without knocking anyone over. I passed on the second try.

Now I was legal to pilot the crumbling '79 Dodge Aspen station wagon that I'd bought for $250 and decorated with a Battle Flag bumper sticker, but that didn't mean I was ready. As I've written here before, it took me over a year to rouse the nerve for the interstate—which meant that in my capacity as Press Secretary to a (successful) gubernatorial candidate, I drove what the campaign director fondly called "that G-ddamn Klan-mobile" on back roads that wound through bayous and burned-over cane fields, and crossed the Mississippi using ferries, since driving over bridges gave me panic attacks. But after enough long, leisurely trips along River Road, I started to get comfortable behind the wheel. I found a few entrances to I-10 that had really long intake ramps, and began to run the roads like a real American.

Which apparently contravened the Dialectic which governs the World Historical Process, because it wasn't three months after I'd started driving on the interstate before the Holy City called me home. I learned in April 1996 that my poor, chain-smoking mother was finally finishing her 50-year pact with Marlboro. The lungs-for-trinkets scheme that had netted her plenty of sweaters, blankets, and hats over the years was now paying its final dividend. I sold the car, packed up my stuff, and moved back to Queens—to the building next door, in fact—to help out the hospice nurses, to straighten out Mom's IV and watch the morphine drip, drip, drip….

So my mother's death drew me back to New York, to the same block where I was baptized, in the same rent-stabilized building. For the next ten years, I worked at a series of journalism jobs, aware that affirmative action ("Whitey Need Not Apply") and departmental politics had rendered my degree effectively honorary. When my father looked at my Ph.D. diploma, he smiled and said, "You're a certified Post-Hole Digger." I'd never realized that he subscribed to the Chronicle of Higher Education.

Then Dad died, too. His lifetime of manual labor, Little Debbies and Chef Boyardee ended in stomach cancer in summer 2005. Standing over his open coffin, dully reciting the Rosary, I felt the ties that bound me to the city slacken and slip. My family had relocated variously to Purgatory or Long Island. My upstairs neighbors ran what sounded like a tap-dancing school, judging from the number of feet that hammered on their bare wood floors all day as I tried to write—but whenever I put on classical music to drown it out, the enormous husband would storm down to hammer on my door. My sane friends were mostly wed and busy paying to keep their kids out of public school. More and more, it seemed that being unmarried at 40 in New York City meant simply that you were unmarriageable, a near-miss for the species which Darwin in his infinite wisdom had chosen to cull from the herd. I needed a fresh start, a new career, a chance to wear tweed jackets and pontificate. I needed to go.

Apparently God agreed, deciding to lift from New York its 90-year plague of Zmiraks, because he sent me the opportunity of a lifetime—to teach writing classes at a Catholic Great Books college, in a beautiful part of the country, in a state with plentiful seafood and no income tax. For the first time in my life, I could actually live in a house. My beagles would have their very own yard, and chipmunks to chase. No more need to run them after tattooed skateboard punks down crowded sidewalks—rewarding as that is.

There was just one leash that still tied me to a parking meter in Queens, and that was public transit. You see, on my birthday in 2000 I turned 36. And lost my driver's license, thanks to what my good friend Marty the cop might call a DFR: Dumbass Forgot to Renew. I'd overlooked the four-year expiration date on my old Louisiana license—never used in the deadly, whizzing traffic warrens of New York—and now I was no longer legal. When I got to New Hampshire, it would be Baton Rouge all over again, only this time in six feet of snow.

I promised myself that I'd take the test and get my license back, but by now the old anxieties which had kept me off the roads had all come roaring back. (To give you an idea of how long it has been since I drove a car: the last time I filled up a tank, gas was 79 cents a gallon, and Monica Lewinsky was still in high school.) I must have seen too many auto-safety commercials in my formative years, because whenever I try to get behind the wheel again, my mind is flooded with images of wrecked cars, bloody windshield glass, schoolchildren I've run over, and the automated wheelchair in which I'll end up for the next 40 years.

So I've spent a long year cadging rides from colleagues in return for various favors; I even let one parsimonious fellow prof subsist in a basement room for a nominal rent. In return, he'd ferry me to and from my classes in his aging Chevrolet. When he wasn't around, I'd shell out $20 each way to ride in the dingiest cabs this side of Calcutta (oops, forgot the new and PC name for the place—aren't we supposed to call it "Kaddishak?"):

• Broken-down minivans with biological stains on the floor and doors that stick shut—and then fly open suddenly on the road.
• Converted cop cars where I sit in the back like a suspect, waiting for the driver to come let me out of the back door, which has no handles.
• A rattletrap Cadillac steered by a garrulous, chain-smoking 80-year-old who miraculously manages to tailgate at 35 mph. (Women have been canonized for less.)

In New York, the cabs have video monitors that offer stock quotes and Bloomberg business news. Why are the taxis up here so broken down and depressing? One of the drivers clued me in: "Apart from DUIs and visiting businessmen, most of our customers are on public assistance. Once a month, we take them to cash their welfare checks and buy some groceries. Otherwise, we spend a lot of time picking up bottles of Smirnoff for alchies and delivering them—you know, for the guys who don't like to leave the house."

That made up my mind. I can't spend one more year riding the food-stamps-and-vodka express. My roommate/driver has moved away, leaving me his car in lieu of rent. It's gathering pollen in the parking lot at the college, waiting for me to wangle a license, insurance, and plates. Every week now, I cab it downtown to a slightly seedy driving school and sit with the 16-year-olds waiting for driving lessons. A little humbling, but it beats sitting in the perp's seat en route to giving a lecture about Chesterton. Hands down.

Of the Axis-of-Evil nations he named in his 2002 State of the Union address, President Bush has often said, "The United States will not permit the world's most dangerous regimes to threaten us with the world's most destructive weapons."

He failed with North Korea. Will he accept failure in Iran, though there is no hard evidence Iran has an active nuclear weapons program?

William Kristol of The Weekly Standard said Sunday a U.S. attack on Iran after the election is more likely should Barack Obama win. Presumably, Bush would trust John McCain to keep Iran nuclear free.

Yet, to start a third war in the Middle East against a nation three times as large as Iraq, and leave it to a new president to fight, would be a daylight hijacking of the congressional war power and a criminally irresponsible act. For Congress alone has the power to authorize war.

Yet Israel is even today pushing Bush into a pre-emptive war with a naked threat to attack Iran itself should Bush refuse the cup.

In April, Israel held a five-day civil defense drill. In June, Israel sent 100 F-15s and F-16s, with refueling tankers and helicopters to pick up downed pilots, toward Greece in a simulated attack, a dress rehearsal for war. The planes flew 1,400 kilometers, the distance to Iran’s uranium enrichment facility at Natanz.

Ehud Olmert came home from a June meeting with Bush to tell Israelis: “We reached agreement on the need to take care of the Iranian threat. … I left with a lot less question marks regarding the means, the timetable restrictions and American resoluteness. …

“George Bush understands the severity of the Iranian threat and the need to vanquish it, and intends to act on the matter before the end of his term. … The Iranian problem requires urgent attention, and I see no reason to delay this just because there will be a new president in the White House seven and a half months from now.”

If Bush is discussing war on Iran with Ehud Olmert, why is he not discussing it with Congress or the nation?

On June 6, Deputy Prime Minister Shaul Mofaz threatened, “If Iran continues its nuclear weapons program, we will attack it.” The price of oil shot up 9 percent.

Is Israel bluffing—or planning to attack Iran if America balks?

Previous air strikes on the PLO command in Tunis, on the Osirak reactor in Iraq and on the presumed nuclear reactor site in Syria last September give Israel a high degree of credibility.

Still, attacking Iran would be no piece of cake.

Israel lacks the stealth and cruise-missile capacity to degrade Iran’s air defenses systematically and no longer has the element of surprise. Israeli planes and pilots would likely be lost.

Israel also lacks the ability to stay over the target or conduct follow-up strikes. The U.S. Air Force bombed Iraq for five weeks with hundreds of daily runs in 1991 before Gen. Schwarzkopf moved.

Moreover, if Iran has achieved the capacity to enrich uranium, she has surely moved centrifuges to parts of the country that Israel cannot reach—and can probably replicate anything lost.

Israel would also have to over-fly Turkey, or Syria and U.S.-occupied Iraq, or Saudi Arabia to reach Natanz. Turks, Syrians and Saudis would deny Israel permission and might resist. For the U.S. military to let Israel over-fly Iraq would make us an accomplice. How would that sit with the Europeans who are supporting our sanctions on Iran and want the nuclear issue settled diplomatically?

And who can predict with certitude how Iran would respond?

Would Iran attack Israel with rockets, inviting retaliation with Jericho and cruise missiles from Israeli submarines? Would she close the Gulf with suicide-boat attacks on tankers and U.S. warships?

With oil at $135 a barrel, Israeli air strikes on Iran would seem to ensure a 2,000-point drop in the Dow and a world recession.

What would Hamas, Hezbollah and Syria do? All three are now in indirect negotiations with Israel. Iran could make U.S. forces in Afghanistan and Iraq pay a high price in blood, forcing the United States to initiate its own air war in retaliation and to finish a war Israel had begun. But a U.S. war on Iran is not a decision Bush can outsource to Ehud Olmert.

Tuesday, Chairman of the Joint Chiefs Adm. Michael Mullen left for Israel. CBS News cited U.S. officials as conceding the trip comes "just as the Israelis are mounting a full court press to get the Bush administration to strike Iran's nuclear complex."

Vice President Cheney is said to favor U.S. strikes. Secretary of Defense Robert Gates and Mullen are said to be opposed.

Moving through Congress, powered by the Israeli lobby, is House Resolution 362, which demands that President Bush impose a U.S. blockade of Iran, an act of war.

Is it not time the American people were consulted on the next war that is being planned for us?

Victor Davis Hanson has suggested in his “Letter to the Europeans” (National Review Online, January 6, 2006) that we Americans believe the European Union is a “flawed concept”. But is it possible to make such a pronouncement when there exist such varying definitions as to what precisely is the “concept” behind the E.U.? The original concept (at least the version released for public consumption) was European unity as a peaceful alliance of friendly nations which would cooperate in various endeavors such as the creation of a common market. Slowly, gradually, and without any legitimate mandate, The Powers That Be decided that the concept of European unity was not a free alliance but a federal state.

It was just supposed to be a trading club, a mere economic arrangement. “Sign up, join in, let’s trade and we’ll all be rich as lords”. Then it was decided there should be greater political aspects of international cooperation. After that, instead of international cooperation, a supranational entity. The final dream of the Euro-loons such as Giscard d’Estaing was a single centralized state.

This metamorphosis can be seen in the changing terminology of the united Europe. It began as the European Coal and Steel Community. Then the name was changed to the European Economic Community. Then to the European Community, still denoting that this was more of an assemblage than a single entity. But then it was changed to the European Union, beginning to show its true colors. And finally, the leaders of the convention drawing up the proposed European Constitution ardently desired to create the United States of Europe. The “United States of Europe” was only abandoned due to the refusal of the British government to go along with it, and even then, the refusal of Blair & Co. was more out of a realistic perception that it would be nigh impossible to sell “the United States of Europe” to the British people than out of any real lack of sympathy with the project.

So we are left with the statists’ dream of a centralized European state. What the great villains Bonaparte and Hitler merely dreamed of is coming very close to fruition thanks to the efforts of neither a conquering general nor an army but instead a horde of bureaucrats riding the Brussels gravy train to eternal glory.

Of course, as Paul Belien has pointed out, it is all too fitting that Brussels is the capital of the Eurostate. It is already the capital of Belgium: a country which was simply invented ex nihilo in 1830, just as the United States of Europe is being invented today. The Eurofederalists have even admitted that Belgium is an ideal example for Europe: Stat Belgium, stat Europa, they have proclaimed ("As Belgium does, so does Europe"). The tale spun by the federalists is that Belgium is a model because it is a single state consisting of two separate national communities—the French Walloons and the Dutch Flemings (as well as a third, very small German community)—which has existed successfully in peace and harmony for over a century and a half. A worthy model, n'est-ce pas?

A closer look is quite revealing. So what if Belgium didn't exist before 1830? Poland is a real country, isn't it? And it couldn't be found on any map for over a century. Ah, but Poland is a nation and a culture, an organic community of people. Belgium is under no circumstances a natural organic entity. It did not erupt from below but was devised by fiat from above. And as for two separate national communities living together as one in peace and harmony, the history is quite different from the myth. In reality, the history of Belgium has been one of the Walloon minority ruling the Flemish majority through a combination of oppression, bribery, and divide-and-rule.

Still, why has it lasted since 1830? A shared Catholic faith was probably a significant factor in the maintenance of unity for many of those hundred-and-thirty years. But as Belgian Catholicism waned after the war, cohabitation by faith was replaced with cohabitation by bribery. Belgium became one great big spoils system. Everything existed not merely in duplicate but in sextuplicate: for everything there must be a Flemish and a Walloon version, and then within each of those, three further divisions: one for Christians, one for Liberals, and one for Socialists. Nothing could be accomplished (or undone) if any one element was opposed.

Part of the spoils system is Belgium's multiplicity of parliaments. In a shocking waste of public money, there is a national parliament, a parliament for the region of Flanders, a parliament for the Flemish people, a parliament for the region of Wallonia, a parliament for the Walloon people, a parliament for the mixed-language Brussels region, and a parliament for the German-speaking community. That's seven parliaments already! And we haven't even mentioned that both Flanders and Wallonia are divided into further provinces, each with its own parliament. It's surprising that, after the members of all these parliaments plus their aides and staffs, there are any Belgians left to do anything else.

The spoils system, however, is beginning to crack. The culprit? People. Flemings just can't stop being Flemings and Walloons just can't stop being Walloons. The French-speaking Walloons are more like the French, and thus more content to sit back and rely on the state. The Dutch-speaking Flemings are more like the Dutch, and are a bit more market-oriented and entrepreneurial. The Flemings are beginning to realize that the massive spoils system, which involves a bloated public sector and a welfare state, has been a drag on the economy. Maybe if we made a few cuts here and there and lowered taxes a little, we could encourage business and be a little more prosperous? Perhaps, but any change to the system requires the unanimous agreement of all the elements involved. No matter what the Flemings want, the Walloons have a veto (and, to be fair, vice versa).

Politically speaking, each internal element has had its own party: the Flemish Christian-democrats, the Flemish liberals, the Flemish socialists, the Walloon Christian-democrats (now renamed "Humanist-democrats"), the Walloon liberals, and the Walloon socialists. All these parties were and are ardent supporters of the spoils system, so freedom-minded Flemings had little choice but to vote for the outsider Vlaams Blok ("Flemish Bloc"), which the mainstream parties considered beyond the pale. With the continued refusal of the six "mainstream" parties to deal with the concerns of Flemish voters, the Vlaams Blok became the largest political party in Belgium. Suddenly, the spoils system seemed under threat, but the "mainstream" parties found an easy way out: they simply banned the Vlaams Blok.

What does it tell us when a government outlaws the largest political party in its domain? At the very least it suggests the government and the people are out of step; to those who believe in democracy it is heresy. Yet as the Belgian idea of democracy was exposed as a farce, few words of complaint were uttered in the European and American press over the affair. And so it must be worrying that The Powers That Be intend to reshape Europe while openly extolling that farce as a model.

We must also recall that the convention drafting the failed European constitution ardently rejected even so much as a mere mention of the importance of the Christian heritage of Europe, while the European Parliament refused to allow Rocco Buttiglione to take up his position on the European Commission because, horror of horrors, he actually professes Christian beliefs.

So what are we left with? A country—no less than the model of European unity, we are told—which outlaws its largest political party. A parliament which refuses to allow practicing Catholics to hold positions of high authority. A constitution intent on unifying a continent while refusing to so much as mention the only unifying feature of that continent, namely Christianity. Rather murky circumstances, wouldn't you say?

But, just as they threw a spanner in the works of the Belgian spoils system, there is one hope: the people. As much as planners plan and plotters plot, the human element is incredibly difficult to predict and even more difficult to control. The people of Ireland were the only ones in all of Europe who were given a chance to vote on the Treaty of Lisbon—a rehashed version of the failed Constitution which had been rejected in the French and Dutch referenda—and the Irish people rejected it resoundingly. The jig is, increasingly, up: the people and the politicos are not on the same page. So while the rulers may continue to rule, we should keep an eye out to see how long and how much the obedient are willing to obey.

We're all familiar with this cliché-ridden story line. A successful husband dumps his middle-aged and supposedly feeble wife for a younger woman. The estranged wife's friends are worried that after so many years of being dependent on her spouse, she won't be able to make it in the real world as a single woman. But to the surprise of everyone, she goes to college, gets a degree, and then opens a small but profitable business. And after dieting and working out in the gym, she looks great and starts dating attractive and successful guys. In fact, her life has become much better now that her husband isn't around anymore.

In a way, many of the doomsday scenarios that try to envision what would happen in the Middle East if the U.S. were to withdraw its military troops from the region and end its diplomatic engagement there assume that, once dumped by the American superpower, the region—starting with Iraq and continuing with Lebanon, Syria, Israel, Palestine, and Egypt—would degenerate into an all-out and never-ending war between nation-states (Iran vs. Saudi Arabia), ethnic groups (Arabs vs. Kurds), religious sects (Sunnis vs. Shiites), and tribal groups (you name them).

In a Middle East sans America, we are told by the members of Washington's Foreign Policy Establishment, the pro-U.S. regimes in Saudi Arabia, Egypt, and Jordan would instantly collapse and Iran and its proxies would emerge as the ultimate winners. Oil would cease to flow from the region, which would eventually draw in other global players, like China, Russia, and the European Union (EU), who would start fighting over its resources and divide the region between them. Everyone would then recall the good old days of Pax Americana in the Middle East and wonder: What were we thinking when we bashed American interventionism in the region? There was no way the Middle East would have been able to survive without wise U.S. guidance and effective protection. Right?

Wrong. A counterargument would start by drawing attention to the devastating consequences of American diplomatic and military intervention in the Middle East during the first eight years of the twenty-first century. The ousting of Saddam Hussein and the occupation of Iraq destroyed the balance of power in the Persian Gulf and strengthened the power of Iran and its Shiite proxies in the region, to say nothing of the humanitarian and economic costs of this disastrous American misadventure: the death and destruction in Iraq, hundreds of thousands of refugees, rising oil prices, and the huge costs borne by the American people.

And lest we forget, a somewhat bizarre mix of an American crusade for democracy and an ambitious drive for hegemony brought about the election of Hamas in Palestine, followed by an effort to isolate and punish it and the Palestinian people who elected it, and the strengthening of the power of Hezbollah in Lebanon, followed by Washington giving Israel a green light to bomb Lebanon back to the Stone Age. The result of the American policy has been more bloodshed between Israelis and Palestinians, growing instability in Lebanon, rising tensions between Syria and Israel, and never-ending chatter about U.S. or Israeli strikes against Iran.

If we apply our earlier analogy, we could argue that it is the wife (the Middle East) that has concluded that the time has come to dump the husband (Uncle Sam), and not the other way around. It is from this perspective that we need to evaluate some of the dramatic developments that have been taking place in the Middle East, as some of the leading players in the region, operating on the basis of their own interests, have decided to disregard U.S. guidance and embrace independent action.

First, the Shiite-controlled ruling and opposition parties in Iraq have all strengthened their ties to the Shiite regime in Tehran while raising objections to the continuing American military occupation of their country. Indeed, it was Iran, and not the U.S., that played a critical role in mediating a cease-fire between the government of Nouri al-Maliki and the forces of cleric Muqtada al-Sadr. The liberated Iraqis, it seems, are trying to liberate themselves from American rule and get closer to the Iranians (who, according to Washington, are trying to destabilize Iraq).

At the same time, the Saudis, who have been harshly critical of the decision to topple Saddam Hussein and who recognize the constraints operating on U.S. power in the region, are using their economic and diplomatic power to strengthen the Sunni regimes in the region while trying to appease Iran, hoping to create a stable balance of power in the Persian Gulf.

The sidelining of American power in the Middle East has been even more evident in the Levant, where leading American allies—Israel, Egypt, and Turkey—have been pursuing policies that run contrary to stated American policy.

Hence, while the Bush Administration and its neoconservative ideologues have depicted the secular Ba'ath regime in Damascus as an unofficial member of the Axis of Evil and part of the Islamo-Fascist threat, Turkey and Israel have been raising strong objections to this American dogma, arguing that Syria's current partnership with Iran is tactical rather than strategic and that Damascus is interested in negotiating a peace agreement with Israel and could be co-opted into a moderate pro-Western bloc in the region.

Despite strong American opposition, the Israelis have decided to start negotiating with the Syrians under Turkish auspices—and both sides have expressed satisfaction with the first phase of the talks in Turkey. Filling the vacuum created by the American refusal to support the Israel-Syria talks has been France, with President Nicolas Sarkozy inviting Assad, together with all the other Mediterranean heads of state, including Israel's, to attend the inaugural meeting of the "Mediterranean Union" in Paris on July 13. The French leader is hoping that Israel and Syria will become part of a new "Mediterranean Union" to complement the EU.

France could also play a constructive role in dealing with another consequence of U.S. policy in the Levant. The Americans have been critical of the recent deal, backed by Syria and Iran, that was reached between the Lebanese government headed by Fouad Seniora and the Hezbollah movement, a deal that seemed to strengthen the power of the Shiite group. Sarkozy, whose government has maintained historic ties to Lebanon and Syria, could help facilitate a détente between the two countries that reflects the new balance of power in the region.

And finally, after President Bush's visit to Israel, during which he bashed diplomatic negotiations with rogue regimes and terrorist groups as "appeasement," Israel has agreed to finalize a deal with the Hamas government in the Gaza Strip, mediated by Egypt, which could create the basis for a long-term cease-fire between the Israelis and the radical Islamic group that the Bush Administration has refused to engage and vowed to diplomatically isolate.

Some experts in Washington are suggesting that the Americans should support and even take part in the negotiations between Israel and Syria as well as between Israel and Hamas. In fact, one reason these diplomatic engagements have proved successful has to do with the American disengagement from them, which provides incentives for the Middle Eastern players to take care of their respective interests themselves.

Indeed, one could imagine the noisy opposition that American involvement in talks with Hamas and Syria would have ignited on Capitol Hill and in other centers of political power during this heated election season, opposition that could have led to the collapse of those talks. Moreover, the Syrians, the Palestinians, and the Israelis would probably have tried to extract diplomatic and financial goodies from the Americans in exchange for the "painful" concessions that they would have had to make anyway.

There is a certain lesson the new American president could draw from these recent developments when he considers reassessing the American presence in Iraq. A gradual U.S. disengagement from that country—and from the entire Middle East—could actually put pressure on the main political forces in Mesopotamia, as well as on the other governments in the region, to work together to protect their strategic and economic interests by ensuring that Iraq doesn't disintegrate and that the balance of power there remains stable. Indeed, these Middle Eastern players might all surprise Washington by doing better without American military interventions and futile "peace processing." Dumping the Middle East could end up being a great bargain for both the Middle Easterners and the Americans.

One thing about paleocons—they’re not predictable. When I mentioned in a previous article the fact that Pius XII helped save Jews and Serbs from genocide through (among many tactics) ordering priests to issue fake baptismal certificates, it never occurred to me that readers would write in denouncing me for slandering that pope. Would the great Pope Pius countenance deceit, even in such a cause? They demanded proof. It exists, in the form of two papal nuncios (the pope’s special ambassadors to countries) who each issued such life-saving forgeries by the thousands, and claimed that they did so on Pius XII’s direct orders: Angelo Rotta, nuncio to Hungary, and Angelo Roncalli (later Pope John XXIII, since beatified—and hence on the fast-track to sainthood), who worked in Turkey. Pius XII never contradicted their accounts, or disciplined them in any way, so there’s no reason to claim (as one commenter did) that these two archbishops were lying.


However, some folks really have seemed scandalized at this great pope's willingness to wield deceit, and I really shouldn't dismiss their concerns so lightly—given that Christians have agonized for centuries over the biblical injunction not to "bear false witness," and have sometimes given their lives rather than tell a fib. Since I've already written an article on just this question, I won't re-invent the wheel. Here it is:


"You Shall Not Bear False Witness Against Your Neighbor."


This commandment seems innocuous enough. On its face, it only prevents us from telling malicious falsehoods damaging to others. Okay, we’re not really thrilled about that—especially if we’re active in politics—but we can understand it, and grudgingly agree. But like most other elements in Divine Revelation, it has grown over time and extended its reach into all sorts of analogous situations, as rabbis, then bishops and popes, strove to explore all its implications for human life. It’s as if each commandment were a pebble dropped into a pond, and our job were to trace all the ripples. But that metaphor doesn’t quite work, because it makes things too easy. Ripples from a pebble flow in clear, predictable waves, and a freshman physics student should be able to account for them. The pieces of Revelation that have fallen on us from space are not inert but active, and the pool in which they plop—human life—is murky and full of dark, swimmy things. And some of them have claws. So perhaps a better image is a giant Alka-Seltzer, dropped in a swamp: Plop-plop, fizz-fizz, Oh what a morass it is!

  

This commandment especially fits that description. On the one hand, the Catholic Church teaches us in the new Catechism (#2467):

  

Man tends by nature toward the truth. He is obliged to honor and bear witness to it: "It is in accordance with their dignity that all men, because they are persons . . . are both impelled by their nature and bound by a moral obligation to seek the truth, especially religious truth. They are also bound to adhere to the truth once they come to know it and direct their whole lives in accordance with the demands of truth."

  

Theologians point out that man is hardwired both to seek and speak the truth, and it's on the assumption that people's words are trustworthy that communication is predicated. Think what it would be like if that weren't true: recall that really annoying example you had to study in Philosophy 101—where a man from Crete tells you "All Cretans lie," and you have to figure out whether or not you should believe him? Remember how you reacted? ("Screw this! Let's go get a keg.") Imagine every encounter with other people turning into that kind of tedious brain-twister, and you'll appreciate Yahweh's point. Or let's view this thing in terms of dollars and cents. Societies which don't value straight dealing and honest business waste enormous resources on bribes, wire-tapping, bulletproof auto glass, and personal bodyguards named Ivan, impoverishing everyone except a tiny, corrupt elite. And if you don't believe Yahweh, you should visit New Jersey yourself.

  

(It was telling that in 2004, when the stench of his corruption began to crowd out the reek of the Meadowlands, New Jersey Governor James McGreevey dodged investigation for his actual malfeasances by resigning over a sex scandal. In a press conference, McGreevey dabbed his eyes and said, “My truth is that I am a gay American.” As if anyone cared. What mattered was his orientation as a “corrupto-American.” Now there’s a persecuted minority: Even today, thousands of these fellow-citizens languish in minimum security cells all across America.)

  

On the other hand, there’s also such a thing as “too much information.” For instance, when a D.A. questions a priest about the contents of a confession. Or when women arise at romantic dinners and excuse themselves by announcing, “I gotta pee!” In each case, the speaker is under a solemn obligation to withhold this information—if need be, by throwing people off the scent. The priest can say, “I do not know.” Or the woman could say “I need to wash my hands,” and let you finish your lemon sorbet in peace.

  

The need for discretion arises not just from the sacramental obligation of secrecy, or the queasy demands of courtesy. The Church sees a duty in charity sometimes to withhold or even cloud the truth. For instance, when one is tempted to spread ugly facts about a third party without grave and sufficient reason. Dishing the dirt about somebody just for the fun of it can actually amount to a serious sin, even—and here’s the weird part—if what you’re saying is true. I know, I know….

  

It’s hard for modern readers of the press to wrap their heads around this one—accustomed as we are to hidden cameras poking into the bedrooms of Hollywood starlets, and congressional probes into the president’s pants. But this prohibition on “detraction” is reiterated in the most recent Catechism, with certain exceptions made for journalists. (Because of the nature of their profession, these wordsmiths are considered essentially subhuman, and are bound only by “journalistic ethics,” which are modeled on the rules governing bonobos. No throwing turds inside the troop.)

  

Theologians have argued for centuries about how to reconcile these two principles, truth-telling and charity, and have come to a wide variety of conclusions. Church Fathers Origen and St. John Chrysostom each believed that sometimes outright lying might be acceptable, if keeping silent wasn’t an option and telling the truth caused greater harm than the lie itself. Historians report that Martin Luther embraced this idea, once declaring: “What harm would it be if a man told a good lusty lie in a worthy cause; for the sake of the Christian Churches?” A curious quotation, which leaves Luther looking like a Protestant stereotype of a scheming Jesuit.

  

In fact, the Jesuits and other Scholastic Catholics wrestled mightily with the obligation to tell the truth, since they felt bound by the teaching of St. Augustine, who rejected as intrinsically evil every kind of fib. As the 1917 Catholic Encyclopedia explained:

  

St. Augustine held that the naked truth must be told whatever the consequences may be. He directs that in difficult cases silence should be observed if possible…. If a man is hid in your house, and his life is sought by murderers, and they come and ask you whether he is in the house, you may say that you know where he is, but will not tell: you may not deny that he is there.

  

Augustine’s position here is elegant, clear, consistent—and I must add, kind of crazy. It holds up truth-telling as a higher good than life, and encourages the Christian to keep his conscience clean at the cost of another man’s murder. (Immanuel Kant would later adopt the same position, keen as he was to create a system of perfectly self-consistent human Reason as a replacement for the God whose existence he’d started to doubt.) This seems strange until we consider that Augustine also taught that killing—even waging wholesale war—could be perfectly moral, if done in self-defense, or defense of the innocent. This means that for Augustine, when faced with murderers at the door, a good Christian may never mislead them. Instead, he may shoot them.

  

Leading Christian thinkers lined up behind St. Augustine in subsequent centuries, making fine distinctions about the types and gravity of lies. St. Thomas Aquinas—the Henry Ford of our Faith who liked to break things down into tiny, interchangeable parts—divided lies into three categories:

  

· Injurious, the kind of lie that leads men to Hell or gets them killed. For instance, if one were to say that Mother Teresa was “a demagogue, an obscurantist and a servant of earthly powers.” (Christopher Hitchens.) Or that Saddam Hussein had by 2003 amassed weapons of mass destruction which he planned to transfer to terrorists. (Also Christopher Hitchens.)

  

· Officious, the sort of lie designed to cover one’s butt or other body parts, as in “I did not have sex with that woman.”

  

· Jocose, a statement which is meant as a jest, but could be taken seriously—for instance: “No, honey, you don’t look fat in that dress.” Which is clearly a joke. Sweetie, if you have to ask….

  

The need to balance Augustine’s stark position with the demands of discretion and charity grew more urgent over time. When the Protestant English kings began to persecute the Church—hunting down priests and torturing them to death—moral thinkers began to look for ways to permit laymen to effectively hide these priests when questioned. (Many old English homes contain man-shaped “priest holes,” of obscure origin. Since England is the land of “British liberties,” some have theorized that these holes are a naturally occurring phenomenon. Just like Mt. Rushmore, according to Cher. Or so Sonny Bono claimed she believed—but then he was bitter, and a congressman.)

  

Theologians, many of them Jesuits, developed the notion of a “mental reservation,” which permitted someone to tell only part of the truth, in a somewhat misleading way—leaving the listener to draw an untrue conclusion. For instance, you might say, “I haven’t seen any priests,” while mentally reserving the rest of the sentence, “in the last 30 seconds.”

  

This seems a squirmy kind of loophole through which to preserve a principle, and it certainly would not have satisfied St. Augustine. By the late-nineteenth century, some theologians tried to formulate exceptions to the duty to tell the truth in a less back-handed way. They admitted that it is always wrong to lie, but redefined a lie as an untrue statement made to someone who has the right to the truth. And a killer or priest-hunter at the door had no such right. (Sometimes the Church can only break through a conundrum using this handy method, as we once redefined “usury,” “religious liberty,” and “baptism.” But if you’re hoping some pope will one day redefine, say “porn,” don’t hold your breath.)

  

Back in 1917, this seemed a bold position, and the theologians editing the good old Catholic Encyclopedia dismissed it as having "made little or no impression on the common teaching of the Catholic schools." Then something happened: a concrete historical change requiring the quick development of doctrine. Historians call it "World War II."

  

With the rise of a murderous dictatorship that hunted down millions of innocents because of their race, Catholics all across Europe were faced with the same dilemma once posed, almost idly, by theologians. For yet another time in history, there were indeed thousands of armed state-sponsored murderers banging on the doors in search of innocents hidden inside. The thousands of Polish Catholics who sheltered Jews from the Germans—and Pope Pius XII himself, who arranged for some 800,000 or more persecuted Jews to be hidden in monasteries and convents—now faced the terrible choice between telling the truth and betraying the innocent. Inside the Reich, conspirators such as Colonel Claus von Stauffenberg (an aristocrat of noble Catholic ancestry) were forced to tell hundreds of falsehoods as they plotted to assassinate Hitler in 1944; instead of condemning their efforts, Pope Pius helped them transmit messages to each other. The Frenchmen who fought in the Resistance had to deceive their occupiers and their own puppet government—and so on. The unprecedented phenomenon of a totalitarian state bent on genocide helped sweep away the squeamishness of theologians, and show the primacy of justice in defending the innocent. In the 1994 edition of Catechism of the Catholic Church, this once-daring distinction found itself enshrined as follows:

  

Lying is the most direct offense against the truth. To lie is to speak or act against the truth in order to lead into error someone who has the right to know the truth. By injuring man’s relation to truth and to his neighbor, a lie offends against the fundamental relation of man and of his word to the Lord. (#2483)

  

Sounds fine to us. But then in 1997, Pope John Paul threw another Alka-Seltzer into the swamp. His revised, Latin edition of the text removed the phrase "someone who has the right to know the truth," thus reopening the question. And raising a question for us: Did Karol Wojtyła get through six years of German occupation—and take part in the Resistance—without ever telling the SS a falsehood? Should the Catholics who hid Jewish children and "lied" through their teeth to keep them out of Auschwitz have confessed this "sin"? This issue remains unresolved. Perhaps what I seek is "too much information." I'd like to continue this inquiry, but I need to go… wash my hands.

  

Excerpted by the author from The Bad Catholic’s Guide to Wine, Whiskey, and Song.

Under consideration: Michael Pollan, The Omnivore's Dilemma: A Natural History of Four Meals, Penguin (2006), 464 pages; and In Defense of Food: An Eater's Manifesto, Penguin (2008), 256 pages.

A few weeks ago I attended a meeting of Kansas secessionists. The participants were rowdy, complaining of economic gigantism squashing them flat and bureaucratic thugs hounding their every move. They were all sick and tired of worker-ant existence in the hive-mind of American groupthink and they wanted out. Despite the quintessentially political nature of the gathering, politics proper never came up. Conservative and liberal meant nothing in that room, and party affiliation even less. 

Kansas patriots fomenting disunion? No, though there are a few of those kicking around these parts. These were local farmers organizing a farmer's market. I had offered the parking lot of my law firm for their use, and was mostly just an observer of the scene. The locals probably couldn't tell you the first thing about the politics of secession, but the Spirit of '76 showed up in force. Damned were the federal busy-bodies who tell local farmers what they can and can't sell; condemned were the centralized agents of agri-business who want ID chips implanted in livestock; mocked were the credentialed witch-doctors from the department of agriculture who own the brand "organic."

All that was left undone was a patriotic march to the local Enormo-Mart to dump the limp and faded out-of-season tomatoes imported from South America into the local pond (which isn't quite Boston Harbor, but it would have served). And while there was no Declaration, it was clear that these small growers wanted out—out of forced participation in the economic union of cheap mass production, central planning, credit money, and the ignorant consumerism they despised.

Michael Pollan would understand. His The Omnivore's Dilemma and its sequel, In Defense of Food, amount to a manifesto for farmer's markets and locally produced food across the country. Meticulously researched, Pollan's work chronicles the gigantism that defines today's food economy—and all the deleterious effects that result.

"What's the big deal?" many will ask. Let Pollan count the problems—declining health; an obesity epidemic; the collapse of the family meal; environmental degradation; a food system that will eventually tumble, leading to food shortage and political unrest; the loss of joy and beauty in eating; the forgetfulness of a people bereft of one of the most basic pillars of tradition—grandma's recipes; and ultimately, the loss of freedom for a people incapable of the ordinary work of self-provisioning.

If that's not enough, our food also tastes like shit. In Wendell Berry's apt aphorism, our food economy is busy turning people into pigs rather than pigs into people. Or as Pollan puts it, "Cheapness and ignorance are mutually reinforcing." Tell me about it.

Pollan issues this simple dictate: "Eat food. Not too much. Mostly plants." The harping of a food-scold? Perhaps. But Pollan fleshes his commandment out well, especially the first third. By "eat food" Pollan actually has in mind something quite revolutionary, because you can't buy "food" most places. Walk into your local Mega-lo-Mart and what you see is not food—it's processed corn syrup and assorted chemicals and "nutrients" packaged in plastic and shot so full of preservatives it will never rot.

Food rots. If it doesn't rot, it's not food. That's a good principle to live by.

Or try this one: if your ancestors wouldn't recognize it as something good to eat, it's not food. Imagine a pre-historic everyman fingering the oblong yellow cakey substance filled with white goo. It might be the turd of some exotic, as-yet-unknown animal species. Good to eat? Certainly not—until, that is, he experiences the artificial sugar intake which stampedes his natural resistance and enslaves him.

Twinkies and Mountain Dew—the body and blood of a new sacrament in the temple of foodshit.

Pollan pulls the curtain back on the small cartel of priestly "nutrition experts" and "food scientists" emanating from land-grant universities who rule this temple, dominate our government's food policy, and determine what we will eat. He demonstrates convincingly that, once one penetrates beneath the glitzy plastic wrappers, the know-nothing food pyramids, and the seemingly interminable processing of our foodstuffs, we are in reality little more than a nation of beasts in a continuous state of mastication at a Babelesque pile of corn so massive it stretches to the carbon-infused heavens. Not pretty, and hardly the wholesome image of the American family at table.

Nature abhors a monoculture, but bean counters (kernel counters, in reality) adore them. And corn is the monomania of American culture. We've even taken to pumping it into our SUVs and minivans.

An aside: In 1890 a small western Kansas town sponsored a public debate on the statement: "Opportunities have never been better in Kansas." Taking the affirmative was a lawyer recently immigrated from the east. By all reported accounts, he acquitted himself well, giving a fine and persuasive speech. When the lawyer finished, a local farmer, seizing the opportunity to take the negative, got up and proceeded to shovel a load of freshly harvested corn into the wood stove. He sat down without saying a word. As the local press reported it, those in attendance unanimously agreed that the farmer had won the debate. In 1890 corn was worth more as fuel than as food. Deja vu all over again, only now it's "green."

In the wider (or narrower) world of the pundit "food wars"—think Rod Dreher's Crunchy Cons—these discussions tend to elicit either a retreat into faux philistinism or a mockery of the same. Pollan's own response illustrates this tension well. His conclusions are in fact deeply traditional—one might even venture to call them conservative—a fact he acknowledges, yet one which clearly makes him uncomfortable.

Simultaneously exploited and neglected in this debate are the virtues of the actual philistines: conservatives defiantly celebrating their double-whopper and fries, and liberals pacing the aisles of Whole Foods in search of the perfect dinner party. Pollan's work has the virtue of refusing both of these easy outs, but then he can't bring himself to tell the whole story.

Pollan's sensibility is that of the kitchen lover—an admirable thing, to be sure—but it's a love that tends to go unconsummated in an age of genteel decadence. He frets continuously over the ethics of killing a chicken for dinner. He admits he is uncomfortable with the conservative culture of the farm. His tentative solutions tend towards state intervention rather than true laissez faire.

Honest redneckery comes by dint of sweat on the brow, clods underfoot, and mud on the frock. Down at the feed store, the sun-burned, dirty men I talk to would be more likely to open up a can of whup-ass on Pollan’s hand-wringing self than celebrate his latest gourmand achievement. 

This uniquely American disconnect is illustrated well by a short anecdote Pollan relates. In Martin Van Buren's reelection campaign of 1840, his opponent William Henry Harrison effectively ridiculed Van Buren for bringing a personal French chef to the White House. Harrison, as he let it be known, preferred "raw beef and salt." The lesson, as Van Buren and Rod Dreher both learned, is that "to savor food, to conceive of a meal as an aesthetic experience, has been regarded as evidence of effeteness, a form of foreign foppery."

To bridge this chasm requires a firm recognition that self-provisioning is dirty work done by sun-hardened men who obtain not the rarefied sophistication of the credentialed witch-doctors and their organic brews, but membership in the rarefied league of freemen who can pretty much tell anyone and everyone, as circumstances may require, to go to hell without concern for the consequences (taxman excepted).

That's the feed store definition of freedom in Jefferson (yes, that Jefferson) County, Kansas, though it's not taught much in social studies textbooks. Only such men—rich or poor, barber or builder, clodhopper or shopkeeper—know true equality, for they know and honor the true measure of the other. They are "equal to their own needs" in Wendell Berry's terms, which is the foundation of that quaint Aristotelian notion philia politike—"political fraternity"—otherwise known as peace and happiness.

There is a sentiment in the punditry for what some have dubbed the emergence of the "Michael Pollan/Wendell Berry right." Characterizing the food secessionist movement this way is a mistake, because the food problem in this country described and catalogued so aptly by Pollan is ultimately a symptom of a much more disturbing problem: that the league of freemen has dwindled to near extinction. The important questions start not with what we eat but with who we are. Pollan's insight is to understand that the former tells us a lot about the latter. The Pollan temptation is to believe that the gold ring in the pig's snout makes a difference. Gilding the sow doesn't make her free when the slaughterhouse cometh.

Meanwhile, every weekend in my parking lot the secessionists now gather to opt out of the economic union of their food masters. Growers and eaters. Neighbors. Celebrating interdependence and independence. And not least of all, as Thoreau exclaimed: "I did taste!"

Caleb Stegall practices law in Kerry, Kansas, and is a regular contributor to Taki’s Magazine.

In view of the numerous responses to my announcement of the death of paleoconservatism and my discussion of the transition from a paleo to a postpaleo opposition to the neoconservative-liberal media, there may be need for these further clarifications. One, the postpaleos' indifference to the post-World War II conservative movement is a decided advantage that they enjoy in relation to their elders. They are not mired in a past that can offer only very limited direction in charting a future course.

As Nietzsche wisely pointed out in The Advantages and Disadvantages of History for Life, there are some historical narratives that have ceased to advance human intelligence and creativity, and it is therefore a good idea that we try to move beyond them. It is even unwise to keep symbols around that may hide or falsify what is really going on. One might argue that such relics as the Swedish monarchy and various European national churches do harm by fostering the illusion of historical continuity in countries that have sunk into multicultural confusion and socialist behavior control.

Similar considerations would apply to magazines that now dispense neoconservative poisons that were once identified with the Taft Republican tradition or even at their fringes with European royalism. The continued operation of such publications as Human Events and National Review under radically different auspices from their original ones may be even more harmful for the real Right than such explicitly neoconservative organs as Weekly Standard and New Criterion. The semblance of continuity in publications that were formerly on the right but have drifted into the neocon camp may promote the erroneous belief that these magazines still reflect the core values that had characterized them forty years ago.

Two, postpaleos do not intend to "take us back" to the "old movement," which is the fantasy of a golden age of American conservatism that never existed, or at least not in the stable form that the nostalgia-buffs believe it did. Since the 1950s the "conservative movement" has been in transition while exhibiting certain constant features. It has moved steadily leftward while being micromanaged from the Northeast; and ever since the days of its construction, it has remained firmly in the hands of New York and Washington journalists. That this movement once pursued more traditional rightist politics, and that it tolerated a higher degree of debate than it does now, are both indisputable facts.

But it is equally indisputable that the "movement" has been steering in its current direction since the 1950s; and there is no reason for us to become nostalgic over the spilled milk left by this cobbled-together, largely journalistic enterprise. The conservative movement should be viewed as a collection of resources that postpaleos should fight to take over or try to influence. Where such a possibility does not exist, the young Right should aim at destroying its enemies' assets.

Three, unlike many of their elders, the postpaleos have no need to cozy up to their enemies. They do not expect to be invited to a cocktail party sponsored by The Nation or Commentary, and if they happened to receive and then accept such unlikely invitations, it would be to gather information they could deploy in their continuing war of attrition. This kind of hard-headedness is often lacking in those of an older generation who are constantly hoping to "crack the opposition" or to make belated careers as friends of the neocons or of the more radical left. As Tom Piatak argued in a perceptive comment in The American Conservative [not available online], inflamed anti-Christian leftists like Sid Blumenthal are not planning to befriend the traditionalist Right. Such ideologues are steaming at the neoconservatives for making even tactical alliances with those whom Blumenthal would like to sweep off the planet.

Recently I have developed the impression that at least some of those on the right are attacking the war in Iraq partly in search of sympathy from the Left. Although this war is plainly unnecessary and being fought for questionable ideological reasons, it is not the unprecedented series of inhumanities it is made out to be in some rightwing venues. It is probably the least vicious and the most restrained war launched by the US in the last hundred years, give or take a few minor interventions such as the one in Grenada. It is, moreover, possible to challenge the wisdom of the war without descending into certain over-the-top practices, such as whitewashing the brutal mass murders of Saddam Hussein or bringing up the standard leftist charge of "fascist" when describing neoconservative military adventurers.

The invectives against the Bush administration as "fascist" and the focus on oil interests as the cause of the war are both tiresome leftist gestures that some paleos have begun to imitate. I would not be bothered by these outbursts if I did not believe that at least some of them look like pandering. Some of my comrades-in-arms may be more upset than I by the war, and I respect their moral feelings. But other "antiwar conservatives," I have become convinced, appear to believe that by complaining about neocon "fascists," the Left might eventually start to applaud. Those who think so are living in a delusional world. In any event, the Right should not be hallucinating about the prospect of swilling Martinis at a gathering at AEI or in the office of Victor Navasky.

Four, the postpaleos will have to pursue, and all the more vigorously as resources become available, the tasks of discrediting the neoconservatives and presenting themselves as the true Right. Postpaleos will have to get their hands dirty by continuing to go after their enemies and by doing so in a way that draws public notice. Dwelling on the images of Novalis's Europa oder Christenheit? (a subject taken up in my first book and in a very long German essay) may be an aesthetically gratifying act, but it will not have any effect in counteracting the marginal position to which our side has been relegated.

To break out of this encirclement, there is need for aggressive action; and I've no doubt the postpaleos will rise to this challenge. Their enemy will be the managerial therapeutic state and its liberal-neocon shock troops; and the doctrines under which this order will continue to be defended will likely remain the same as they have been until now: namely, propositional nationhood, antiracism, anti-homophobia, anti-anti-Semitism, and anti-fascism. All of the political class's campaigns of intimidation relate back to the same ideology of control; and what divides its members may be nothing more substantive than whether their hegemonic ideology is to be spread through war or by some other means.

The correct position for dealing with the dominant class is not the kind of ranting I have heard from the extreme Right against Jews, Masons, Skull and Bones, or whatever. An intelligent Right must make well-reasoned and thoroughly documented attacks on political correctness, global democracy, and the other tools for expanding public control over traditional social institutions.

Lastly, I trust the postpaleos will never hold back from flattening those who claim to be on the right but who can't resist paying homage to leftist heroes. Someone who recently distinguished himself by doing this is Marcus Epstein, who pounced on that onetime rightwing publication Human Events for lying (as it now repeatedly does) about Martin Luther King. It is not coincidental that the same publication has begun to close itself off to the opinions of the non-neoconservative Right. The real Right should never lose an opportunity to accuse those who are blatantly catering to the Left of behaving indecently and mendaciously. Dissemblers who are playing to both sides are as much of a danger to us as such out-and-out foes as Sid Blumenthal and Victor Navasky. And when these dissemblers get caught in their lies they look even worse.