A Normal Country in a Normal Time Ever Again?

The collapse of the Soviet Empire in 1989–1991 closed an important chapter not only in Russian history, but in our own as well.

For 50 years after Pearl Harbor, the United States, a nation enjoined to isolation by its founders, had labored to save Western civilization, and indeed the world, from Nazi and Soviet totalitarianism. It had won through against both enemies, though at considerable cost to itself.

The war of 1941–1945 against Nazi Germany and militarist Japan cost the lives of 400,000 soldiers, sailors, airmen, and Marines. We must, of course, never forget the sacrifice those men made for victory. Lost lives aside, however, the war actually benefited America tremendously. We emerged from it as the greatest military power on earth, with unchallengeable air and sea power and a monopoly on the atomic bomb. Our economy in 1945 accounted for almost 50% of the world’s total output; we possessed a wealth of modern plant and equipment, and we were far ahead of the rest of the world in most if not all cutting-edge technologies. Our infrastructure was the most modern and efficient in the world, and there was more (such as the Interstate Highway System) to come. Our debt was high, but we owed most of it to ourselves, and were quite capable of paying it off. The terrible days of the Great Depression were over, seemingly for good; the soup kitchens and shantytowns of the 1930s were gone, while an expanding middle class that for the first time included blue-collar workers was enjoying a prosperity greater than any other nation had known.

If culturally the America of 1945 was in no way comparable to Periclean Athens or Augustan Rome, there was nevertheless a certain vitality evident in American arts and letters. Modernism was in its heyday, and its capital was no longer Paris but New York. The undifferentiated mass barbarism of the postmodernist present was, in the period 1945–1965, almost inconceivable.

The costs of the Cold War against Soviet Communism were both more subtle and more profound than those incurred in World War II, although it was not until the 1960s that these costs began to be felt. Dallas and its legacies — the presidency of Lyndon Johnson and his war in Vietnam — initiated a period of decline in American power, prestige, and prosperity. The fall of Saigon in 1975 and the Soviet invasion of Afghanistan in 1979 (the latter, as it turned out, the last in a series of Communist takeovers in what was then known as the Third World) seemed to mark a turn in the historical tide. Not that communism, as a doctrine and system of government, could stand comparison to Western values; it most assuredly could not. But the West, and particularly the United States, appeared to be in terminal decline. By the late 1970s a failure of will, of morale, was palpably in the air. Vietnam looked increasingly like an American version of the expedition to Syracuse — that unnecessary and, ultimately, disastrous campaign undertaken by ancient Athens, and memorably recorded in the pages of Thucydides.

Yet Athens, despite its defeat at Syracuse, and despite waging war simultaneously against Sparta and the vast Persian Empire, rallied and regained the upper hand in the Peloponnesian War. It was only later that war à outrance and treason within brought about Athens’ final defeat and the end of its primacy in the ancient world.

America in the 1980s rallied in a similar fashion, emerging from the nadir of defeat in Vietnam to challenge Soviet imperialism once more, and then, by a policy of peace through strength, giving the sclerotic Soviet system a final push that sent it to its well-deserved place on the trash heap of history. With this the 50-year struggle against totalitarianism was over, and freedom had triumphed. Or had it? At just this moment, in 1990, Jeane Kirkpatrick, formerly Ronald Reagan’s UN Ambassador and a prominent neoconservative, published an article in the National Interest. It was titled “A Normal Country in a Normal Time,” and it put forth a vision utterly different from that held by most of her fellow neocons, who in the aftermath of victory were advocating that the United States seek to achieve “full-spectrum dominance,” i.e., world domination.

Kirkpatrick, a card-carrying member of the foreign policy establishment, began her essay by stating that a good society is not defined by its foreign policy but rather by the “existence of democracy, opportunity, fairness; by the relations among its citizens, the kind of character nurtured, and the quality of life lived.”

She went on to write that “Foreign policy becomes a major aspect of a society only [emphasis added] if its government is expansionist, imperial, aggressive, or when it is threatened by aggression.” The end of the Cold War, she averred, “frees time, attention, and resources for American needs.”

Kirkpatrick’s vision was right for America in 1990, and it remains so now. But that vision, alas, has never been fulfilled.

In her essay Kirkpatrick warned that foreign policy elites — the denizens of government bureaucracies, universities, and think tanks — had become altogether too influential and powerful during the 50 years’ emergency, and that their interests were by no means aligned with those of the citizenry as a whole. She made two other very important points: that restraint on the international stage is not the same thing as isolationism, and that popular control of foreign policy is vitally necessary to prevent elite, minority opinion from determining the perceived national interest. With respect to the latter point Kirkpatrick neither said nor implied that the American people should make policy directly. She acknowledged — correctly — that professional diplomats and other experts are required for the proper execution of national policy. But policy in the broad sense must reflect the views of the people and must be circumscribed by the amount of blood and treasure the people are willing to sacrifice for any particular foreign policy objective.

Her concept of a polity in which the citizenry sets or at least endorses the goals of foreign policy admittedly has its troubling aspects. For one thing, it is far from certain that the citizenry as a whole — the masses, to be blunt — will choose to adopt wise policies. In Athens the expedition against Syracuse was enthusiastically endorsed by the Assembly, and history is replete with further examples of the popular will leading to disaster. Flowing from this is a second problem: the ability of clever demagogues or cabals to sway or bypass popular opinion in favor of policies that are inimical to the general interest, and that often turn out to be disastrous. Post-World War II American history provides numerous examples of this: the CIA’s 1953 overthrow of a democratic government in Iran at the behest of British and American oil interests, with consequences that we are still trying to deal with today; the Bay of Pigs (1961), which set in motion a chain of events that nearly led to nuclear annihilation during the Cuban Missile Crisis; the wars in Vietnam and Iraq, both of which received initial popular support as a result of outright deception perpetrated by a few powerful men with an agenda. (The phony Tonkin Gulf incident opened the way to escalation in Vietnam, while the falsehoods about WMD, anthrax, and Saddam Hussein’s connection to 9/11 made possible George W. Bush’s war in Iraq.)

Nevertheless, the alternative to popular control over foreign policy is the placing of the nation’s destiny in the hands of an elite that, by its very nature, typically has little understanding of the needs and desires of the people as a whole. Such elites are, unfortunately, quite prone to committing disastrous errors of judgment — witness the events mentioned above. Plato’s guardians are rarely found in the flesh. Gibbon pointed to the Five Good Emperors who reigned over Rome in the period 96–180 CE, which the historian characterized as the happiest and most prosperous time in human history. But these men were the exceptions that prove the rule. British policy in the 19th century was guided by statesmen such as Palmerston and Salisbury — men who understood both Britain’s interests and the limits of its power. For a brief period of ten years, between the fall of France in 1940 and the decision to march to the Yalu in Korea in 1950, American foreign policy received, in general, wise elite guidance. These were critical years, and we should be thankful that men such as George Marshall and Dean Acheson were in power at that time. But except for that brief span, elite leadership of American foreign policy has entailed costs in blood and treasure far in excess of those we actually needed to pay. Even the 1940s had their dark side, for those years were marked by the beginning of the modern “Deep State.”

The Deep State, quite real though unacknowledged by most academic historians and the mainstream media, amounts to a partnership between nonstate actors and various groups inside government, working together to shape and carry out policies that are generally contrary to the popular will, and often to the national interest as well. The Deep State is not a second, shadow government or conspiracy central, with permanent members who manipulate puppets in the White House and the halls of Congress. Rather, it consists of shifting or ad hoc alliances between government insiders and groups of powerful people or institutions outside of government. The former are sometimes elected officials, sometimes holders of key posts in the bureaucracy or the military. Such alliances are typically formed in the name of “national security” but often benefit only the ideological, institutional, or pecuniary interests of Deep State actors.

Some of the nonstate actors are “respectable” (the big New York banks, the oil majors, defense contractors), while others are by no means so (the Mafia, international drug traffickers). But whether they can be mentioned in polite company or not, their influence has often been felt in the councils of government, and particularly with respect to American foreign policy. For example, the swift transformation of the CIA, originally conceived as an intelligence-gathering agency, into a covert operations juggernaut was the work of men drawn mainly from Wall Street law firms and investment banks. These men went on to cooperate with the Mafia in places such as Cuba, extending an overworld-underworld partnership that went back to World War II.

Malign influences of this sort had been present since at least the end of the Civil War, but in earlier times had been limited to buying votes in Congress or persuading the executive to dispatch the Marines to establish order and collect debts in Latin American banana republics. The great expansion of government in World War II, and especially during the Cold War, allowed the Deep State to metastasize. The collapse of the European colonial empires and the simultaneous ascension of America to superpower status meant that after 1945 the American Deep State could extend its tentacles globally.

The turning point was probably the National Security Act of 1947, which created the CIA, the Joint Chiefs of Staff, and the National Security Council. These institutions, and particularly the first two, were (and to an extent still are) beyond the effective control of either the Congress or the president of the moment. And they are not alone. The various intelligence services and the military, or parts thereof, often pursue agendas that are at variance with official policy as set out by the president. They sometimes partner with each other, or with powerful institutions and people outside of government, to achieve mutually desired objectives. President Eisenhower, with his immense personal popularity and prestige, was able to hold the line to the extent of keeping us out of another shooting war, though he nevertheless felt compelled to warn the people, in his farewell address, of the growing power and influence of the Deep State, which he termed the Military-Industrial Complex.

The “deep events” of the 1960s, ’70s, and ’80s — Dallas, Vietnam, Watergate, Iran-Contra — cannot be understood without reference to the Deep State. The role of the Bank of Credit and Commerce International (BCCI) in Iran-Contra is a good example of the Deep State in action. I mention BCCI specifically because its peculiar history has been revealed in several well-researched books and in investigations by the Congress. But the role of BCCI in Iran-Contra (and much else besides) is just one of the many strange manifestations of the Deep State in our history. The Deep State’s activities sometimes remain forever dark, are sometimes only partially revealed, or if revealed are explained away as aberrations.

The loss of liberty that resulted from the emergence and growth of the Deep State was real and perhaps irreversible. By the 1960s, the machinery of domestic surveillance, created in embryo by J. Edgar Hoover even before World War II, included spying on the populace by the FBI, CIA, NSA, and the military. Domestic spying was reined in somewhat during the 1970s, only to be ramped up again under Reagan in the 1980s. These abuses were part of the price paid for victory in the Cold War. Whether such abuses were inevitable under Cold War conditions is debatable; I personally would characterize them as the effluvia typical of a bloated imperium.

Be that as it may, the Cold War did end in a real victory, and with victory came the hope that the worrisome trends (“worrisome” is doubtless too mild a word) that the struggle against totalitarianism had initiated or exacerbated could be reversed.

It was therefore highly encouraging when in 1990 Kirkpatrick published her article calling on America to become once again a normal country. That the call was sounded by a leading representative of the neoconservative movement, rather than someone from the Left, was quite promising. If a hardliner such as Kirkpatrick could see the light, perhaps other important leaders of the American polity would, too.

In the 1990s there were some indications that we were heading in the right direction. Under Bush the First and Clinton, defense spending decreased by about 30% from Cold War highs. Internally, signs of health began to emerge — for example, the decline in crime to early 1960s levels, and the return to a balanced federal budget (the latter, admittedly, achieved with some accounting legerdemain). A slow but steady healing process appeared to be underway.

In retrospect, one can see that these were mere surface phenomena. America’s role in the world did not undergo a fundamental reappraisal, as Kirkpatrick’s thesis demanded. The almost bloodless Gulf War of 1991 (paid for by our allies) seemed to indicate that empire could now be done on the cheap. Meddling elsewhere — in Somalia, Bosnia, Kosovo — reinforced this view, even though Somalia turned out badly (and of course Bosnia eventually became a hotbed of Islamic fundamentalism and jihadism, which is the state of affairs there today). In the 1990s pundits and average citizens alike began to speak openly of an American empire, while of course stressing its liberal and benign aspects. “We run the world” was the view espoused across a broad spectrum of public opinion, with dissent from this view confined to a few libertarians and traditional conservatives on the right, and some principled thinkers on the left.

At the same time, Deep State actors were attempting, both openly and covertly, to prevent any return to normalcy (if I may use that term), while promoting their agenda of American supremacy. Certain academics and intellectuals, lobbyists, defense contractors, and government officials with their eyes on the revolving door were all working assiduously to convince the Congress and the people that a return to something like a normal country in a normal time was a dangerous proposition. In fact, of course, there was no longer any need for America to maintain a huge military establishment and a worldwide network of bases — for there was no longer any existential threat. Russia was at that time virtually prostrate (nor did it ever have to become an enemy again), China as a danger was at least 25 years away, and Islamic terrorism was in its infancy — and could moreover have been sidestepped if the US had simply withdrawn from the Middle East, or at least evacuated Saudi Arabia and ended its one-sided support for Israel. But in the end these facts were either ignored or obscured by influential people with foreign policy axes to grind, assisted by others who had a financial stake in the maintenance of a global American empire.

One group, The Project for the New American Century, stands out for its persistence and drive in seeking to advance a particularist agenda. It is no exaggeration to say that the members of this group — which included not only such faux intellectuals as Bill Kristol, but men with real power inside and outside of government, such as Dick Cheney and Donald Rumsfeld — prepared the way for the Iraq War and the Patriot Act. The blueprints for both the war and the Act were prepared by these men even before 9/11. September 11, 2001 was of course a turning point, just as 1947 had been. The neocons, the Deep State, had won. When the towers came down it meant that “full-spectrum dominance” had triumphed over “a normal country in a normal time.”

The Project for the New American Century closed its doors in 2006, but the neocons live on, and persist in calling for more defense spending, more interventionism, and more government restrictions on civil liberties. And they are joined by other voices. The liberal interventionists who surround Hillary Clinton are best characterized by the term neocon-lite. They, like the neocons, see Obama as far too passive a commander-in-chief, even as he wages war by proxy and drone in the Middle East, Afghanistan, and Pakistan, and continues the state of national emergency first declared by George W. Bush on September 14, 2001. The state of emergency gives extraordinary wartime powers to the executive, even in the absence of a declared war. Some of the powers that the commander-in-chief possesses under the declaration are actually secret. Obama, who has the authority to end the state of emergency, has instead renewed it annually since taking office. The Congress, which is required by law to meet every six months to determine whether the state of emergency should be continued, has never considered the matter in formal session. (The Roman Republic, in case of a dire emergency, appointed a dictator whose power automatically expired after six months’ time. Only under the empire was a permanent autocracy instituted.)

At the same time, the systematic domestic surveillance authorized under the Patriot Act, far more extensive than anything J. Edgar Hoover or James Jesus Angleton (CIA Chief of Counterintelligence, 1954–1975) ever dreamed of, has been left virtually intact by the Obama administration and the Congress.

Obama’s successor, whether Republican or Democrat, is almost certain to be more interventionist abroad, and equally or more unfriendly to civil liberties at home (Trump seems mainly concerned with getting our allies to pay more for the protection we give them, as opposed to cutting back on our worldwide commitments, while his apparent views on civil liberties are not encouraging). America, it appears, is incapable of dialing back on imperial overstretch. Yet what vital American interest is served by meddling in places like Yemen or Ukraine? What ideals are fulfilled by supporting the suppression of democracy in, for example, Bahrain? It seems clear that American elites, both inside and outside government, simply cannot bring themselves to let the world be, cannot abandon the concept of a global order organized and run by the United States.

With distance comes perspective. As time passes it becomes ever clearer that George W. Bush’s war in Iraq represented a second American Syracuse, a defeat with catastrophic consequences. It is quite true that, as in Vietnam, our forces were not beaten in the field. But the greater truth is that the political objectives in Iraq, as in Vietnam, were not achievable, and that this could and should have been recognized from the start. Today most of Iraq is divided between a corrupt and incompetent Shia-led government under the influence of Iran, and an ISIS-dominated territory in which obscurantism and bloodthirsty brutality hold sway. Such are the fruits of the successful march on Baghdad in 2003. Trillions of American dollars — every penny of it borrowed — were thrown down the Iraqi rathole, as the Bush administration abandoned the principle of balanced budgets and the prospect of paying off the national debt, something that appeared eminently possible at the beginning of its term in office. The dead and the maimed, Americans and Iraqis, suffered to no purpose.

Americans are a resilient people. America’s institutions, despite obvious flaws, are superior to those of its enemies and rivals. America recovered from the Syracuse of Vietnam and not only salved the wounds of that war but went on to defeat its main competitor in the arena of world politics. But can America recover from a second Syracuse?

Compare the state of the nation today with that of 1945, or even 1965. Admittedly, not everything has gone to rot. The advances achieved by women and minorities — racial and sexual — have given us a better, freer society, at least on the social plane, compared to 50 years ago. Advances in technology have in some respects brightened our lives. But the heavy hand of government and the machinations of the Deep State have brought the country to the brink of bankruptcy, enmeshed us in foreign lands where we ought never to have trespassed, and put limits on basic freedoms of speech and privacy. Broad-based prosperity and the economic optimism of the past are gone, perhaps forever, because of adventurism abroad and elite mismanagement of the economy at home.

The current ruptures in the governing duopoly, Republican-Democrat, are clear evidence of dysfunction at the highest level, and of the citizens’ discontent. Yet the election of 2016 will be fought out between a bloviating, ignorant real estate tycoon and a tired, corrupt ex-First Lady. The former knows little of the Washington machine or the intricacies of the Deep State; I predict that, if elected, he will be reduced to a virtual puppet, and the fact will never dawn on him. Hillary, on the other hand, is very comfortable with the status quo, no matter what she may say to placate the supporters of her rival Bernie Sanders. Neither Trump nor Clinton — or anyone else with power, either — appears to have a clue about the real nature of the crisis we are in, much less how to bring us out of it.

A normal country in a normal time? Never again, I think. The future appears quite dark to me.

* * *

Author’s Note: Some readers of Liberty may be unfamiliar with the concept of the Deep State, or may reject it as mere conspiracy-mongering. In fact, the Deep State (or parts thereof) has been discussed in several well-researched books. A newcomer to the idea might begin by reading Philip Giraldi’s article, “Deep State America,” which appeared on the website of The American Conservative on July 30, 2015. Read it. I take issue with Giraldi in one respect: his total focus on the New York-Washington axis of power. The Sun Belt also plays a huge role in the Deep State. Jeane Kirkpatrick’s 1990 article, by the way, cannot be read free online, but is available through JSTOR.





Unintentional Truth

“The plaintiffs in the Trump University case, filed in 2010, accuse him and the now-defunct school of defrauding people who paid as much as $35,000 for real estate advice. Mr. Trump said Friday that Trump University received ‘mostly unbelievable reviews’ from its 10,000 students.” — “Judge Unseals Trump University Documents,” Wall Street Journal online, May 31, 2016.

Trump’s statement may well be true.





Hollywood Fights Market; Market Wins

Money Monster isn’t billed as a comedy (in fact, it’s supposed to be a thriller), but it is still one of the silliest films I’ve seen in ages.

Lee Gates (George Clooney) is a cable TV investment personality of the Jim Cramer school, with a shtick that includes dancing girls, funny hats, crazy film clips, party noisemakers, and outlandish recommendations that often turn out to be profitable investments. He doesn’t think much about his viewers’ actual profits and losses because he never sees his viewers — that is, until Kyle Budwell (Jack O’Connell) shows up on the set with a figurative axe to grind and a literal gun in his pocket. He also has a funny explosive vest to go with Lee’s funny hat. He makes Lee wear it.

I’ll warn you here that this review is going to contain a few spoilers, but knowing some of the plot twists is not going to ruin the film for you; it’s pretty much ruined on its own, and these are mad meanderings, not genuine twists. Besides, I don’t recommend that you waste your money or your time on this monster of a movie, and revealing some of the plot is the only way I can demonstrate to you just how silly and unbelievable the premise is.

Hollywood will go to great lengths to cast aspersions on Wall Street, business, and the free market, even greenlighting a movie with a script with more holes than a Chuck E. Cheese Whack-A-Mole (and a lot less entertaining). First we are expected to believe that Budwell, the terrorist, would be able to wander onto a live set, simply because he is dressed like a deliveryman and carries a couple of cardboard boxes. Sorry, folks, the days of Cary Grant sneaking into the boss’s office carrying a florist’s bouquet are long gone, and security at a television station is much tighter than that.

Then we are expected to believe that the cameras would continue to roll and the signal would continue to be broadcast while a lunatic holds a gun to the head of a nationally known journalist — or anyone, for that matter. Regardless of what the terrorist (and the voyeuristic television consumer) might be demanding, someone — anyone — would have pulled that plug immediately.

We are also expected to believe that Kyle invested all his money — all his money — in a single hedge fund. The SEC has rules about that. Under SEC regulations (tightened by the Dodd-Frank Act, which excluded the value of a personal residence from the calculation), “accredited investors” must have a net worth of at least a million dollars, not counting their personal residence, or an income of at least $200,000, in order to purchase shares in risky investment vehicles such as the one in the script. Kyle makes $14 an hour as a sanitation worker. He is not an accredited investor. The hedge fund would not have accepted Kyle’s money. George Clooney and Jodie Foster (the film’s director) probably don’t realize this because they have managers who invest their money for them. They’re accredited investors; they just aren’t qualified to play with investors in the movies.

Next is Lee Gates’ ridiculous solution to Kyle’s problem. It seems that Kyle invested his money in a hedge fund that Lee recommended a few weeks ago, and the fund’s price tanked, taking Kyle’s money with it. Lee turns to the camera and asks his viewers to start buying the stock in order to pump up the price for Kyle and his fellow losers. First, viewers would smell a rat if a showman like Gates made such an outlandish plea. Remember Soupy Sales? “Kids, take a dollar out of your mother’s purse and send it to Soupy at this address . . .”

More importantly, Lee’s idea wouldn’t help Kyle or the others who have lost money, even if the stock did return to previous levels. Stock prices rise and fall as new buyers purchase shares from current owners. It’s the ultimate example of supply and demand. In this case, the people who sold on the way down don’t own any shares anymore, so they aren’t going to get their money back, even if prices climb to the sky. They’re just going to feel worse. The only people who could make money on Lee’s new deal are the ones who buy at the bottom and sell at the new top. And believe me, Lee Gates would be investigated for investment fraud after these shenanigans were over. (Assuming he made it out of the exploding vest in one piece.)

The cops are just as stupid. They bring Kyle’s girlfriend to the studio to talk some sense into him and calm him down, even though they know she’s fit to be tied about him. And she’s just as stupid. Instead of calming him down, she bawls him out and dares him to pull the trigger on the bomb — while she is in the studio. Who in the world would be that crazy? And then there is the usual Hollywood inanity of having SWAT teams or, in this case, bomb squads enter a highly volatile location without wearing helmets. I know, it’s a film technique considered necessary so that we (the audience) can see their pretty faces while they talk.

In such situations, we’re supposed to suspend our disbelief, and usually I do. But in this movie my disbelief was suspended so far above reality that I became positively giddy from lack of oxygen.

The denouement is just as ridiculous as the build-up. We are supposed to believe that the greedy director of the hedge fund has manipulated a mining strike in South Africa in order to buy low and then sell high when the strike is called off, but a glitch in his plan resulted in a loss of $800,000,000. That’s a lot of platinum for two weeks’ digging.

I’m sure that George Clooney, who produced the film as well as starred in it, thinks he’s doing the world a big favor by pointing out the evils of greed and investing, but all he did with Money Monster is point out his own monstrous ignorance. He still has the dark swoony eyes, though. Maybe he should leave the social justice films for a while and make a nice romantic comedy.


Editor's Note: Review of "Money Monster," directed by Jodie Foster. TriStar Pictures, 2016, 98 minutes.




We Are All Victims Now

On April 30 a 19-year-old Arizona man was arrested on 70 criminal charges after it was discovered that, in a picture taken last August of his high-school football team, the tip of his penis was protruding from the top of his pants. Although the photo, joke included, appeared in his high school yearbook and in programs distributed at sports events, it took all this time for someone to notice the little flash of penis. Nevertheless, “Mesa [Arizona] police booked Osborn [that’s the kid] on one count of furnishing obscene material to minors, a felony, and 69 counts of indecent exposure. Ten faculty members and 59 students were present when Osborn exposed himself and are considered victims, according to police and court documents.”

This happened in a country in which Prince, a musician who appeared on stage and in videos with his naked butt protruding from his costume, while dancers mimicked sex acts, was mourned as a national hero after his death from an apparent drug overdose; a country in which the most profitable music lyrics are so obscene and violent that journals not labeled “adult” never quote them; a country in which, over two decades ago, the Surgeon General suggested that young people be taught to masturbate; a country in which hundreds of thousands of young women are exploited as “baby mamas” by irresponsible men; a country in which major corporations boycott a state because it does not stipulate that people can enter any restroom that matches their own idea of their gender; a country in which . . . Add your own examples. This is the country in which 70 people became sexual victims without even knowing that anything happened to them.

By the way, the charges against the young man have now been dropped. There was a public outcry, thank God. Now I hope we can all focus our attention on our national schizophrenia about sex.






Can This Be Real?


Like millions of other people, I’m used to regarding the current presidential campaign as something I see on television — a long-running show that isn’t nearly as good as the original Law and Order, and is much farther removed from reality.

But now I’m convinced that this thing is real. It isn’t just a drama about Martians invading the earth. The Martians are actually here. Beings called Donald Trump and Hillary Clinton will actually be nominated for the highest office in the secular world.

I have only some scattered thoughts to offer.

1. If the establishment “conservatives” had done what they promised to do, and could have done, instead of giving veto power to Harry Reid and every pressure group in the country, this never would have happened.

2. If the establishment “liberals” thought that funding universities to teach people nonsense would not produce a perennial crop of agitators, they were stupider than I thought. But yes, they were stupider than I thought. You can see this in the amazement on Hillary’s face whenever somebody hits her with a slogan that comes right out of Democratic Party 101.

If Donald and Hillary were people of responsible character, they would not be the presumptive nominees for president of the United States.

3. It has been said that if you subsidize something, you get more of it. Both parties have spent the past generation subsidizing the arrogance of the rich, the illusions of the poor, and the ignorance of everyone. Can you imagine Donald Trump reading a book? Can you imagine Hillary Clinton reading a book, even a book she “wrote”? Now imagine one of these illiterates in the seat of Adams and Jefferson.

4. If you went for personal advice to Donald Trump or Hillary Clinton, what do you think you’d hear? Can you think of anyone, not criminally insane, who would give you more spiritually debilitating counsel?

5. Does character count? Yes it does, but in politics it often counts in ways we wouldn’t like it to count. If Donald and Hillary were people of responsible character, they would not be the presumptive nominees for president of the United States.

6. Some libertarians believe that this amusement park election will expose the evils of American politics. I’m sure they’re right. In fact, it has already done so. The question is, what condition will we be in when we stumble off the roller coaster at the end of this ride?






A Fun Day for Hillary


Maybe you have already witnessed these things, but on April 3 I finally saw videos of the end of Muammar Gaddafi and the rejoicing of Hillary Clinton about it.

The date is October 20, 2011. Gaddafi, deposed dictator of Libya, is being lynched by a mob of Muslim “militants.” He is crying and his face is covered with blood. One of his dirty and insane countrymen is overcome by the glory of tearing off Gaddafi’s shoe. It is evident that Gaddafi’s tortures will continue until he is dead.

Now for video no. 2. Clinton, Secretary of State of the United States, is sitting in a comfortable chair, surrounded by her aides and a television crew. She is being interviewed by a CBS reporter. She hears the news of Gaddafi’s death, under what circumstances she can well imagine. She jiggles and rolls her eyes like a high-school cheerleader and emits a parody of Julius Caesar: “We came, we saw, he died.” She laughs and preens.

The two sequences are peculiarly disturbing, tawdry, painful, vile.

What had happened?

Gaddafi, a violent eccentric, had ruled Libya for 42 years. At first an opponent of the West and a sponsor of terrorism, he later helped to repress our crazed Islamic enemies and made a good start at liberating his economy. His reward was to be set upon by rebels encouraged by the United States and its NATO allies, under the direction of President Obama and Hillary Clinton. Then, when the rebels demonstrated that they could not beat him, he was deposed by the “humanitarian assistance” granted to them by NATO, in the form of weapons supplies and bombing. The lynch mob that seized him was able to do so because his convoy of vehicles had been attacked from the air and disabled by NATO. Hence Mrs. Clinton’s pride in his death. It seems to have been her most valued achievement.

What was the result?

Libyans split into rival factions, much worse than before. Many of them went over to the forces of radical Islam. Some of those people mobbed the United States embassy and killed our ambassador, using weapons that the US had supplied. What was once the nation of Libya is now a scene of chronic civil war in which ISIS and other terrorists have found a congenial home. Libya’s neighbor, Egypt, was also the target of American intervention, which helped to install a government run by Islamic extremists who began a reign of terror against Christians and dissidents. Contrary to the mandate of the United States, the extremists were kicked out by other Egyptians. The Libyan mess remains, and to a large degree the Egyptian mess.

Hence Mrs. Clinton’s pride in Gaddafi's death. It seems to have been her most valued achievement.

The Obama administration’s involvement in these circumstances is still being investigated. Mrs. Clinton is still being investigated. Gaddafi is dead. The videos of his sickening death and her sickening laughter remain.

Here is a snapshot of our world, and of the Obama administration’s place in it. It’s a world of competing evils, in which the United States, for all the supposedly best reasons, chronically favors the worst. Obama, we hear, wanted to end US imperialism. He wanted to end America’s habit of dominating other countries for their own good. He wanted to end . . . all that. So, like Woodrow Wilson, or Bill Clinton, or George Bush, he meddled forcibly with other countries. Including Libya.

And you see what happened. You don’t need to have it explained to you. You see it.






Ideas Have Consequences


It probably couldn’t be any worse. The current presidential candidates are about as bad as bad can be.

Just look at them.

  • Ted Cruz, who called a press conference to say that he would not “copulate” with a rat like Donald Trump.
  • Donald Trump, who had every opportunity to gather all anti-establishment voters into his fold but insisted, instead, on alienating as many as possible — e.g., stipulating that in some hypothetical world in which abortion was outlawed, women who had abortions should be “punished,” then putting out a press release saying that he didn’t really mean that, and then saying what he didn’t mean again.
  • Bernie Sanders, spouting non-facts 24/7.
  • Hillary Clinton — say no more.

The temptation is to attribute the horror of 2016 to the candidates’ abominable personalities, or at most to the failures of the electoral system, which is warmly responsive to televisable personalities (Trump), and to the indefatigable pressure groups that gave us Clinton and Sanders (and Jeb Bush and a few other sparklers).

I think that those factors are important, but they are as nothing when compared with the ideas that are insisted upon by the pressure groups and are projected so abominably by the personalities.

All the problems that are used to justify the literally insane campaigns now being waged were the direct results of unlimited government.

The ideas aren’t many. We’re not dealing with the intellectual intricacy of the questions that Lincoln and Douglas debated. Most of what passes for ideas in today’s campaigning results from a handful of crude, outdated assumptions, as follows:

1. The idea that work produces wealth, and therefore ought to be rewarded — an idea that had the stuffing knocked out of it by the discovery of the principle of marginal utility, a mere 14 decades ago.

2. The age-old idea that wealth should be apportioned by political means; i.e., by force.

These two ideas provide most of Bernie Sanders’ intellectual equipment, if you want to call it that.

3. The pre-1830s idea that free trade is bad for the economy.

Here you will recognize Donald Trump’s motivating idea, and one of Sanders’.

4. The 1970s idea that racial — and “racial” — sensitivities have rights that government must enforce.

This belief, which is merely the flipside of the much older belief that white racial sensitivities must be enforced by government, is the basis of the grievance industry which fuels both Sanders and Clinton, and without which their candidacies might not be able to exist.

5. The idea that, as H.L. Mencken said, “the people know what they want and deserve to get it, good and hard.”

This is populism, which fuels the preposterous windbaggery of Trump and Sanders, and to a degree that of Cruz. It was adequately discredited by the idiotic behavior of the ancient, direct democracies, if not of modern Detroit, Chicago, and New York City.

Now, you may say, and you would be right to say it, these fallacious notions get a lot of their steam from the true, or sort of true, ideas that are associated with them. Sanders’ people and Trump’s people are right in believing that the financial system is rigged against the majority of Americans. Trump’s people and Cruz’s people are right in thinking that the country is being run into the ground by small groups of wealthy, or otherwise privileged, self-serving apostles of political correctness, seemingly bent on outraging all feelings but their own. Trump’s people are right in thinking that a welfare state cannot admit hordes of immigrants without grossly disadvantaging its own citizens. Clinton’s people are right in their visceral aversion to populism.

It’s remarkable that Clinton’s supporters, though undoubtedly the best “educated” of any of these groups, have the fewest ideas, right or wrong. It’s certainly a commentary on elite education.

But the most remarkable fact is that all the problems that are used to justify the literally insane campaigns now being waged were the direct results of unlimited government. If the American people had voted to increase income inequality, strangle the middle class, create racial tensions, ship jobs overseas, enlarge the permanent underclass, and grant a permanent veto power to an unelected class of well-paid parasites, they couldn’t have gotten better results from their decades of votes for people who wished to expand the government.

Now people of common sense and what used to be common knowledge are seeing (the cliché is unavoidable) the chickens coming home to roost. Are you happy? I’m not.






Blast Radius







The Olympics and Liberty


I’m often asked what makes a film “libertarian.” Does it need to be set in a dystopian totalitarian future? Must the protagonist be fighting a government bureaucracy or authority? Many libertarian films do contain those features. But my favorites are those in which a protagonist achieves a goal or overcomes obstacles without turning to the government to fix things.

Two such films are in theaters now, and both are based on true stories about Olympic athletes who achieved their goals in spite of government interference, not because of government aid. Race tells the Jesse Owens story, and Eddie the Eagle tells the Michael Edwards story. Both are worth seeing.

Race is the perfect title for this film that focuses on both racing and racism. Owens was one of the most famous athletes of the 20th century. Historian Richard Crepeau (who spoke at FreedomFest last year) described the 1935 college track meet at Ann Arbor in which Owens, in the space of 45 minutes, set three world records and tied a fourth as “the most impressive athletic achievement since 1850.” Nevertheless, Owens (Stephan James) is not welcome at the 1936 Olympics in Berlin. Adolf Hitler (Adrian Zwicker) intends to use “his” Olympics as a propaganda piece to highlight the physical superiority of the Aryan race, and he does not want any blacks or Jews to spoil his plan. He hires filmmaker Leni Riefenstahl (Carice van Houten) to document the glorious event.

The film reveals the backstage negotiations between Olympic Committee representative Avery Brundage (Jeremy Irons) and the German organizing committee at which Brundage insisted on assurances that Jews and blacks would be allowed to compete. Brundage’s insistence is somewhat hypocritical, considering the treatment Owens and other black athletes were enduring at home, but he was successful in forestalling a threatened American boycott of the Games.

What makes a film “libertarian”? Does it need to be set in a dystopian totalitarian future? Must the protagonist be fighting a government bureaucracy or authority?

Owens faces similar pressure from the NAACP, as he is warned that he ought to boycott the Games to protest racism in Germany. Owens feels the weight of his race as he considers the conflict, but in the end he delivers the most resounding protest of all, winning four gold medals and derailing Hitler’s plan in short order. This is as it should be. What good would it have done if Owens had stayed home to protest German policy? Would it have made any difference? Would anyone even have noticed? I felt the same way when President Carter made the opposite decision in 1980 and forced American athletes to boycott the 1980 Games in Moscow to protest the Soviet Union’s invasion of Afghanistan. What good did it do to destroy the dreams of hundreds of American athletes who had trained their whole lives to compete in a tournament that comes along only once every four years? Did it help anyone in Afghanistan? Did it hurt all those Soviet athletes who took home more medals because the Americans weren’t there? I think not.

In the movie, Owens also faces the pressure of athletic celebrity, and Stephan James skillfully portrays the ambition and the temptations of a small-town boy chasing big-time dreams. He is anchored in his pursuits by his college coach Larry Snyder (Jason Sudeikis) and his girlfriend Ruth (Shanice Banton), who would become his wife and partner until the day he died in 1980. As with most sports films, the outcome of the contest is known from the beginning. The real story is how the hero gets there, and how he conducts himself along the way. Owens was a hero worthy of the title.

Eddie the Eagle tells the story of an Olympic hero of a different sort — one who is remembered for his tenacity rather than his innate skill. Michael Edwards (played by Taron Egerton as an adult and by brothers Tommy Costello, Jr. at 10 years old and Jack Costello at 15) simply dreams of being an Olympian; he doesn’t care what sport. His mother (Jo Hartley) nurtures that dream, giving him a special biscuit tin to hold his medals and praising his accomplishments, even if it’s just holding his breath for 58 seconds. Ironically, Eddie is motivated by a picture of Jesse Owens in a book about the Olympics. By contrast, Eddie’s father (Keith Allen) is a pragmatist, encouraging Eddie to give up his silly dreams and become a plasterer like him. His father isn’t a bad man; he just wants to protect his son from disappointment and financial waste. Fortunately for Eddie, he has the kind of optimistic personality that simply doesn’t hear criticism.

Owens feels the weight of his race as he considers the conflict, but in the end he delivers the most resounding protest of all, winning four gold medals and derailing Hitler’s plan in short order.

Eddie settles on skiing as his sport and manages to qualify for the British Olympic team, but the Committee cuts him because he “isn’t Olympic material.” Read: you don’t dress well or look right and you’re rather clumsy. Undaunted, Eddie turns to ski jumping because — well, because no one else in Britain competes in ski jumping. If he can compete in an international event and land on his feet, he can qualify for the Calgary Olympics. This is the same year that the Jamaican bobsled team slipped through the same loophole — a loophole that was quickly closed before the following season. Now athletes must compete internationally and place in the top 30% of finishers in order to qualify. But in 1988, if you could find a sport that few people in your country competed in, you could literally “make the team.”

With his father’s van and his mother’s savings, Eddie takes off for the training slopes in Germany. There he tackles the jumps, crashes on his landings, and tackles the jumps again. When he lands the 15-meter jump successfully, he moves on to the 40 and the 70, crashing more than he lands. Low camera angles during the jumps create the sensation of height and speed, providing a rush of adrenaline for the audience. Frequent shots of Eddie tumbling after a crash emphasize just how risky this beautiful sport is. We admire Eddie’s persistence, even as we cringe at his crashes. He believes in himself, no matter what.

Eventually he meets up with Bronson Peary (Hugh Jackman), a chain-smoking, hard-drinking slope groomer who looks incredibly lean and buff for an alcoholic. Peary turns out to be a former ski jumper who lost his chance for Olympic glory by not taking his training seriously. This, of course, sets us up for the perfect sports metaphor movie: unskilled amateur with indomitable heart meets innately talented athlete who threw it all away, and both find redemption while training for the Games.

Eddie turns to ski jumping because — well, because no one else in Britain competes in ski jumping.

It’s a great story about overcoming obstacles, sticking with a goal, and ignoring the naysayers. It demonstrates the power of a mother’s encouragement, and the possibility that even a poor, farsighted boy from a working-class neighborhood can achieve his dreams — if he doesn’t kill himself practicing for it.

All this allows us to forgive the fact that the movie mostly isn’t true. Yes, Michael Edwards did compete in the Calgary Olympics. He did set a British record for ski-jumping, despite coming in dead last in both events, simply because, as the only British jumper, his was the only British record. His exuberance and joy just for participating in the Olympics led to his being the only individual athlete referred to specifically in the closing speech that year (“some of us even soared like an eagle”). But Bronson Peary never existed, and Michael Edwards actually trained with US coaches at Lake Placid, albeit with limited funds that caused him to use ill-fitting equipment. The truth, though, wouldn’t have given us such a feel-good story.


Editor's Note: Reviews of "Race," directed by Stephen Hopkins. Forecast Pictures, 2016, 134 minutes; and "Eddie the Eagle," directed by Dexter Fletcher. Pinewood Studios, 2016, 106 minutes.





Pandora’s Book


What would you do if you were told that something you believe is not true? It would depend on who was telling you, I guess. It would also depend on how important the belief was to you, and on the strength of the evidence offered, wouldn’t it?

Suppose the belief in question had shaped your career and your view of how the world works. What if you were offered strong evidence that this fundamental belief was just plain wrong? What if you were offered proof?

Would you look at it?

In his 2014 book, A Troublesome Inheritance: Genes, Race and Human History, Nicholas Wade takes the position that “human evolution has been recent, copious, and regional.” Put that way, it sounds rather harmless, doesn’t it? In fact, the book has caused quite a ruckus.

What if you were offered strong evidence that this fundamental belief was just plain wrong? What if you were offered proof?

The following is not a review of Wade’s book. It is, instead, more a look at how the book was received and why. There are six parts: a story about Galileo, a summary of what I was taught about evolution in college, a sharper-edged rendering of the book’s hypothesis, an overview of some of the reviews, an irreverent comment on the controversy over Wade’s choice of a word, and, finally, an upbeat suggestion to those engaged in the ongoing nurture vs. nature debate.

1. It is the winter of 1609. In a courtyard of the University of Padua, Galileo Galilei closes one eye and peers at the moon through his recently improved telescope. As he observes the play of light and shadow on its surface, there comes a moment when he realizes that he is looking at the rising and setting of the sun across the mountains and valleys of another world. He is stunned.

Galileo hurries to tell his friend and colleague, Cesare Cremonini, then drags him to the courtyard, urging him to view this wonder. Cesare puts his eye to the scope for just a moment, then pulls his head back, pauses, frowns, and says, “I do not wish to approve of claims about which I do not have any knowledge, and about things which I have not seen . . . and then to observe through those glasses gives me a headache. Enough! I do not want to hear anything more about this.”

What a thing to say.

A little context might help. Cesare taught the philosophy of Aristotle at Padua. Aristotle held that the moon was not a world but a perfect sphere: no mountains, no valleys. Furthermore, the Inquisition was underway, and a tenured professor of philosophy who started rhapsodizing about “another world” would have been well advised to restrict his comments to the Celestial Kingdom. The Pope, you see, agreed with Aristotle. To him, and, therefore, to the Roman Catholic Church, the only “world” was the earth, the immobile center of the universe around which everything else moved. Any other view was taboo. Poor Cesare! Not only did he not want to look through the telescope; he did not want there to be mountains on the moon at all.

The question in the present drama is this: who is playing the role of Cremonini?

It would get worse. Soon Galileo would point his scope at Jupiter and discover its moons, heavenly bodies that clearly weren’t orbiting the earth. Then he would observe and record the astonishing fact that Venus went through phases as it orbited not the earth but the sun. So: Ptolemy was wrong, Copernicus was right, and Cesare Cremonini would go down in history as the epitome of willful ignorance. Galileo, of course, fell into the clutches of the Inquisition and became a hero of the Renaissance.

To be fair to Cesare, the story has been retrospectively streamlined into a sort of scientific morality tale. While the part about Galileo’s discovery is probably more or less right, Cremonini’s remark wasn’t made directly to Galileo. It was reported to him later in a letter from a mutual friend, Paolo Gualdo. The text of that letter is included in Galileo’s work, Opere II. And while those jagged borders of light and dark on the moon, imperfectly magnified, were certainly thought-provoking, to say that the case against Ptolemy was closed on the spot, that night in Padua, would be too neat.

It makes a good story, though, and a nice lens for viewing reactions to scientific breakthroughs. Changing our focus now from the moons of Jupiter to the molecular Rubik’s cube we call the human genome, the question in the present drama is this: who is playing the role of Cremonini?

2. In an undergraduate course, taken decades ago, I was taught that human evolution had more or less stopped when the glaciers retreated about 10,000 years ago. Evolution had been driven primarily by natural selection in response to a changing environment; and, as such changes had, for the time being at least, halted, so too had the evolution of man.

I was taught that races exist only as social constructs, not as meaningful biological categories, and that these constructs are only skin deep. They told me that the social behavior of an individual is not genetic, that behavioral and cognitive propensities just aren’t in our genes.

I was taught that the differences among social organizations are unrelated to the genetic differences of the populations that comprise those various organizations, and that social environments have no influence on human evolution.

3. To show how Wade’s book stirred things up, I will present his central hypothesis with an emphasis on the controversial parts. I’ll avoid scientific jargon, in an effort to make the meaning clearer to my fellow nonscientists.

Wade believes that humanity has been evolving rapidly during the past 30,000 years and continues to evolve rapidly today. It is not just our physical characteristics that continue to evolve. The genes that influence our behavior also evolve. (Yes, that’s what the book says, that our behavior is influenced by our genes.)

Is humanity rapidly evolving? Is there such a thing as race in biological terms? Nicholas Wade believes that the answer is “yes.”

He also believes that humanity has evolved differently in different locations, most markedly on the different continents, where the major races evolved. (Yes, the book calls them races.)

These separately evolved genetic differences include those that influence behavior. (Yes, the book says that race is deeper than the skin.)

Furthermore, these genetic differences in behavioral propensities have contributed to the diversity of civilizations. The characteristics of any given civilization, in turn, influence the direction of evolution of the humans who compose it.

Oh, my.

We now know that the earth goes around the sun. But is humanity rapidly evolving? Is there such a thing as race in biological terms? Does the particular set of alleles in an individual’s genome influence how that person behaves? Does the particular frequency of alleles in the collective genetic material of the people who compose a civilization influence the characteristics of that civilization? Do the characteristics of a civilization influence the direction of the evolution of the humans that compose it? Nicholas Wade believes that the answer to all these questions is “yes.” While he does not claim that all of this has been proven, he is saying, in effect, that what I learned in college is not true. Am I now to be cast as Cremonini?

4. There are those who disagree with Wade.

In fact, lots of people didn’t like A Troublesome Inheritance at all. I’ve read about 20 reviews, few of them favorable. Even Charles Murray, writing in the Wall Street Journal, seemed skeptical of some of Wade’s arguments. Most of the others were simply unfavorable, among them reviews in the Washington Post, the New York Review of Books, Scientific American, the New York Times, The New Republic, and even Reason. Slate and The Huffington Post piled on. While Brian Bethune’s review in Maclean’s was gentler than most, it was gently dismissive.

The reactions run from disdain to anger to mockery. Nathaniel Comfort’s satirical review “Hail Britannia!”, in his blog Genotopia, is the funniest. Donning the persona of a beef-fed, red-faced, pukka sahib at the height of the Raj, he praises Wade’s book as a self-evident explanation of the superiority of the West in general and the British in particular. (I once saw a retired British officer of the Indian Army being told by an Indian government official that he had to move his trailer to a remote area of a tiger preserve to ensure the security of a visiting head of state. He expressed his reluctance with the words, “I’m trying to be reasonable, damn it, but I’m not a reasonable man!”)

There’s some pretty heated language in these reviews, too. That the reviewers are upset is understandable. After all, they have been told that what they believe is not true. And the fellow doing the telling isn’t even a scientist. Sure, Nicholas Wade was a science writer and editor for the New York Times for three decades, but that doesn’t make him a scientist. Several of the reviews charge that Wade relies on so many historical anecdotes, broad-brush impressions, and hastily formed conclusions that it’s a stretch to say the book is based on science at all.

Of course they’re angry. Some of these guys are professors who teach, do research, and write books on the very subject areas that Wade rampages through. If he’s right, then they’re wrong, and their life’s work has been, if not wasted, at the very least misguided.

The consensus is that Wade has made a complete hash of the scientific evidence that he cites to make his case: cherry-picking, mischaracterizing, over-generalizing, quoting out of context, that kind of thing.

Another common complaint is that, wittingly or not, Wade is providing aid and comfort to racists. In fact, the animosity conveyed in some of the reviews may spring primarily from this accusation. In his review in the New York Times, David Dobbs called the book “dangerous.” Whoa. As I said, they don’t like A Troublesome Inheritance at all.

So, is Nicholas Wade just plain wrong, or are his learned critics just so many Cremoninis?

5. While the intricacies of most of the disagreements between Wade and his critics are over my head, one of the criticisms is fairly clear. It is that Wade uses the term “race” inappropriately.

The nub of the race question is that biologists want the word “race” as it applies to humans to be the equivalent of the word “subspecies” as it applies to animals. Because the genetic differences among individual humans and among the different populations of humans are so few, and the boundaries between the populations so indistinct, biologists conclude that there are no races. We are all Homo sapiens sapiens. We are one.

Several of the reviews charge that Wade relies on so many historical anecdotes, broad-brush impressions, and hastily formed conclusions that it’s a stretch to say the book is based on science at all.

Just south of Flathead Lake in Montana is an 8,000-acre buffalo preserve. One summer day in the mid-’70s, I walked into its visitors center with my wife and father-in-law and asked the woman behind the counter, “Where are the buffalo?” She did not hesitate before hissing, “They’re bison.” Ah, yes: the bison-headed nickel, Bison Bill Cody, and, “Oh, give me a home where the bison roam . . .” You know the critter.

Put it this way: to a National Park Ranger, a buffalo is a bison; to a biological anthropologist, race is a social construct. That doesn’t mean there’s no such thing as a buffalo.

I don’t mean to make light of it. I’ve read the explanations. I’ve studied the printouts that graph and color-code populations according to genetic variation. I’ve studied the maps and charts that show the differences in allele frequencies among the groups. I’ve squinted at the blurry edges of the clusters. I get all that, but this much is clear: the great clusters of genetic variation that correspond to the thousands of years of relative isolation on the various continents that followed the trek out of Africa are real, and because they are genetic, they are biological. In any case, we are not in a biology class; we are in the world, where most people don’t talk about human “subspecies” very often, if ever. They talk about human “races.” To criticize Wade’s use of the term “race” seems pedantic. Whether to call the clusters “races” or “populations” or “groups” is a semantic dispute.

Put it another way: If you put on your “there is no such thing as race” costume for Halloween, you’ll be out trick-or-treating in your birthday suit, unless you stay on campus.

Besides, use any word you want; it won’t affect the reality that the symbol represents. The various “populations” either have slightly different genetic mixes that nudge behavior differently, or they don’t. I mean, are we seeking the truth here or just trying to win an argument?

6. While Wade offers no conclusive proof that genes create behavioral predispositions, he does examine some gene-behavior associations that point in that direction and seem particularly open to further testing. Among them are the MAOA gene and its influence on aggression and violence, and the OXTR gene and its influence on empathy and sensitivity. (The symbols link to recent research results.)

What these have in common is that the biochemical chain from the variation of the gene to the behavior is at least partly understood. The chemical agents of the genes in question are monoamine oxidase A and oxytocin, respectively. Because of this, testing would not be restricted to a simple correlation of alleles to overt behaviors in millions of people, though that is a sound way to proceed as well. The thing about the intermediate chemical triggers is that they could probably be measured, manipulated, and controlled for.

The difficult task of controlling for epigenetic, developmental, and environmental variables would also be required, but in the end it should be possible to determine whether the alleles in question actually influence behavior.

If they do, the next step would be to determine the frequency of the relevant allele patterns in various populations. If the frequency varies significantly, then the discussion about how these genetic differences in behavioral propensities may have contributed to the diverse characteristics of civilizations could be conducted on firmer ground.

If the alleles are proven not to influence behavior, then Wade’s hypothesis would remain unproven, and lots of textbooks wouldn’t have to be tossed out.

Of course, it’s not so simple. The dance between the genome and the environment has been going on since life began. At this point, it might be said that everything in the environment potentially influences the evolution of man, making it very difficult to identify which parts of human behavior, if any, are influenced by our genes. Like Cremonini, I have no wish to approve of claims about which I do not have knowledge.

But the hypothesis that Wade lays out will surely be tested and retested. The technology that makes the testing possible is relatively new, but it is improving all the time. We can crunch huge numbers now, and measure genetic differences one molecule at a time. It is inevitable that possible links between genes and behavior will be examined more and more closely as the technology improves. Associations between groups of alleles and predispositions toward trust, cooperation, conformity, and obedience will be examined, as will the even more controversial possible associations with intelligence. That is to say, the telescope will become more powerful. And then, one evening, we will be able to peer through the newly improved instrument, and we shall see.

That is, of course, if we choose to look.




© Copyright 2018 Liberty Foundation. All rights reserved.


