Face Time

I was not an early adopter of Facebook. And I joined for commercial reasons. For a short time a few years ago, all the smart people in book publishing were saying that social media was the future of book promotion. Of course, at that point, the smart people in every industry were saying that social media was the future of promoting any product or service. Some of those smart people may have been in the employ of Zuckerberg & Co.

That conventional wisdom, like most such, turned out to be an exaggeration of a minor observation. My firm’s efforts at promotion through Facebook have yielded modest results. (The well-worn triad of direct mail, author spots on local talk radio, and carefully chosen display ads remains the most effective way to promote books.)

Despite this, I still use Facebook. And may use it more than ever. It’s a pleasant diversion, a low-maintenance way to stay in touch with family, friends and a group of “Facebook friends” — acquaintances from high school, college and other points in my life. It offers the interactivity of a chat room with the promise of enough vetting to keep out the most egregious cretins and child-molesters.

It’s also an interesting laboratory for measuring people’s attitudes about sports, politics, pop culture and the news.

One thing that I’ve learned is how presumptuous — and erroneously presumptuous — people are about the means and motives of online entertainment. Many of my acquaintances presume that there’s some system of consumer-protection law that applies to their dealings on Facebook. This applies especially to matters of “privacy.”

Facebook is, like Google, an advertising company at heart. The business model is to create an online space that people will visit regularly — and then to sell access to those people. Many of the activities on Facebook are designed to capture information about users’ likes and dislikes, so that Facebook can create detailed consumer profiles and sell precisely calibrated access to advertisers. Yet multitudes of Facebook users rage childishly when this or that detail comes to light about how the site collects information.

Another lesson (and the real reason for this Reflection): the politics and beliefs of most Americans are so ill-formed and erratic that it’s difficult to engage them in a meaningful way.

Recently, several of my Facebook friends posted approving comments about Warren Buffett’s “integrity” and “bravery” in calling for higher taxes on the wealthy. I pointed out — as I have in this space — that there’s no integrity or bravery in Buffett. At least on this issue. He’s acting in self-interest, and being cagey about it. His company’s holdings include several life insurance companies that sell annuities and other tax-avoidance mechanisms. The higher the federal tax rates, the more his products sell. He’s like an arsonist who owns the fire-extinguisher shop across the street from a theater that he sets afire during a sold-out performance of La Boheme.

Despite the ugly truth, some of my Facebook friends insisted that Buffett looks out for the working man. So, I pointed out that he is also a large shareholder in the Washington Post Company — whose highly profitable Kaplan Education unit destroys the lives of working-class idiots by selling them worthless degrees financed by costly student loans that aren’t dischargeable in bankruptcy.

At this point, a friend of one of my Facebook friends — who could read the comment thread through his connection to my friend (such is the nature of a social network) — commented that my use of the term “working-class idiots” was offensive. And that he knew better than I how predatory Kaplan Education is because he had borrowed tens of thousands of dollars to get a useless certificate in 3D animation from that very company. And that, several years later, he remains unemployed. But he wasn’t as angry at Kaplan or Buffett as he was at me for describing his ilk unkindly.

The What’s the Matter with Kansas wing of the American Left argues that presumably right-leaning corporate interests brainwash the middle class into voting against its own interests. But that brainwashing isn’t a Right/Left phenomenon. The same argument could be made of the presumably left-leaning Warren Buffett and the unemployed friend of my Facebook friend.

We who value liberty have a long way to go in explaining our case to the American masses. We have to assume our fellow citizens know nothing. Or, worse, we have to assume that most of what they know is affirmatively false. And we have to do it nicely.

I use Facebook as a tool to sharpen my skills in this effort.




The Passing Paradigm

The latest much-ado-about-nothing crisis passed, with a result that should seem familiar. In 2008, Americans were told that if the TARP bill (a $700 billion taxpayer-funded welfare handout to large banking institutions) wasn’t passed, the stock market would crash and massive unemployment would follow. After an unsuccessful first attempt to pass the bill amid angry opposition from constituents, the bill passed on a second vote. Subsequently, there was a stock market crash followed by massive unemployment.

This time, our political-media cabal told us that if Congress was unable to pass a bill to raise the debt ceiling, the government would not be able to meet its short-term obligations, including rolling over short-term bonds with new debt. US debt would be downgraded from its AAA status, and a default would be imminent. After the melodrama, Congress passed the bill raising the debt ceiling. Standard & Poor’s subsequently downgraded US Treasury debt anyway, and deep down everyone knows that a default is coming as well, one way or another.

We are seeing the end of a paradigm. Thomas Kuhn argued in The Structure of Scientific Revolutions (1962) that anomalies eventually lead to revolutions in scientific paradigms. His argument holds equally true for political paradigms.

A paradigm is a framework within which a society bases its beliefs. For example, people at one time believed that the forces of nature were the work of a pantheon of gods. Sunlight came from one god, rain from another. The earth was a god, as was the moon. With nothing to disprove the premises of the paradigm, it persisted. People went on believing that sunlight and rain were the work of sunlight and rain gods because there was no compelling reason for them to believe otherwise.

However, within any paradigm there are anomalies. Anomalies are contradictions — phenomena that cannot be explained within the framework of the paradigm. People have a startling capacity to ignore or rationalize away these anomalies. While it may defy logic to continue to believe that rain comes from a rain god even after evaporation and condensation have been discovered and proven, people would rather ignore the anomalies and cling to the paradigm than face the fact that the paradigm is false.

But once there are too many anomalies, the paradigm fails, and a new one must take its place. This new paradigm renders the old one absurd, even crazy. At some point in the future, people will look back on the political paradigm of the 20th and early 21st centuries. There is at least one thing that will be quite obvious to them: centralized government is insane.

Consider the premises upon which this present paradigm relies: all facets of society must be planned and managed by experts. The judgment of the experts trumps the rights or choices of any individual. The choices made by the experts will result in a more orderly society and greater happiness for the individuals who compose it. There will be better results from one small group of experts controlling everyone than multiple groups of experts controlling smaller subgroups of society.

Of course, libertarians reject every one of these assumptions on its face. A free society does not tolerate “planning” or “management” by anyone. All choices are left to the individual, as any attempt to plan or manage his affairs amounts to either violation of his liberty, looting of his property, or both. However, let’s assume that the first three assumptions of the present paradigm are valid and merely examine the last. Even that does not hold up to scrutiny.

Suppose an entrepreneur starts a business. At first, his market is local. He opens retail outlets that are overseen by store managers. The entrepreneur is the CEO of the company and manages the store managers. Even at this point, the CEO must trust day-to-day decisions to his managers. He has no time to make everyday decisions as he tries to expand his business. The managers do this for him and he concentrates on strategic goals.

His business is successful and soon he begins opening outlets outside of the original market. He now has a need for regional managers to manage the store managers. He manages the regional managers and leaves the details of how they operate within their regions to them.

The business continues to expand. With retail outlets in every state, there are now too many regions for the CEO to manage directly. The CEO appoints executive directors to manage larger regions, each composed of several smaller ones. There is an executive director for the West Coast, another for the Midwest, and another for the East Coast. Of course, the CEO has the assistance of his corporate vice presidents who manage sales, operations, human resources, and other company-wide functions from the corporate office.

Now, suppose that one day the CEO decides to fire the executive directors, the regional managers, and the store managers. He will now have the salespeople, stock clerks, and cashiers for thousands of retail outlets report directly to him and his corporate vice presidents. Would anyone view this decision as anything but insane?

As silly as this proposition sounds, this is a perfect analogy for how we have chosen to organize society for the past century. The paradigm rests on the assumption that every social problem can be better solved if the CEO and his corporate staff manage the cashiers and the salespeople directly. As in all failed paradigms, anomalies are piling up that refute its basic assumptions.

This paradigm assumes that centralized government can provide a comfortable retirement with medical benefits for average Americans, yet Social Security and Medicare are bankrupt. It assumes that a central bank can ensure full employment and a stable currency, yet the value of the dollar is plummeting and unemployment approaches record highs (especially when the same measuring stick is used as when the old records were set). It assumes that the national government’s military establishment can police the world, yet the most powerful military in history cannot even defeat guerrilla fighters in third-world nations. It assumes that the central government can win a war on drugs, yet drug use is higher than at any time in history. It assumes that experts in Washington can regulate commerce, medicine, and industry, yet we get Bernie Madoff, drug recalls, and massive oil spills.

Hundreds of years ago, the prevailing medical science paradigm assumed that illnesses were caused by “bad humors” in the blood. Operating with that assumption, doctors practiced the now-discredited procedure known as “bleeding.” They would cut open a patient’s vein in an attempt to bleed out the bad humors. As we now know, this treatment often killed the patient. Most rational people today view the practice of bleeding as nothing short of lunacy.

Ironically, this is a perfect analogy for the paradigm of centralized government. The very act of a small group of experts attempting to manage all of society drains its lifeblood. It is the uncoerced decisions of millions of individuals that create all the blessings of civilized society. It is the attempt by a small group of people to override those decisions that is killing society before our very eyes. Someday, people will look back on our foolishness and laugh as we do now at the misguided physicians who bled their patients to death. The present paradigm is dying. The revolution has begun.




The End Is Nigh

In an article in the December 2010 issue of Liberty, I alerted readers to the fact that a leading network of Christian radio stations was predicting that the end of the world was absolutely, positively going to happen in 2011. According to Family Radio, which broadcasts in many countries, and which probably has a station near you, Judgment Day will begin on May 21 with the Rapture of the true believers and will conclude on October 21 with the total destruction of the physical universe. In the process, almost all the inhabitants of the earth will perish.

This is the message of Family Radio’s “Bible teacher,” a retired businessman named Harold Camping. His interpretations of Scripture are explained — as well as I, or probably anyone else, could explain them — in the article just mentioned, “An Experiment in Apocalypse.” You can download it here. For a less critical perspective, see Family Radio’s own website, which offers a list of stations where you can hear the apocalyptic message for yourself.

For students of human nature, especially American human nature, this particular religious prediction is a matter of great importance, interest, and (let’s face it) fun. It is, for them, what a transit of Venus is to astronomers, what the sighting of an ivory-billed woodpecker is to ornithologists, what an eruption on the scale of Mount St. Helens is to volcanologists. It’s the kind of thing that happens much less than once in a generation.

Of course, experts, religious and secular, make predictions all the time, and other people believe them. Generals predict that if they are given appropriate resources, they will be able to accomplish their mission. Scientists predict that if their policy advice goes unheeded, the environment will be subject to further degradation. Politicians predict that if you vote for them, they will initiate a new age of prosperity for the American people, and if you don’t, you will be visited by spiraling unemployment and a continuing decline in the American standard of living. Preachers say they are confident that the signs of the times point to an early return of our Lord Jesus Christ. Economists assure us that if trends continue, we can expect to see whatever they themselves have been trained to expect.

All these modes of prophecy are potent. They have effects. They get people’s attention. They lead some people to do things that otherwise they would not do — vote, go to war, buy a house, pledge more money to the church. But they are all escapable and forgettable. They never include a date on which something definite will happen. What you see, when you look at the words and figure out what they really mean, is just a suggestion that something that is capable of almost any definition (“prosperity,” “job creation,” “depression,” “desertification,” “a world in which our children will have more (or less) opportunity than we do,” “the fulfillment of God’s plan”) will manifest itself at some time that is really not a time: “very soon,” “in our generation,” “by the end of the century,” “earlier than predicted,” “much earlier than anyone would have thought,” “with a speed that is startling even the experts,” “at the end of the day,” “eventually.”

Of course, the less definite a prediction is, the less meaning it has; but the more definite it is, the less likely it is to come true. Real economists and real theologians can tell you why. A real economist will show you that human events result from individual human choices, their motives unresolvable into quantifiable data, their possible sequences multiplying from the original motives in fantastic variety. Real theologians will tell you, in the words of the old hymn, that the Deity is not a writer of op-ed forecasts: “God moves in a mysterious way, his wonders to perform; / He plants his footsteps in the sea, and rides upon the storm.”

Nevertheless, it is impossible that someone’s announcement that “California is more vulnerable than ever to a catastrophic earthquake,” or that “this administration will meet the problem of the deficit, and solve it” could ever be completely disconfirmed. If the big one doesn’t destroy San Francisco during your lifetime, as you thought had been predicted, don’t use your dying breath to complain. You’ll just be told that California is even more vulnerable “today” than it was “before,” because more years have passed since the last prediction. If the politician you helped elect disappoints you by not having “solved the problem,” whatever the problem is, you’ll be told that “our plan for the new America ensures that appropriate solutions will be implemented, as soon as Congress enacts them into law.”

How can you disconfirm the ineffably smarmy language of the exhibits in the California Academy of Sciences? Once an intellectually respectable museum, it now adorns its walls with oracles like this: “If we don’t change our actions, we could condemn half of all species of life on earth to extinction in a hundred years. That adds up to almost a million types of plants and animals that could disappear.” Should you decide to argue, you can’t say much more than, “If you don’t stop purveying nonsense like that, your museum has seen the last of me, and my $29.95 admission fees, too.”

Thirty years ago there was a considerable emphasis among mainstream evangelical Christians on the prospect of Christ’s imminent return. There was a popular religious song, urging people to keep their “eyes upon the eastern skies.” Less mainstream religionists said that all indications point to the probability that the present system of things will end in 1975. Meanwhile, a very popular book, Famine 1975! America’s Decision: Who Will Survive? (1967), predicted that the world would soon run out of food; and scientists worried the world with predictions that “global cooling” would soon be upon us.

Among these predictions were a few that, wonder of wonders, actually could be disconfirmed, and were. Though no one said that the Egyptians (often thought to be especially “vulnerable”) would begin starving to death precisely on May 21, 1975, some people came close enough to saying that; and events showed they were wrong. Yet there are escape routes available for all predictors, even those proven to be wrong. Two escape routes, really: memory and interest.

Jesus knew about this. He told his followers that no man knows the day or the hour of his Return, but that people would always be running around predicting it (Mark 13:32, Luke 21:8–9). The failed predictions, which seemed so snazzy before they failed, wouldn’t really be perceived as failures, because the failures wouldn’t be remembered, or considered interesting enough to be remembered. Watch out for predictions, he said.

There are two things going on here. One is that people die, and their enthusiasms die with them — often to be revived by the next generation, before being forgotten again. Quack medical treatments, as someone has pointed out, have a generational life, and so do quack economic and religious treatments. The Bates Eye Method, a way of using exercise to improve your eyesight, doesn’t work, and when people find that it doesn’t, they abandon it. They eventually die, and another group of people “discovers” the great idea, wants to believe it, and makes a big deal out of it, temporarily. The phony (and I mean “phony” not in the sense of “mistaken” but in the sense of “created to make money and impress people”) religious ideas of the I Am Movement have had a similar life cycle among New Age types. And when it comes to economics, where would we be without such recurrent notions as the idea that unions are responsible for “the prosperity of the middle class,” the idea that the minimum wage lifts people out of poverty, and the idea that general wellbeing can be created by forcing the price of commodities up by means of tariff regulations?

The gullible people who endorse such ideas often die with their convictions intact, although they may not succeed in passing them along to others, at least right away. In April we witnessed the death of the oldest man in the world, a gentleman named Walter Breuning. Before his death at the age of 114, Mr. Breuning gave the nation the benefit of his more than a century of experience and learning — his belief that America’s greatest achievement was . . . Social Security! Yes, if you retire at 66, as Mr. Breuning did, and collect benefits for the next 48 years, I suppose you might say that. But it’s an idea that’s likely to be ignored by people who are 30 years old and actually understand what Social Security is.

And that’s the additional factor: lack of interest. Failed ideas, and failed predictions, aren’t always forgotten — many of them have a second, third, or fourth advent. But they may be ignored. They may not be interesting, even as objects of ridicule. I suspect that most young people would say that Social Security is “good,” but it’s not as important to them as it was to Mr. Breuning. The same can be said of mainstream Christians, who agree that Christ will return, but pay little or no attention to any current predictions.

Right or wrong, as soon as an idea reveals even a vulnerability to disconfirmation, it often starts to dwindle in the public’s mind. Global cooling is a perfect example. Once, cooling was fairly big in the nation’s consciousness; then it didn’t seem to be happening, right now anyway; then it began to seem unimportant; then it disappeared, without anyone really noticing its absence.

This is what tends to happen with political and economic predictions. The smart people, and the political fanatics (such as me), go back to what Roosevelt said or Kennedy said or Obama said, and notice how wildly their promises and predictions varied from the accomplished facts; but the people in the street go their way, unshocked and unaffected. They may not have expected specific accuracy from their leaders’ forecasts, but if they did, they forgot about it. Initially, they were foolish enough to be inspired, or frightened, but they were canny enough to realize that other forecasts — equally inspiring, frightening, and vulnerable to failure — would succeed the present ones.

It’s like Jurassic Park, where the dinosaurs seem certain to devour the heroes, and almost manage to do so — about 1100 times. After the first few hundred near-death experiences, you realize that the only logical reason this won’t go on forever is that the theater has to be cleared for another showing of Jurassic Park. Expectation diminishes, long before the show is over — although you may be willing to see it again, in a few years, once the specific memory wears off. That’s the way the language of prediction often works, or fails to work.

As long as the idea of socialism has existed, its priests have predicted the downfall of the capitalist system. When each seemingly fatal contingency proved not to be fatal, another contingency was identified; when that failed to produce the climax, a third came into view . . . and so on. Some people were willing to return for Downfall of Capitalism 2, 3, and 4; but others sensed that the plot had no logical ending, after all, and sought something different.

So new performances began in the Theater of Prognostication. Followers of Malthus demonstrated that civilization would perish through “over-breeding.” Eugenicists showed that it would end by the over-breeding of the “unfit.” For many generations, journalists computed the size of “the world’s proven fuel resources” and demonstrated that unless alternative sources of energy were found, the engines of the world would stop. In 1918, the world was assured that it was about to be made safe for “democracy.” Then it was assured that it was on the brink of unimaginable prosperity, to be produced by “technocracy.” After that, it learned it was about to be completely destroyed by a new world war. When the war came, but did not succeed in destroying the world, optimists prophesied an imminent “era of the common man,” while pessimists prophesied annihilation by the atom bomb. For generations, the “doomsday clock” of The Bulletin of the Atomic Scientists stood at a few minutes till midnight. It still does — because now it registers not only the purported danger of atomic war, but also the purported likelihood of destruction by “climate change.” In other words, another movie has hit the theater.

The subject changes; the language does not. It’s always apocalypse on the installment plan. You buy it in little doses. First, “evidence seems to show”; then, “all the evidence shows”; after that, “experts are all agreed.” The only thing lacking is clear language about exactly how and exactly when the event will happen.

In the early 20th century, many millions of words were spilled about (A) the world’s inevitable political and social progress; (B) the world’s inevitable destruction in a great war. But when the first great war began, there was nothing of inevitability about it. If Russia, France, Germany, and Austria-Hungary had decided, as they might easily have decided, not to bluff one another, or to call one another’s bluffs, about the insignificant matter of Serbian nationalism, there would have been no World War I. In the 1930s, world war was regarded as inevitable by people terrorized by new types of weapons and by the traditional bogeys of “monopoly capitalism” and “western imperialism.” When war came, it wasn’t ignited by any of those things, but by the least predictable of world historical factors: the paranoid nationalism of the Shinto empire, and the weird appeal of Nazism, embodied in the unlikely figure of Adolf Hitler.

There’s nothing much to the predictive power of human intelligence. But if you refuse to be gulled by apocalyptic lingo, what will happen to you? Here’s what. You’ll be told that you are “in denial.” Even more repulsively, you will be told that “denial is not just a river in Egypt.” (Isn’t that clever?) You will be accused of not believing in Science, not respecting the Environment, not caring about History, and so forth. You will be accused of all the characteristics exemplified by the prophets of doom themselves: ignorance, arrogance, and not-listening-to-others.

But let’s see how Harold Camping, Family Radio’s leader and prophet, compares with the other leaders and prophets we’ve considered. In one way he is exactly similar — his use of what Freud called “projection.” In every broadcast Camping warns his audience against arrogance, unloving attitudes toward other people, impulsive and subjective interpretations of the Bible, and submission to the mere authority of so-called Bible teachers. And in every broadcast he denounces members of ordinary churches for failing to heed his prophecies; rejoices in the pain they will suffer on May 21, when they realize that he was right and they were wrong; and suggests that anyone who disagrees with him is denying the authority of the Bible itself. On April 28, his radio Bible talk concerned the dear people in the churches, whom he reluctantly compared with the 450 prophets of Baal who were slaughtered by Elijah because they trusted in their own ideas and wouldn’t listen to the true prophet — a prophet who, like Camping, was favored by God because he was absolutely certain that what he said was true.

But — and this is the most important thing — Camping has a dignity and intellectual integrity denied to most other predictors, including those most esteemed in our society. He doesn’t speak in generalities. He predicts that Judgment Day will come, without any doubt or question or problem of definition, on a particular day: May 21, 2011. In all likelihood it will begin with a great earthquake, which will devastate New Zealand and the Fiji Islands, at sundown, local time. After that, the wave of destruction will circle the globe, with the setting sun. By May 22, the Rapture will have been concluded; “it will all be over!”, and everyone will know that it is; the whole thing is “completely locked in.” In making his prophecies, Camping is actually risking something. He is actually saying something, not just uttering fortune cookie oracles.

I use that phrase advisedly. The other night, Mehmet Karayel and I dined at the Mandarin, and as always we were interested in seeing what our fortune cookies had to say. Mehmet’s fortune was a typically New Age, ostensibly precise manipulation of words. It said, “You may attend a party where strange customs prevail.” Yeah, right. Or he may not attend such a party, or the customs may be “strange” only to ignorant people, or the customs may be present but not prevail, etc.

Mehmet is a real intellectual, not a person who plays one on TV, so he was not taken in by this exciting forecast. Then came the unveiling of my fortune. It was, believe it or not, “Your future looks bright.” Can you imagine a feebler thing to bake into a cookie? But Mehmet is aware of Mr. Camping’s prophecies, so he knew how to strengthen the message: “It should say, ‘Your May 22 looks bright.’”

So Mehmet and Harold Camping, though disagreeing firmly on specifics, stand together in demanding that they be produced. Mehmet is certain that my May 22 can be bright; Camping is certain that it can’t. But they both know that only one of them can be right about a proposition like this, and that we’ll soon find out which one it is. How refreshing.

I must admit that not everybody in Camping’s outfit is up to his high intellectual standard. On April 26 I received a plea for contributions to Family Radio (which, by the way, has a good deal of wealth and doesn’t really need many contributions — but why not ask?). The plea came in two parts. One was a brief letter from Camping, requesting my “continued involvement” in FR’s ministry, because “Time is running out! The end is so very near, with May 21, 2011, rapidly approaching.” The second was a pledge card, where I could check the amount of money I planned to give “each month.” As I said in my December article, there is evidence that some people at FR are biding their time, trying to keep the organization together so it can continue — under their leadership — after the failure of May 21. My speculation is that the pledge card is their product, and they don’t mind contradicting Camping’s message in the letter. They may even be alerting “supporters” like me to keep the faith: my May 22 does indeed look bright.

But that’s a digression. Harold Camping is not a politician or a professor of environmentalism, whose prophecies can never be proven wrong because they’re ridiculously non-specific. No, he has said exactly what he means by the end of the world, and he has said exactly when the end of the world will happen. You can check it. I hope you do. Go to Family Radio’s website, find out where its nearest radio station is, and tune in during the evening of May 20 (US time), when, Camping believes, Judgment Day will begin in the Fiji Islands. Then listen through the next few days, as Family Radio responds to the disconfirmation of its prophecies. Or does not respond — until it figures out how to do so (and that should be interesting also).

As I’ve said before, this is a once-in-a-lifetime opportunity. It will be much more interesting than listening to the constant din of the secular prophets — politicians, historians, economists, and environmentalists — whose intellectual stature, compared to that of Harold Camping, addlepated prophet of the End Time, is as nothing, a drop in a bucket, and the small dust that is wiped from the balance.



Where Do We Stand Today?

Major events concentrate the mind on major issues.

At this moment, we are all trying to analyze the results of the great American election of 2010. We are also celebrating the beginning of Liberty magazine’s online edition — proof of the continuity of libertarian ideas across all movements and events of history.

Much has happened in American politics since Liberty first went to press, back in the summer of 1987. This is a good time to ask how well liberty itself has fared during the past quarter century.

It's sad to realize that the history of these years can most readily be divided into periods, not by great new inventions or movements, but by presidential personalities — the age of Reagan, the age of Bush the First, the age of Clinton, and so on. Let's start by looking at the major features of the world in which Liberty was born, the age of Reagan (second administration).

The most prominent political feature of that world has passed away — the threat of nuclear annihilation of the West by the empire of communism. That threat had overshadowed a generation of Americans, sometimes manifesting itself as a chronic anxiety, sometimes rising to a pitch of hysteria, but always costing mightily in emotion and in wealth. For anyone who came to conscious life after, say, 1991, the effects of a threat like this are probably impossible to understand. I hope they remain that way. Nevertheless, the danger went away. The grand threat of communism was replaced by the nasty threats of Muslim fanaticism and creeping nuclear proliferation; but while these are worth worrying about, they are not quite comparable.

Why did the communist empire fall? An event of this kind has many causes, and you are free to emphasize one or another of them, depending on your politics. One was probably President Reagan's determination to out-spend and out-invent the communist military machine. Another was Reagan's use of essentially libertarian arguments about the benefits of individual freedom to inspire the West with a new determination to resist the propaganda of defeatism. (A well-advertised determination to resist is itself a powerful counter-threat, and in this case it seems to have had a major effect on collectivist morale, everywhere.) Perhaps the most powerful cause of the collapse of communism was the burden of its own inefficiency, a flaw that libertarian thinkers had never ceased to emphasize, even as their arguments were laughed to scorn by "progressive" thinkers in the West, and even as conservative American leaders worried that the communists were about to "catch up" with us. They didn't; they fell on the track, victims of the astonishing skill and inventiveness of individual enterprise.

Those of us who were politically conscious (or in my case, semi-conscious) in 1989–1991 recognized the communist collapse as a tremendous victory for libertarian ideas. If this be “triumphalism,” make the most of it; the echoes of our triumph are still heard, most recently in America’s general revulsion against the idea of a “single-payer” (that is, communized) national insurance scheme. Contrast the favorable reception of a single-payer retirement system — Social Security — two generations before.

Though not all libertarians would agree, real progress was also made by Reagan’s forthright defense of what is now called American “exceptionalism.” There is indeed something exceptional about America, and Reagan didn’t say that the exceptional thing was religion, ethnic diversity, immigration, community spirit, or anything else that is considered politically correct on either the Left hand or the Right. He said it was freedom, free enterprise — and he was correct.

Yet by summer 1987 it had become obvious that Reagan’s own legacy was more conservative than libertarian. He simplified the tax brackets, which had been designed to extract the maximum possible out of every nickel added to your income, and in so doing he reduced the tax rates; but this, as anticipated, actually raised total government income. Then his administration proceeded to spend much more than its income. That was not a libertarian thing to do.

One thing that limits state power in America is the individual states, which in the federal system are supposed to check and balance the overweening might of Washington. Reagan believed in federalism — but only when it fitted his own purposes. He was responsible for nationalizing the drinking age at an absurd 21, by using federal highway funds as a bludgeon against states that, very rationally, didn’t want to go along. And while he was a mighty foe of regulation, he was also a friend of the ridiculous war on unregulated recreational drugs. You can say the same thing about every other president, except the current one; but we might expect more from a conservative president who once told Reason magazine that “the very heart and soul of conservatism” was libertarianism.

Reagan was also to blame for some serious sins of omission. He intended to abolish the Department of Education, but he paid little attention to the person he appointed as secretary of that department; and when the appointee turned out to be a public foe of abolition, Reagan let the project drop. The result: three decades of enormous and destructive educational spending and meddling by the national government. In addition, Reagan, a man of deep personal loyalty (a good thing), permitted his vice president, George Bush, to be anointed as his successor (a very bad thing).

I don’t have to connect all the dots that outline the political profile of George H.W. Bush, although each of them contributed to the success of Liberty in its attempt to distinguish grassroots libertarians from conservatives in power. But probably, in the case of both Presidents Bush, “conservative” should be placed in quotes. Ideological labels don’t stick very well to sheer incompetence.

The first President Bush nominated Clarence Thomas, a firm and deep libertarian, to the Supreme Court, and stuck by him when he refused to yield to the most violent opposition that any Court nominee has ever endured. That was a good thing — indeed, a great thing — but it didn’t respond to any interest in judicial philosophy on the part of the “conservative” president. It responded, again, to a sense of personal loyalty, which is not to be blamed but can hardly be depended upon as a means of advancing liberty. Bush’s other appointee was David Souter, who was one of the most anti-libertarian, and certainly one of the stupidest and least qualified, people ever to roost on the Supreme Court bench. A political crony vouched for Souter, so Bush nominated him, as Eisenhower had nominated the outrageous judicial activist William Brennan.

Someone, someday, will write a book called “The Mystery of George H.W. Bush.” It will attempt to answer the question, “How can a hero of World War II, and subsequently an observer of every seamy transaction in the wars of American politics, emerge as such a sap?” Bush won office by promising that he would veto any attempted tax increase: “Read my lips: no new taxes!” He then agreed to raise taxes, on the promise of his political opponents to lower government expenditures, something that they had no intention of doing. It’s hard to think of any other president who would have been foolish enough to make such an agreement, and it very appropriately cost Bush his presidency.

Bush showed great ability at persuading foreigners to unite with the United States in asserting Kuwait’s independence after the oil-rich emirate had been forcibly annexed by Iraq. He also showed great fortitude. I well recall watching news coverage of the buildup to the first Gulf War. I was in the company of other libertarians, all very bright people. Their reaction, as they saw the troops walking onto the ships: “Poor kids! They’ll never come back alive.” And that was a possibility. Yet Bush took the risk and fought a successful war in the Gulf, a war that actually came to a conclusion.

He also fought a successful, though much less honorable, war against Manuel Noriega, dictator of Panama — allegedly for his involvement with drug trafficking, actually for his general antagonism to the United States. Bush had Noriega snatched from Panama and tried in the United States, where he was convicted of crimes against the laws of a country that was not his own. You don’t have to be sympathetic to Noriega to sense that President Bush wasn’t a deep thinker about international law. Nor do you need extraordinary intelligence to perceive, in the wars of the first Bush administration, the seeds of wars in the second.

It is hard to find a libertarian feature of the first Bush regime, and harder still to find one in the early regime of Bill Clinton. The only good thing about it was Hillary Clinton’s elephantine attempt to socialize the nation’s healthcare system, and the failure of her attempt. It failed, not principally because of the Republican Party’s opposition, but because of the people’s response to a well-calculated television ad campaign, supported by some intelligent and interested people on the Right. To the political pros, the campaign seemed to have “come out of nowhere,” yet it re-energized the forces of limited government. A year later, these forces united behind new ones within the Republican Party and ousted the president’s party from its leadership of the House of Representatives for the first time in 40 years. Clinton’s response was to proclaim that “the era of big government is over.”

Of course, that was a lie. As several people commented at the time, all he meant was that the era of lots and lots of little government was continuing. Nevertheless, if ideas count — and they do — a modern liberal Democrat president had admitted that libertarians, the foes of every idea he endorsed, had won the argument.

Unfortunately, his remarks were a warning, for those who would listen, that future government aggressions would be finessed, not announced. In future, modern liberals like Al Gore would pretend that they had a “lock box” in which to put Social Security taxes, and that the box would never be raided for the use of any other government project — a mythological concession to the people’s desire to limit the state’s depredations. And in future, modern socialists like Barack Obama would claim that even their most Ozymandian schemes, such as Obama’s healthcare “reform,” would “pay for themselves” or even “reduce government expenditures.”

La Rochefoucauld said that hypocrisy is the tribute that vice pays to virtue. If so, we can do without any more tributes. The truth of libertarian ideas is admitted, in principle. Still, it’s the false ideas that get put into practice.

Clinton commissioned countless military adventures abroad, in Eastern Europe and in the Near East; they seldom amounted to much, although they asserted the kind of bellicosity that his own party now wants to run away from. But Clinton did two good things for the cause of limited government: he made an alliance with the Republicans for a sweeping, and successful, reform of welfare; and by his scandals he so diminished the prestige of the presidency as to make people significantly less likely to believe and trust elected officials. Bad news for government is usually good news for liberty.

What shall we say about the second President Bush? Unlike Bill Clinton, he wouldn't be a bad person to have as a neighbor — unless, of course, he decided that you might be secreting a weapon of mass destruction. R.W. Bradford, the founder of Liberty, once published an article in which he tried to account for Bush's invasion of Iraq. After a close review of the evidence, he concluded that Bush really believed his own account of the dangers that Saddam Hussein posed to the world. I found Bradford's reasoning persuasive. Bush was not an evil man; he was a gullible man, and he was usually gullible in the ways in which modern liberals are gullible. Until recently, they too believed in solving problems, real or perceived, by projecting military power abroad; indeed, more leading Democrats initially supported the second Gulf War than had supported the first one.

If Bush had happened to be a member of the Democratic Party (which, except for his family identification, he could easily have been, given his general political ideas), no one on that side would have quibbled about his vast government expenditures and vast government indebtedness, or his blithe disregard for any limitations on the power of the federal government. It was Bush who engineered one of the greatest federal takeovers in history, Washington’s massive intervention in local education, under the title of "No Child Left Behind."

In 2008, George Bush the modern liberal was succeeded by Barack Obama, another modern liberal — but a much more self-conscious and socialistic one. People in the eighteenth century used to analyze people by reference to their “ruling passion,” to whatever it is about them that they are willing to sacrifice everything else to. President Obama’s ruling passion is intellectual arrogance, the kind of arrogance that finds its equal, among American presidents, only in the disastrous mentality of President Wilson. Wilson never understood why he was deserted by the people over the issue of the League of Nations; after all, his ideas were correct. For Obama, as for Wilson, “correct” means “progressive,” and “progressive” means “maintaining unquestioned faith in the uninformed notions of the leader.”

No hypocrisy here: Obama believes sincerely in the ideas he enunciates. He believes implicitly in Keynesianism, minus Keynes’s qualifications of his theories; in the most naïve form of dirigisme, minus the glitter of Louis Napoleon and Baron Haussmann; in the managerial economics of Henry Wallace, minus Wallace’s wonderful goofiness (though Obama followed the Reverend Mr. Wright as Wallace followed his “guru”). In short, Obama is not an intellectual, no, not by a mile. He simply makes the mistake of believing that he somehow arrived at his naïve ideas through a long process of thought and experience, and that his inspiring “philosophy” is his ticket to success.

Clearly, it’s not. If “success” means “being elected,” right now he couldn’t be nominated as an alderman in Chicago. If it means “moral superiority,” why are you laughing right now? Obama’s administration demonstrates the truth of an important libertarian idea, developed by Friedrich Hayek in the chapter of “The Road to Serfdom” that he called “Why the Worst Get on Top.”

I’ll summarize the argument in this way: socialism attracts people for many reasons. One is a desire for unearned material rewards. Another is a lust for the power that socialized economies automatically convey to an elite. But yet another is the humanitarian idealism that is felt by some of the world’s morally “best” people. They enlist in the socialist cause because they think it will make a better world. These are the “hopey-changey” people. But when a socialist regime comes into power, it inevitably demonstrates, as Obama’s regime has demonstrated, that its promises cannot be fulfilled, especially in the terms originally proposed. At this point, the best of the hopey-changey people hop off the train; the worst stay on, making their way toward the front by means of lies and intimidation.

Consider the leading personnel of the Obama regime — the Nancy Pelosis and Harry Reids, the David Axelrods and Rahm Emanuels — and you will see the principle in action. Because our tradition of limited government has been preserved in many important respects, the “worst” in America are not allowed to be as bad as the “worst” could get in, say, Stalinist Bulgaria; but they are as bad as bad can be, in American terms. They are living demonstrations of the intellectual and moral vacancy of socialism, American style, and so is their boss, Obama.

So where do we stand today?

We stand at the end of a quarter century of confirmations of libertarian ideas. We stand in the midst of an enormous popular rebellion against the state, a rebellion conducted almost entirely in libertarian terms. The Tea Party movement is not the only example. In every state, in virtually every county, ordinary intelligent Americans are calling for retrenchments of government. Sometimes their protests are united with demands that run contrary to libertarian ideas, demands motivated by conservative religious dogmas or opposition to international trade or the “outsourcing” of industry. But these notions are not the rudder on the ship. In most cases, they are scarcely heard.

If nothing else, the elections of 2010 showed that the American people are tremendously dissatisfied with the performance of Obama and his party, and on thoroughly libertarian grounds. The results of the election indicate a massive revolt against both the arrogance and the enabling ideas of the modern state.

The vehicle of this revolt has not been the Libertarian Party, which is no longer the most conspicuous political manifestation of the freedom movement. The main vehicle is now the venerable Republican Party. Despite the absence of a self-described libertarian president, despite the presence of time-serving apparatchiks as leaders of the congressional Republicans, the G.O.P. is as much infused with libertarian ideas as the Democratic Party is infused with socialist ones — and that’s saying something.

This remarkable development was made possible by three other developments, two of them quite unexpected.

One was the internet revolution, a supreme technological application of the principle of spontaneous order that libertarians have always advocated. The internet’s creation of a new kind of spontaneous order ended the hegemony of the government-authorized radio and television networks, which in 1987 allowed barely a hint of libertarianism to surface in the national discourse. Their dominance has been utterly destroyed. Now, anyone who has a good idea, an idea that works — and libertarian ideas do work — can reach out to other people and give the idea a potent political expression. Add to this the growth of cable TV, hungry for real ideas that will interest real people.

Another unexpected development was the growth in influence of libertarian journals, think tanks, and other voluntary organizations, making their way in the new channels of the internet and cable TV. The Cato Institute, the Mises Institute, Reason magazine, Liberty magazine, FreedomFest . . . these are only a few purveyors of libertarian ideas reaching out to a broad audience of Americans and giving them intellectual ammunition to continue the war against the coercive state. Gone are the days when the New York Times quoted someone from the Ford Foundation, and CBS quoted the New York Times, and “public opinion” resulted. Now libertarian ideas and libertarian research and the evidence of a successful libertarian society, as embodied in the internet itself, are as close as anyone’s keyboard, where they compete very successfully, thank you, with the ideas of the closed society.

What’s the third development? It’s simply the persistence and continual confirmation of essential libertarian ideas. The arguments of Locke and Madison, Friedman and Mises, Paterson and Hayek, haven’t changed during the past 23 years; but they have been ratified by more, and more conclusive, events and understood by more, and more informed, people. This is not the moment for regret or despair; this is the moment for confidence in the future, in our country, and in ourselves.




